Dataset schema (min–max per column):
- `html_url`: string, length 48–51
- `title`: string, length 5–268
- `comments`: string, length 70–51.8k
- `body`: string, length 0–29.8k
- `comment_length`: int64, 16–1.52k
- `text`: string, length 164–54.1k
- `embeddings`: sequence of floats
https://github.com/huggingface/datasets/issues/649
Inconsistent behavior in map
Thanks for reporting! This issue must have appeared when we refactored type inference in `nlp`. By default the library tries to keep the same feature types when applying `map`, but apparently it has trouble with nested structures. I'll try to fix that next week.
I'm observing inconsistent behavior when applying `.map()`. This happens specifically when I'm incrementally adding onto a feature that is a nested dictionary. Here's a simple example that reproduces the problem.

```python
import datasets

# Dataset with a single feature called 'field' consisting of two examples
dataset = datasets.Dataset.from_dict({'field': ['a', 'b']})
print(dataset[0])
# outputs {'field': 'a'}

# Map this dataset to create another feature called 'otherfield', which is a
# dictionary containing a key called 'capital'
dataset = dataset.map(lambda example: {'otherfield': {'capital': example['field'].capitalize()}})
print(dataset[0])
# output is okay: {'field': 'a', 'otherfield': {'capital': 'A'}}

# Now I want to map again to modify 'otherfield', by adding another key called
# 'append_x' to the dictionary under 'otherfield'
print(dataset.map(lambda example: {'otherfield': {'append_x': example['field'] + 'x'}})[0])
# the new key 'append_x' doesn't get added, and it also messes up the value stored at 'capital':
# {'field': 'a', 'otherfield': {'capital': None}}

# Instead, I try to do the same thing with a different mapped fn that also re-emits 'capital'
print(dataset.map(lambda example: {'otherfield': {'append_x': example['field'] + 'x', 'capital': example['otherfield']['capital']}})[0])
# this preserves the value under 'capital', but still no 'append_x':
# {'field': 'a', 'otherfield': {'capital': 'A'}}

# Instead, I try to pass 'otherfield' to remove_columns
print(dataset.map(lambda example: {'otherfield': {'append_x': example['field'] + 'x', 'capital': example['otherfield']['capital']}}, remove_columns=['otherfield'])[0])
# this still doesn't fix the problem:
# {'field': 'a', 'otherfield': {'capital': 'A'}}

# Alternately, here's what happens if I just directly map both 'capital' and
# 'append_x' on a fresh dataset.
# Recreate the dataset
dataset = datasets.Dataset.from_dict({'field': ['a', 'b']})
# Now map the entire 'otherfield' dict directly, instead of incrementally as before
print(dataset.map(lambda example: {'otherfield': {'append_x': example['field'] + 'x', 'capital': example['field'].capitalize()}})[0])
# This looks good!
# {'field': 'a', 'otherfield': {'append_x': 'ax', 'capital': 'A'}}
```

This might be a new issue, because I didn't see this behavior in the `nlp` library. Any help is appreciated!
comment_length: 45
https://github.com/huggingface/datasets/issues/647
Cannot download dataset_info.json
Thanks for reporting! Indeed, we should add support for servers without an internet connection. I'll do that early next week.
I am running my job on a cloud server which does not allow connections from the standard compute nodes to outside resources. Hence, when I use `datasets.load_dataset()` to load data, I get an error like this:

```
ConnectionError: Couldn't reach https://storage.googleapis.com/huggingface-nlp/cache/datasets/text/default-53ee3045f07ba8ca/0.0.0/dataset_info.json
```

I tried to open this link manually, but I cannot access this file. How can I download this file and pass it to `datasets.load_dataset()` manually?

Versions:
- Python 3.7.3
- PyTorch 1.6.0
- TensorFlow 2.3.0
- datasets 1.0.1
comment_length: 20
https://github.com/huggingface/datasets/issues/647
Cannot download dataset_info.json
Right now the recommended way is to create the dataset on a server with an internet connection, save it, and then copy the serialized dataset to the server without an internet connection.
I am running my job on a cloud server which does not allow connections from the standard compute nodes to outside resources. Hence, when I use `datasets.load_dataset()` to load data, I get an error like this:

```
ConnectionError: Couldn't reach https://storage.googleapis.com/huggingface-nlp/cache/datasets/text/default-53ee3045f07ba8ca/0.0.0/dataset_info.json
```

I tried to open this link manually, but I cannot access this file. How can I download this file and pass it to `datasets.load_dataset()` manually?

Versions:
- Python 3.7.3
- PyTorch 1.6.0
- TensorFlow 2.3.0
- datasets 1.0.1
comment_length: 32
[ embedding vector omitted ]
https://github.com/huggingface/datasets/issues/647
Cannot download dataset_info.json
#652 should allow you to load text/json/csv/pandas datasets without an internet connection **IF** you have the dataset script locally. Example: If you have `datasets/text/text.py` locally, then you can do `load_dataset("./datasets/text", data_files=...)`
I am running my job on a cloud server that does not allow connections from the standard compute nodes to outside resources. Hence, when I use `dataset.load_dataset()` to load data, I get an error like this: ``` ConnectionError: Couldn't reach https://storage.googleapis.com/huggingface-nlp/cache/datasets/text/default-53ee3045f07ba8ca/0.0.0/dataset_info.json ``` I tried to open this link manually, but I cannot access this file. How can I download this file and pass it to `dataset.load_dataset()` manually? Versions: Python version 3.7.3 PyTorch version 1.6.0 TensorFlow version 2.3.0 datasets version: 1.0.1
30
Cannot download dataset_info.json I am running my job on a cloud server that does not allow connections from the standard compute nodes to outside resources. Hence, when I use `dataset.load_dataset()` to load data, I get an error like this: ``` ConnectionError: Couldn't reach https://storage.googleapis.com/huggingface-nlp/cache/datasets/text/default-53ee3045f07ba8ca/0.0.0/dataset_info.json ``` I tried to open this link manually, but I cannot access this file. How can I download this file and pass it to `dataset.load_dataset()` manually? Versions: Python version 3.7.3 PyTorch version 1.6.0 TensorFlow version 2.3.0 datasets version: 1.0.1 #652 should allow you to load text/json/csv/pandas datasets without an internet connection **IF** you have the dataset script locally. Example: If you have `datasets/text/text.py` locally, then you can do `load_dataset("./datasets/text", data_files=...)`
[ embedding vector omitted ]
https://github.com/huggingface/datasets/issues/643
Caching processed dataset at wrong folder
Thanks for reporting! It uses a temporary file to write the data. However, it looks like the temporary file is not placed in the right directory during processing
Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to fill up and it crashes, when everything should happen in the Drive fs. What drives me crazy is that it prints it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ```
30
Caching processed dataset at wrong folder Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to fill up and it crashes, when everything should happen in the Drive fs. What drives me crazy is that it prints it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ``` Thanks for reporting! It uses a temporary file to write the data. However, it looks like the temporary file is not placed in the right directory during processing
[ embedding vector omitted ]
https://github.com/huggingface/datasets/issues/643
Caching processed dataset at wrong folder
Well actually I just tested and the temporary file is placed in the same directory, so it should work as expected. Which version of `datasets` are you using ?
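As a hedged sketch of what "placed in the same directory" buys you (the directory and file names below are illustrative stand-ins, not the library's actual paths): creating the temporary file inside the destination directory means the final rename stays on one filesystem, so no bytes land on a different disk:

```python
import os
import tempfile

# Stand-in for the dataset cache directory (e.g. a mounted Drive folder).
cache_dir = tempfile.mkdtemp()

# Create the temporary file *inside* the destination directory, so the
# final os.replace is a same-filesystem rename rather than a cross-device copy.
fd, tmp_path = tempfile.mkstemp(dir=cache_dir, suffix=".arrow")
with os.fdopen(fd, "wb") as f:
    f.write(b"placeholder bytes")

final_path = os.path.join(cache_dir, "cache-b16341780a59747d.arrow")
os.replace(tmp_path, final_path)
print(os.path.exists(final_path))  # True
```

If the temp file lived on another filesystem instead, the rename would degrade to a copy and temporarily consume space there — which is the symptom being discussed.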
Hi guys, I run this on my Colab (PRO):

```python
from datasets import load_dataset

dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train')

def encode(examples):
    return tokenizer(examples['text'], truncation=True, padding='max_length')

dataset = dataset.map(encode, batched=True)
```

The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to grow and it crashes when everything should happen on the Drive fs. What drives me crazy is that it prints that it is processing/encoding the dataset in the right folder:

```
Testing the mapped function outputs
Testing finished, running the mapping function on the dataset
Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow
```
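A quick stdlib sanity check for this kind of situation is to measure free space on the filesystem the cache will actually be written to before mapping. This is a minimal sketch: `/` is used here as the path to inspect, and the Colab Drive mount point mentioned in the comment is an assumption:

```python
import shutil

# Check free space where the cache will actually be written; on Colab you
# would pass the mounted Drive path (e.g. "/content/drive/My Drive") instead.
usage = shutil.disk_usage("/")
print(f"total {usage.total / 2**30:.1f} GiB, free {usage.free / 2**30:.1f} GiB")
```

Comparing this number for the local disk and the mount before and during `.map()` makes it obvious which filesystem is really filling up.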
It looks like a pyarrow issue with google colab. For some reason this code increases the disk usage of google colab while it actually writes into google drive:

```python
import pyarrow as pa

stream = pa.OSFile("/content/drive/My Drive/path/to/file.arrow", "wb")
writer = pa.RecordBatchStreamWriter(stream, schema=pa.schema({"text": pa.string()}))
writer.write_table(pa.Table.from_pydict({"text": ["a" * 511 + "\n"] * ((1 << 30) // 512)}))  # 1GiB
writer.close()
stream.close()
```

Moreover if I `rm` the file on google drive, it frees disk space on google colab.
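The same kind of check can be done with only the standard library, independent of pyarrow: write a file into the target directory and confirm the bytes are accounted to that path. This is a sketch with an illustrative temp directory standing in for the Drive mount; on a healthy mount, `os.path.getsize` on the written file and the growth of the *local* disk should not both show the full payload:

```python
import os
import tempfile

target_dir = tempfile.mkdtemp()  # stand-in for "/content/drive/My Drive/path/to"
path = os.path.join(target_dir, "file.arrow")

with open(path, "wb") as f:
    f.write(b"a" * (1 << 20))  # 1 MiB payload

print(os.path.getsize(path))  # 1048576
```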
Actually I did more tests and it doesn't >.< I'll let you know if I find a way to fix that
Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is no enough space. So I decided to mount my Google Drive fs and do it on it. The dataset is cached in the right place but by processing it (applying `encode` function) seems to use a different folder because Colab HD starts to grow and it crashes when it should be done in the Drive fs. What gets me crazy, it prints it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ```
Actually I also have the issue when writing a regular text file:

```python
f = open("/content/drive/My Drive/path/to/file", "w")
f.write(("a" * 511 + "\n") * ((1 << 30) // 512))  # 1 GiB
f.close()
```

Is that supposed to happen?
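A quick way to see which filesystem actually absorbed a large write is to compare free space before and after it, using only the standard library. A minimal sketch; `/tmp` is just an illustrative path, and the actual write is elided:

```python
import shutil

def free_gib(path):
    """Free space, in GiB, on the filesystem containing `path`."""
    return shutil.disk_usage(path).free / (1 << 30)

before = free_gib("/tmp")
# ... perform the large write here, e.g. the 1 GiB test above ...
after = free_gib("/tmp")
print(f"free space on /tmp changed by {before - after:.2f} GiB")
```

Running this with the write targeting the Drive path would show whether the local disk shrinks anyway.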
The code you wrote should write a 1GB file in the Google Drive folder. Doesn't it?
-0.1935913563, -0.1928515881, 0.1412617415, -0.0920388997, 0.0669262707, 0.0898921415, -0.0069836304, 0.4290529788, -0.1722287685, -0.0915494487, 0.1663731337, 0.0423534922, -0.1221255213, -0.0323458835, 0.1566826701, 0.2242709398, -0.0360695459, -0.0263478477, -0.2317686379, 0.0401277542, -0.4575184286, 0.2492558509, 0.3022362292, 0.1360568404, -0.2050192654, 0.1941810846, -0.1127569526, -0.0335192755, -0.1197800338, -0.1145362109, 0.3726638556, 0.1349277496, 0.2175251991, 0.3037362099, 0.0779422969, 0.0291693844, -0.0210777074, 0.1902551353, 0.4145394266, 0.1018228084, -0.2399726808, 0.0245174989, -0.0547637269, 0.1027784497, 0.1373426318, -0.088926658, -0.3767094016, 0.3508293331, 0.0365208127, 0.1507931054, -0.0523607507, 0.0547361597, -0.2197594196, -0.0100615919, 0.526044786, 0.1290272772, 0.1877488494, -0.0440782942, 0.0206918195, 0.537512064, -0.1673856378, -0.0024447255, -0.2392361015, 0.0114917159, 0.4959449768, 0.4118014276, 0.0894749463, -0.0147215612, 0.4912987947, 0.1121209711, 0.1070732549, 0.3679321408, -0.0412073918, -0.0124511048, -0.1857866198, -0.0598005466, 0.1554002315, -0.1773815155, 0.301884234, 0.2154383212, -0.0786964446, -0.3884577453, -0.0414208956, -0.2103205323, 0.2080656886, -0.208190158, -0.2179647088, 0.2021915317, -0.1233751625, 0.0614823848, 0.4115560353, -0.1338238418, -0.19268471, 0.1497226506, 0.0943595171, -0.2575374246, 0.3588050306, 0.032256525, 0.0874739438, 0.149779886, 0.0274598822, 0.7081154585, -0.4393174052, 0.1149682179, -0.1275031567, 0.1237793937, 0.3178444505, 0.4450237453, -0.2584782243, -0.1782463789, 0.1497452706, 0.0477908105, -0.1281426102, 0.2253486961, -0.0409547165, 0.0705205351, 0.0697559267, 0.0933060199, -0.0532026403, 0.1945133358, 0.14852871, 0.1457563192, 0.2058076113, 0.1708899885, -0.1727921963, 0.111458391, -0.1520686001, 0.0414843298, -0.4139544666, 0.1071760654, 0.2566115856, -0.2570847571, 0.3453412354, 0.2927151918, 0.0105913244, 0.1742863804, 0.4269955754, 0.4116213322, 0.3664237261, 
-0.3534596264, -0.1867633611, -0.3090298176, 0.3833362758, -0.0509147979, 0.140567705, -0.5579476953, 0.1094649658, -0.0119940117, -0.0461655706, -0.0060198903, 0.1920230836, -0.1385639906, 0.127084136, -0.2760282159, -0.1136430055, -0.0776506737, -0.0166891702, -0.0771116465, -0.4317477942, 0.3348549306, -0.1419654191, 0.0563005954, 0.0217557922, 0.0962530375, -0.2522624731, -0.156367898, 0.4937506914, 0.1967360079, 0.1923654675, -0.260397166, 0.000313364, -0.2698324025, 0.0980950966, -0.3428789377, 0.0407128334, -0.0193719734, 0.338894546, -0.2815995812, 0.0439137593, -0.3746964931, 0.142072618, 0.2724717259, 0.0337095484, 0.1308255196, -0.0283606388, -0.2041527033, 0.0594849512, 0.1460772306, 0.5029762983, 0.0188889503, 0.2355626225, -0.2690110207, -0.0395551845, 0.2869608402, -0.315687567, -0.5447865725, 0.3355141878, 0.0593462475, 0.2711541951, -0.1600672901, -0.2820852697, 0.0071513504, 0.1244508475, -0.2202488184, -0.2036886513, 0.0575486124, 0.0194396861, 0.1309786737, -0.0469449982, 0.33100155, 0.0756136775, -0.398312211, -0.0506490655, -0.2439451516 ]
https://github.com/huggingface/datasets/issues/643
Caching processed dataset at wrong folder
I could check it and, as you say, as I write to the Drive disk the Colab disk also increases...
Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to grow and it crashes, when it should happen on the Drive fs. What drives me crazy is that it prints that it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ```
20
Caching processed dataset at wrong folder Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to grow and it crashes, when it should happen on the Drive fs. What drives me crazy is that it prints that it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ``` I could check it and, as you say, as I write to the Drive disk the Colab disk also increases...
[ -0.0908999741, 0.0898180008, -0.0347921103, 0.4908719361, -0.0633709952, -0.0270622373, 0.2676787674, 0.004958963, 0.0242156833, 0.2279234678, -0.0148066618, 0.2577185333, 0.0650781393, 0.4686146975, -0.0855877846, 0.325891912, 0.2614867687, -0.0908620805, 0.0236124787, -0.0889638215, -0.3442031741, 0.381418407, -0.187889725, -0.0883714855, -0.4606522918, -0.1489056051, -0.1486401558, -0.0555906408, -0.0616809204, 0.2551701665, 0.1518819034, 0.0476345159, 0.2141141295, 0.6586294174, -0.0001266875, -0.2107375562, 0.1852643937, -0.2339996845, -0.1972978264, -0.0303513631, -0.2913540602, -0.0428780615, -0.0804466605, -0.0939843729, -0.1805125028, 0.1525763273, 0.1875962466, -0.4624531567, 0.0369791389, 0.2215131521, 0.0815563053, -0.4583999515, -0.4723294973, 0.3777892888, 0.0973049998, 0.2221826166, -0.1482015103, 0.3222097754, -0.0102287233, -0.0661062077, -0.1228184998, 0.2910980284, -0.045729395, 0.0100275911, 0.2436987013, 0.3522853255, -0.2692088485, -0.33867383, 0.6088120937, -0.4549609423, 0.3356354535, -0.3624156713, -0.1185454652, -0.2765687108, -0.2856234908, -0.3757873476, 0.5314129591, 0.138943404, -0.1478154361, 0.0686532706, -0.4924884439, -0.2548328936, 0.0755867511, 0.0571189635, -0.2038834244, 0.2674257755, -0.2745118141, 0.0091091469, -0.0179537944, 0.2676405013, 0.7121443748, -0.3867536187, -0.0097833229, 0.2255636007, -0.1349749714, 0.0430149958, -0.1761122495, 0.5634052753, -0.2399381995, -0.1454799771, -0.231812194, -0.1449785233, -0.1336896867, 0.1113654524, 0.0579790249, 0.4367703795, -0.0656075776, 0.1153590083, 0.0733884722, 0.0440179966, -0.4660319686, -0.078812778, -0.05303308, -0.2441564798, 0.0195789933, 0.0858404785, -0.0243463516, -0.2801482379, -0.1593989581, -0.0670679882, -0.2925983071, 0.0194891095, 0.0709375069, 0.2782581449, -0.0084351562, -0.1163732186, -0.2690791488, 0.2244541198, -0.1827168465, 0.3975837827, -0.0862435475, 0.0280137211, -0.3113509417, 0.4865096509, 0.0744593069, -0.2300517261, 0.1022621542, 0.0751510859, 
-0.0056711733, -0.332251668, -0.1181938052, -0.4248938859, 0.3353986442, 0.1824306846, 0.1418439299, 0.3487937152, -0.0752953142, -0.2226860225, -0.1101973429, 0.260731101, -0.6085547805, -0.2279908657, 0.1420245618, -0.0007225852, -0.2446968853, -0.0833471864, -0.7278814316, 0.0973594636, 0.452542603, -0.1691077054, 0.0063447207, -0.0134531418, -0.476354599, -0.2047982812, 0.0397766754, 0.1366303563, -0.2087151408, 0.1826035827, -0.3155365884, 0.532546103, 0.4800924659, 0.4041323662, 0.0826626867, 0.250936389, -0.5047855377, -0.1302542835, 0.068114005, -0.2428759634, -0.8206090331, 0.2948427498, -0.0977446362, 0.0086051151, 0.0828310251, -0.0241561681, 0.2168330103, -0.0997236595, 0.1954145432, 0.1250828505, 0.0738058835, 0.2005303204, -0.3050266802, -0.1327727139, -0.1084548533, -0.1719166338, -0.0028995862, 0.1939236671, 0.0233739242, -0.3604539037, 0.1965493113, -0.1077610403, 0.1663277, 0.2490559071, 0.2758967876, -0.2232206166, -0.0158250779, 0.1475937665, -0.1878913939, 0.0984163061, 0.0309985504, -0.1838449091, -0.4914020598, -0.1147515923, -0.1692599952, -0.1446655691, -0.1153862476, -0.2338032275, -0.0712445155, 0.3111446798, 0.0979516357, 0.0034815073, -0.0143396631, 0.526314795, 0.0860158131, -0.1585881263, -0.0184128936, 0.2007098049, -0.0831228867, -0.2301698625, -0.1330536902, -0.1328957528, 0.1908375323, -0.2405846417, -0.1907078326, 0.0976368934, 0.2659989595, 0.4339229465, 0.038332738, 0.2717314065, 0.1190790311, 0.0998928323, 0.2144971937, -0.0961884707, -0.0499376692, 0.1104995683, -0.0246760771, 0.2426781058, -0.0756840408, 0.1617137045, -0.1038878709, -0.3116762638, -0.0743962005, 0.0543083996, 0.3153132498, -0.1267095208, 0.017002102, 0.3911198378, 0.4046429992, 0.2553060055, 0.2483257651, 0.3756079376, 0.6352424622, 0.0773173273, -0.0819763541, 0.2210305631, -0.1640895605, -0.3080347776, 0.2351090908, 0.5566414595, 0.41963166, -0.0789887756, 0.3083800972, 0.2244253159, 0.0988955349, -0.1601509452, 0.1938730478, 0.0511634126, 0.2598302066, 
0.0632753894, 0.2444452643, -0.030998772, -0.0644110292, 0.1818400323, 0.0606861338, 0.2382086962, -0.1505656838, 0.5132126808, -0.2193128318, -0.219853133, -0.2537247539, -0.2095604092, 0.117136851, -0.1727665365, -0.1311655939, 0.3562394381, 0.0576348379, -0.0108533576, 0.2667402327, 0.2275639921, -0.1232323721, -0.4242451489, -0.1358308494, 0.1687888503, -0.2426109612, -0.0781315044, 0.3596922755, -0.0710115507, 0.0865671337, -0.1987027228, 0.0729769617, -0.3626461923, -0.1350428313, 0.2516104281, -0.1367729604, 0.3073514402, -0.2948589027, 0.006540183, -0.4329466224, -0.326102227, -0.1517668664, -0.048066549, -0.1187815964, -0.44218117, -0.0419284813, -0.0632461831, -0.1138276458, 0.0201722551, -0.1713448316, -0.0622777864, 0.0999836177, 0.0376607552, 0.3234494627, 0.0060570017, -0.0844253898, -0.1918919683, 0.4517858326, -0.2782468796, -0.3682191372, -0.6094023585, 0.2131617218, -0.0835436881, 0.0003339229, 0.0212866589, -0.1330576241, 0.1271717846, 0.3549329638, -0.4617931247, -0.3288872838, -0.1489286274, 0.0233004168, 0.1904875636, -0.0547219887, 0.266366154, 0.0645633638, 0.1716390848, 0.0243137628, -0.426440388, -0.0048502088, 0.3264300823, 0.3931534588, 0.0035666265, 0.2174667269, 0.0104874671, 0.5704646707, 0.2698994279, -0.0205929093, 0.4525631368, -0.0104254028, 0.2274505198, -0.2605687976, -0.1322253048, 0.2276983559, 0.0203286186, -0.6987558007, 0.4155741036, 0.0959087461, -0.4083206654, 0.1828237325, -0.3095014989, -0.2439014912, -0.2038915753, 0.2121493071, 0.0300386436, 0.2055122256, 0.0212327093, 0.1008075401, -0.1913899779, -0.2594246566, 0.0551406182, 0.0401586406, 0.2412414402, 0.1229617968, 0.3073002696, -0.460454911, -0.4348347187, 0.3341541886, 0.3555981815, -0.0023158789, -0.0438428819, 0.2870715261, 0.2458268106, -0.1482652575, 0.564909935, -0.3025718331, -0.165947929, -0.1720979512, 0.0289181769, 0.0175499171, 0.0298031718, 0.0448262021, 0.3730852306, 0.0043943599, 0.5102785826, 0.3682121336, 0.0015425943, -0.0915671736, 0.2732628584, 
-0.3322289586, -0.172917515, -0.184439078, -0.5728094578, -0.1761686802, 0.0130245816, 0.1412501782, 0.0300491005, 0.0221445113, -0.0319686644, 0.1181978509, -0.1086959541, 0.0976107344, 0.1747449487, 0.4427566826, 0.2253114581, -0.0914559811, 0.0405408032, 0.2281247079, 0.3766057491, 0.5109024048, 0.152031824, 0.2301543951, 0.1836990118, -0.2098184228, 0.2359926999, 0.3207579851, -0.1442792714, -0.1745631397, -0.1821285784, -0.0112457648, -0.3184022605, 0.3036259413, 0.1171125174, 0.3674888611, -0.4066759348, -0.4075475931, 0.4499431849, 0.1345498413, -0.3216319978, 0.238962993, -0.0022429936, -0.5306395292, 0.4041656256, 0.1716741621, 1.1117691994, 0.040186584, 0.1765855849, 0.0392424501, 0.0319478661, -0.0652518794, -0.3078402877, 0.1893229038, -0.1908807606, -0.6514496803, -0.0732112229, -0.1613128185, -0.2037962675, 0.0769560039, -0.2160163522, 0.3886811137, -0.0033977777, -0.0716829747, -0.1122513935, -0.0528871566, -0.0460922271, -0.1305936873, -0.2803533971, 0.0377536789, -0.060143102, 0.3965789676, -0.1411875188, 0.1791493595, -0.1214802712, 0.0727494657, -0.1955318749, -0.0794794708, -0.3821716905, -0.0182945095, -0.3670309186, -0.3077130318, -0.1050221249, 0.3306064606, 0.5086340308, 0.2187629193, 0.030099323, -0.0138406157, 0.0932265297, 0.2404084355, 0.0285435766, -0.4196964204, -0.0287545659, 0.0454718769, 0.081591174, -0.2918022871, 0.1761634052, -0.3630618453, -0.0291384757, 0.0981051624, -0.1210694611, 0.0322279409, -0.2401294708, 0.0087008551, -0.1003762558, 0.1044992357, 0.0404587612, -0.0503647476, 0.2496071607, 0.5033817291, -0.4237064123, -0.4663745165, -0.2362974435, 0.2391019017, 0.3680250645, 0.1011443138, 0.473441571, -0.244892478, -0.1814547181, 0.0002700984, 0.025036484, -0.2598957717, 0.3429083824, 0.0776614845, -0.3460294306, -0.0873567685, -0.1469555348, 0.0331366733, 0.3445210457, 0.0494183898, -0.4454932809, -0.2655269504, -0.4145762622, -0.0005249269, -0.030143708, 0.0728259981, 0.0513026342, 0.1241999045, 0.1792032123, 
-0.1231448799, -0.1842484176, 0.1778456271, -0.0687735081, 0.1079051644, 0.1092497855, -0.0445648432, 0.4041941166, -0.1749346852, -0.0894473195, 0.1353371441, 0.0723468736, -0.1188267618, -0.006114848, 0.155107528, 0.1744977832, -0.0285288915, -0.1091034412, -0.2261296958, 0.0833106488, -0.4550174475, 0.3012820184, 0.2682914436, 0.157690689, -0.1673206985, 0.1907923371, -0.1116957664, -0.0656147227, -0.0736658424, -0.1129290536, 0.3603916168, 0.14417018, 0.2214201391, 0.275177896, 0.0632537603, 0.0111395903, -0.0444083065, 0.2156654298, 0.3725350201, 0.0908575282, -0.1623176187, 0.0083143488, 0.0030546039, 0.0255276412, 0.1160376519, -0.0616760664, -0.4699500799, 0.3420882821, 0.0489592217, 0.1168969274, -0.0590606965, 0.111525774, -0.229278475, -0.04325746, 0.5168874264, 0.1539984047, 0.1316803843, 0.0440824963, 0.032783851, 0.5230231881, -0.1982862651, 0.0246884078, -0.2192222476, 0.018057391, 0.4865018725, 0.4259887338, 0.0942509547, -0.0261396095, 0.4793218076, 0.089681685, 0.0908979252, 0.3561842144, -0.0500970744, 0.0549431667, -0.186566785, -0.038818717, 0.1921584308, -0.2290361822, 0.2488800585, 0.2621863186, -0.0514818951, -0.3562253714, -0.0759627372, -0.2727501392, 0.2419854999, -0.2199641466, -0.238408342, 0.2570853829, -0.1364898831, 0.1088029444, 0.4127169847, -0.1660001576, -0.168358922, 0.1964532435, 0.1456790566, -0.2558963299, 0.349321723, 0.0103303343, 0.1349762976, 0.1556819379, 0.042273134, 0.7475678325, -0.4142765403, 0.140889585, -0.1750688255, 0.1564685851, 0.2523052394, 0.4177721143, -0.2435998619, -0.2142859399, 0.2119775712, 0.0734440833, -0.1310044229, 0.1941263974, -0.0987378061, 0.0808148682, 0.0563204512, 0.0752586573, -0.0412349366, 0.2130665332, 0.1265200377, 0.1920006126, 0.229085058, 0.1382927597, -0.1069689393, 0.0859757811, -0.111032337, 0.0139104575, -0.358921051, 0.0974493027, 0.2541070282, -0.3553035855, 0.3786213994, 0.2719304264, 0.0068044066, 0.1342213452, 0.4763182998, 0.3950313628, 0.3025306463, -0.2944330573, 
-0.1898592412, -0.2639847398, 0.3594506681, -0.0134936217, 0.1397281885, -0.581243515, 0.0616456643, -0.0707522482, -0.0522548631, 0.0035257638, 0.1889207959, -0.1334935278, 0.1061552912, -0.332628727, -0.0784466863, -0.0884288177, -0.0156107191, -0.0666601881, -0.4335344136, 0.3351227641, -0.1234754249, 0.048489742, 0.0505678914, 0.1613295823, -0.2314795554, -0.1227804124, 0.4660214782, 0.2035220414, 0.174214974, -0.2906889319, 0.0410781354, -0.1961381882, -0.0451881699, -0.3456785381, 0.0302921832, 0.0224218294, 0.3042698503, -0.1964488626, 0.0540574379, -0.3024552166, 0.1461641192, 0.2656729817, 0.0278430097, 0.1670298278, -0.0894473493, -0.1717293411, 0.0250688381, 0.0971112028, 0.5377414227, 0.0824894607, 0.2462519109, -0.1996487528, -0.0883890837, 0.2424048483, -0.2692514062, -0.5248529911, 0.3899620473, 0.0918867886, 0.2048325986, -0.2126505822, -0.2469796836, 0.0134119689, 0.138660267, -0.2424547672, -0.1393656731, 0.1419803947, 0.002418749, 0.1437821239, -0.0412588529, 0.3298483789, 0.0980606079, -0.4087777436, -0.0612626076, -0.2565206885 ]
https://github.com/huggingface/datasets/issues/643
Caching processed dataset at wrong folder
To reproduce it: ```bash !df -h | grep sda1 ``` ```python f = open("/content/drive/My Drive/test_to_remove.txt", "w") f.write(("a"*511 + "\n") * ((1 << 30) // 512)) # 1GiB f.write(("a"*511 + "\n") * ((1 << 30) // 512)) # 1GiB f.close() ``` ```bash !ls -lh /content/drive/My\ Drive/test_to_remove.txt !df -h | grep sda1 !rm -rf /content/drive/My\ Drive/test_to_remove.txt ``` [Colab](https://colab.research.google.com/drive/1D0UiweCYQwwWZ65EEhuqqbaDDbhJYXfm?usp=sharing)
Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to grow and it crashes, when it should happen on the Drive fs. What drives me crazy is that it prints that it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ```
56
Caching processed dataset at wrong folder Hi guys, I run this on my Colab (PRO): ```python from datasets import load_dataset dataset = load_dataset('text', data_files='/content/corpus.txt', cache_dir='/content/drive/My Drive', split='train') def encode(examples): return tokenizer(examples['text'], truncation=True, padding='max_length') dataset = dataset.map(encode, batched=True) ``` The file is about 4 GB, so I cannot process it on the Colab HD because there is not enough space. So I decided to mount my Google Drive fs and do it there. The dataset is cached in the right place, but processing it (applying the `encode` function) seems to use a different folder, because the Colab HD starts to grow and it crashes, when it should happen on the Drive fs. What drives me crazy is that it prints that it is processing/encoding the dataset in the right folder: ``` Testing the mapped function outputs Testing finished, running the mapping function on the dataset Caching processed dataset at /content/drive/My Drive/text/default-ad3e69d6242ee916/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/cache-b16341780a59747d.arrow ``` To reproduce it: ```bash !df -h | grep sda1 ``` ```python f = open("/content/drive/My Drive/test_to_remove.txt", "w") f.write(("a"*511 + "\n") * ((1 << 30) // 512)) # 1GiB f.write(("a"*511 + "\n") * ((1 << 30) // 512)) # 1GiB f.close() ``` ```bash !ls -lh /content/drive/My\ Drive/test_to_remove.txt !df -h | grep sda1 !rm -rf /content/drive/My\ Drive/test_to_remove.txt ``` [Colab](https://colab.research.google.com/drive/1D0UiweCYQwwWZ65EEhuqqbaDDbhJYXfm?usp=sharing)
[ -0.0841229707, 0.1945450604, -0.0163261816, 0.4593132734, -0.0439054817, -0.0728670657, 0.3859433532, 0.0305537656, 0.0337212458, 0.2072655261, -0.0354526117, 0.2843731642, 0.0471325926, 0.405802846, -0.1138568595, 0.3554216325, 0.27776438, -0.1252814382, -0.0011570174, -0.1184547395, -0.3421108723, 0.3819749653, -0.2049853653, -0.0256494209, -0.4567476213, -0.0956496224, -0.1473442167, -0.0011712797, -0.0159745682, 0.2834329605, 0.1546799839, 0.0342305526, 0.2064486146, 0.6668277383, -0.0001267947, -0.1738705039, 0.211475879, -0.200600177, -0.2727822661, -0.1019952446, -0.3044107258, 0.036021404, -0.1111782268, -0.0811382458, -0.1309183538, 0.1151059717, 0.1956803054, -0.5906015038, -0.0217087902, 0.264655441, 0.0812926441, -0.4493581951, -0.5469539762, 0.3519181311, 0.1121327356, 0.2696874142, -0.1255562603, 0.2454706579, -0.0296037756, -0.1314677, -0.0899723768, 0.3205049038, -0.110980019, -0.0286941305, 0.2242936194, 0.3244671822, -0.3142949939, -0.2437789291, 0.552444458, -0.4051459134, 0.3221738935, -0.3460016847, -0.1634964645, -0.303388685, -0.3498370051, -0.3369370997, 0.5142971873, 0.1225766689, -0.0838439986, 0.087027818, -0.5243650675, -0.1893936098, 0.0964754745, 0.0881536379, -0.2207659483, 0.2885017097, -0.2636706829, 0.0324777067, -0.0383783244, 0.2351329923, 0.6513153911, -0.4771047235, 0.0129431132, 0.2287501991, -0.1191159859, 0.1084959581, -0.1420408487, 0.4823614657, -0.2183912247, -0.1262341738, -0.2273814678, -0.138771683, -0.1228210628, 0.096880883, 0.0549011528, 0.4719406068, -0.0667964667, 0.1334535778, 0.0674370229, 0.0957631767, -0.4957547188, -0.0539303422, 0.0051824711, -0.233222425, -0.0355137065, 0.1311545074, -0.0165150836, -0.2762652636, -0.210295409, 0.0187314488, -0.3704522252, -0.0185110457, 0.0934007168, 0.2619770169, -0.0247535091, -0.1263897121, -0.2220386118, 0.2480702251, -0.2516379952, 0.4041083753, -0.122416757, 0.0474491976, -0.3677769303, 0.422129333, 0.0711671636, -0.2111281753, 0.1136212349, 0.0618333854, 
-0.0047401302, -0.3111233413, -0.0762210116, -0.4326264858, 0.4141176641, 0.1112444624, 0.1141542196, 0.4161490798, 0.0264522023, -0.267082572, -0.0823543817, 0.236880362, -0.5531479716, -0.2045003176, 0.1599130183, 0.004120267, -0.2979141474, -0.0669919252, -0.6956053972, 0.1260885149, 0.4034118056, -0.1562395543, 0.0185711049, -0.0514678508, -0.4866918921, -0.2335223258, 0.0505959615, 0.1709023863, -0.2770162225, 0.1444512606, -0.3489139378, 0.5182403922, 0.54502213, 0.407844156, 0.0712292865, 0.2802245915, -0.5115205646, -0.0195449442, 0.0752299652, -0.2570965886, -0.7950392365, 0.3377694488, -0.0835516304, 0.0778088048, -0.0157463886, -0.0586844087, 0.1973319203, -0.1049472839, 0.1535238624, 0.1360724121, 0.1003886312, 0.1800669879, -0.2769970298, -0.126280874, -0.0409162603, -0.1735574752, 0.0368150733, 0.1725339741, 0.0476654842, -0.3246938884, 0.1804223359, -0.116600439, 0.1493275166, 0.2664095759, 0.2396841645, -0.1233287752, -0.0224795509, 0.1364757121, -0.2220260799, 0.1532648951, 0.0340710059, -0.1538009346, -0.512519896, -0.1051220596, -0.1252610981, -0.1472772956, -0.1138540655, -0.2297275513, -0.0633673146, 0.3790796101, 0.0941515714, 0.0542889833, -0.0053919107, 0.5049636364, 0.0877431631, -0.1718441695, -0.0558094904, 0.1888891757, -0.1316520274, -0.2515571117, -0.1695165038, -0.1328745037, 0.2208896577, -0.234168604, -0.1833904833, 0.1453888118, 0.2613754869, 0.4232810736, -0.0350134224, 0.2826692462, 0.085492678, 0.1074391454, 0.17938748, -0.0734494328, -0.0230700783, 0.1655795574, -0.1058248952, 0.2817679048, -0.1135012358, 0.1841133237, -0.1023150906, -0.3243629634, -0.0864484608, 0.0631370768, 0.2896332145, -0.1316163391, 0.0799005851, 0.3997072577, 0.3467099369, 0.2356554717, 0.2422430813, 0.3499772847, 0.6368630528, 0.041172497, -0.048086524, 0.1404189914, -0.0809066296, -0.2828871012, 0.2752006352, 0.5394601822, 0.4638061523, -0.0988678932, 0.3149520755, 0.1691283286, 0.0862853751, -0.1600848585, 0.2028649002, 0.0676124543, 0.2505053878, 
0.033744514, 0.2440983951, -0.04214013, -0.0904776901, 0.133493036, 0.0407066196, 0.2235368639, -0.1144715175, 0.5990409255, -0.3186647594, -0.2106160522, -0.2102441639, -0.2406885326, 0.1269744635, -0.2022582591, -0.1322253048, 0.3160972595, -0.0469735712, 0.0264323875, 0.2152579427, 0.1936615705, -0.1426062733, -0.4581071138, -0.1588832736, 0.1610167325, -0.2578988075, -0.0891751871, 0.3661743104, -0.0499387719, 0.0422776118, -0.1529656947, 0.0076164231, -0.3925898075, -0.1384723485, 0.2612595856, -0.182377696, 0.3225195408, -0.2217480987, -0.0241666138, -0.4395270944, -0.3129886985, -0.0940907598, -0.0256117806, -0.1019075066, -0.4108662903, -0.0089528374, -0.049357038, -0.1378780305, 0.0002187826, -0.0952605605, -0.073497206, 0.0395642594, -0.0142233819, 0.2749407887, 0.0180203244, -0.0792431384, -0.1839407533, 0.445767045, -0.2575640678, -0.383885026, -0.5558648705, 0.184514299, -0.0844426975, -0.0193768404, -0.0445443168, -0.0801563486, 0.116056636, 0.4078015983, -0.4832974076, -0.2445377409, -0.1550949365, -0.0202740803, 0.1770356297, -0.0848397315, 0.3010421097, 0.069869265, 0.1874433458, -0.0501757227, -0.4178807735, 0.0349140055, 0.27925843, 0.3262372613, -0.0075917505, 0.2600142658, 0.0915994123, 0.5603274107, 0.2719266713, -0.0357675441, 0.4003484249, -0.006906664, 0.2754006684, -0.2521508336, -0.115773499, 0.2647081017, 0.0552052185, -0.7013975382, 0.4209537506, 0.057746578, -0.4382659495, 0.1772968769, -0.2116812915, -0.3041628003, -0.2247135937, 0.2229960859, 0.0012428816, 0.1784547865, 0.0605888814, 0.1398741454, -0.165908888, -0.2183030844, 0.0834840685, 0.0565795638, 0.3373073339, 0.1339710951, 0.2628166378, -0.4482697248, -0.3884062767, 0.3494023979, 0.3611346483, 0.0470249653, -0.0848466381, 0.2404614687, 0.1869641393, -0.1444206983, 0.5466492772, -0.334747076, -0.1636125743, -0.1299709976, 0.0169890001, -0.0141094774, 0.0017946362, 0.0124021173, 0.3665882349, -0.0325377509, 0.5276475549, 0.2805065215, -0.0422107242, -0.1070472449, 0.2349511385, 
-0.2846837044, -0.1501444429, -0.1825914979, -0.5665205717, -0.2081162333, -0.0184088983, 0.1475917101, 0.0265533216, 0.0363263115, -0.000575766, 0.1285927296, -0.0907082036, 0.104761757, 0.1362763941, 0.3940876722, 0.1735525578, -0.1035948545, 0.0645232648, 0.2168628126, 0.3731350899, 0.5316536427, 0.1416011006, 0.1814555526, 0.2140343785, -0.1720279753, 0.2364907712, 0.3640684485, -0.1516597867, -0.2099587321, -0.1321101487, -0.0220673457, -0.3369808197, 0.3455791771, 0.1578531265, 0.3325904608, -0.3623312414, -0.4115197659, 0.4117581248, 0.1483870298, -0.3303750157, 0.3158072233, 0.0331267342, -0.5511071086, 0.4613454938, 0.1280950755, 1.1126843691, -0.0020386418, 0.1784239709, 0.1135698333, 0.0957015902, -0.0194320381, -0.2755332589, 0.2005995363, -0.1734855622, -0.5922012925, -0.079910107, -0.1618946195, -0.2040150613, 0.1059160382, -0.1329107732, 0.4099911749, -0.0166433081, -0.0567097925, -0.0946828872, -0.061073184, -0.0468452238, -0.096130684, -0.2189690471, 0.0503207222, -0.0369418859, 0.3914654851, -0.1495079696, 0.2168739587, -0.1971984655, 0.0422706082, -0.1725230068, -0.0299817435, -0.3739890456, 0.0167008601, -0.3672955334, -0.3203126192, -0.1461950839, 0.3489961624, 0.4700152576, 0.2422032952, 0.0681789517, 0.013583459, 0.0455531105, 0.2881104052, 0.0449518636, -0.4729408324, 0.0219208281, 0.0892641395, 0.0966317058, -0.3060151339, 0.1714947075, -0.3764618039, -0.0309771299, 0.0909723639, -0.1157448739, 0.0323730074, -0.2353323251, -0.0012708455, -0.1241160035, 0.0911292732, 0.0381634384, -0.0249626078, 0.2713150978, 0.4538407028, -0.4913174212, -0.4338658154, -0.2429764867, 0.2196907699, 0.2761553526, 0.0923032686, 0.4714729786, -0.1963157058, -0.201721251, -0.0035783276, -0.0369500965, -0.2587950826, 0.3299714625, 0.1093430594, -0.4681279659, -0.1054975763, -0.1262483001, 0.0081282798, 0.2813584208, 0.0266782958, -0.4296478033, -0.3005277514, -0.3171999454, 0.0161022767, -0.0597179607, 0.077256456, 0.0695808977, 0.1193823516, 0.089235425, 
-0.1950305551, -0.1829489917, 0.2048532218, -0.094770968, 0.1227942184, 0.0786331147, -0.0175768044, 0.4281575382, -0.1939685941, -0.0813814029, 0.143147245, 0.0738632828, -0.120118022, -0.049200058, 0.1548920721, 0.1974505335, -0.0815239847, -0.0458108671, -0.247217834, 0.070901975, -0.4901148677, 0.2950860858, 0.3348082006, 0.1295578182, -0.1613295376, 0.2049960345, -0.1187462434, -0.0410114042, -0.0701125264, -0.1410943121, 0.360034287, 0.1319337189, 0.2307246774, 0.3176656365, 0.0463443324, -0.0278106108, -0.0652954653, 0.2071723044, 0.4547815323, 0.1323384196, -0.1851796061, -0.0328353718, -0.0284979716, 0.1077325717, 0.0593161136, -0.085581407, -0.457080543, 0.3199623227, 0.042406179, 0.1196211427, -0.0851221234, 0.0694153681, -0.2036310285, -0.0116876587, 0.5102894902, 0.1288097203, 0.1602643132, 0.0473566465, -0.0059920736, 0.5263208151, -0.1773721129, 0.0566703081, -0.2004925758, -0.0103059635, 0.5074064136, 0.396110177, 0.052276969, -0.0095697865, 0.4942311645, 0.1224837303, 0.0646159425, 0.3955699205, -0.013042992, 0.0467665195, -0.1954601705, -0.0617314763, 0.1664906144, -0.2371864319, 0.2832925916, 0.1871670783, -0.0691947266, -0.3907461166, -0.0177042969, -0.1824211776, 0.2373393178, -0.2330201566, -0.2296691686, 0.1860866994, -0.1380094141, 0.0792408437, 0.3893226981, -0.1683450341, -0.180637747, 0.1558277309, 0.1771239489, -0.2397207022, 0.3626706302, -0.0289093778, 0.1323882341, 0.1518107355, 0.0500287116, 0.7001642585, -0.3557699323, 0.1334785521, -0.1025939584, 0.1700507253, 0.2860386968, 0.3981005549, -0.2447241992, -0.1887207329, 0.1763651073, 0.0670011491, -0.122494027, 0.2717929482, -0.1491892338, 0.0327151753, 0.0977705494, 0.0750926584, -0.0500115342, 0.1827303767, 0.1619612873, 0.163740173, 0.2166757137, 0.1712014228, -0.1385445148, 0.1400007904, -0.1463420242, 0.0411114059, -0.3651643991, 0.0879891515, 0.2932604253, -0.273449868, 0.4256700575, 0.2981342971, 0.0132940523, 0.17913872, 0.4029977918, 0.3924380839, 0.3411541283, -0.3213725984, 
-0.161723271, -0.2840149105, 0.363971889, -0.0266979951, 0.1606379002, -0.545521915, 0.079350777, -0.0012822337, -0.0612963028, 0.0257044025, 0.1679890454, -0.1000320166, 0.1487790644, -0.3114844859, -0.1130686104, -0.0751885176, -0.0874409825, -0.1050552577, -0.451382786, 0.3197159171, -0.0907449648, 0.0485184416, 0.0477200896, 0.1198864281, -0.2438070476, -0.1118511632, 0.4397117496, 0.2618429363, 0.1683412045, -0.2935879827, -0.0259574354, -0.2678938508, 0.0190907568, -0.3106366098, 0.0170400161, -0.0132975429, 0.3414577544, -0.2391726077, 0.0636950284, -0.3666889369, 0.1739199162, 0.2756898999, 0.0119206682, 0.0677400902, -0.0571765602, -0.267406106, 0.0780816674, 0.0970483571, 0.5984994769, 0.0942749754, 0.2917084098, -0.2487270832, -0.0865623653, 0.3026677966, -0.3510761857, -0.5694525242, 0.3708778918, 0.0836353078, 0.2353771031, -0.1646018326, -0.2726033032, 0.0098311007, 0.1228053793, -0.2463629693, -0.1148191839, 0.0384759046, 0.0152514279, 0.1341855824, -0.0320452936, 0.290769279, 0.1469574422, -0.4093949199, 0.0017059073, -0.238971889 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Not sure what could cause that on the `datasets` side. Could this be a `Trainer` issue? cc @julien-c @sgugger?
I tried to pretrain Longformer using transformers and datasets, but I got OOM issues when loading a large text file. My script is almost like this: ```python from dataclasses import dataclass from typing import Dict, List, Tuple import torch from datasets import load_dataset from transformers import DataCollatorForLanguageModeling @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling, based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the graph below, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
21
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the graph below, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. Not sure what could cause that on the `datasets` side. Could this be a `Trainer` issue? cc @julien-c @sgugger?
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
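The `datasets` library stores each dataset as a memory-mapped Apache Arrow file, so merely reading examples should not by itself grow resident memory: the OS pages data in on demand. A minimal stdlib sketch of that memory-mapping idea (the file name and sizes are illustrative, and this is not the library's actual I/O path):

```python
import mmap
import os
import tempfile

# Write a file large enough that copying it all into RAM would be wasteful.
path = os.path.join(tempfile.mkdtemp(), "train.bin")
with open(path, "wb") as f:
    f.write(b"x" * 1_000_000)  # 1 MB of placeholder data

with open(path, "rb") as f:
    # ACCESS_READ maps the file without copying it; only touched pages load.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first = bytes(mm[:10])   # slicing reads just the pages it needs
    last = bytes(mm[-10:])
    mm.close()

os.remove(path)
print(first, last)
```

If resident memory still grows steadily during training, the growth is more likely in state accumulated on the training-loop side than in the memory-mapped reads themselves.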
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
There was a memory leak issue fixed recently in master. You should install from source and see if it fixes your problem.
I tried to pretrain Longformer using transformers and datasets, but I got OOM issues when loading a large text file. My script is almost like this: ```python from dataclasses import dataclass from typing import Dict, List, Tuple import torch from datasets import load_dataset from transformers import DataCollatorForLanguageModeling @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling, based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the graph below, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
22
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the graph below, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. There was a memory leak issue fixed recently in master. You should install from source and see if it fixes your problem.
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@lhoestq @sgugger Thanks for your comments. I have installed from source as you suggested, but the problem is still there. To reproduce the issue, just replace [these lines](https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_language_modeling.py#L241-L258) with the following (`load_dataset` and `DataCollatorForDatasetsLanguageModeling` as [mentioned above](https://github.com/huggingface/datasets/issues/633#issue-702440484)):

```python
dataset = load_dataset("bookcorpus")
dataset = dataset.train_test_split(test_size=0.1)
train_dataset = dataset['train']
eval_dataset = dataset['test'] if training_args.do_eval else None
data_collator = DataCollatorForDatasetsLanguageModeling(
    tokenizer=tokenizer,
    mlm=data_args.mlm,
    mlm_probability=data_args.mlm_probability,
    block_size=data_args.block_size,
)
```

and run with:

```bash
python run_language_modeling.py --output_dir=output \
    --model_type=bert \
    --model_name_or_path=bert-base-uncased \
    --do_train \
    --do_eval \
    --mlm
```
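For reference, the 90/10 partition that `train_test_split(test_size=0.1)` produces can be illustrated on plain index lists — a minimal sketch of the idea, not the library's actual implementation (which operates on Arrow tables):

```python
import random

def train_test_split_indices(n_examples, test_size=0.1, seed=42):
    """Shuffle indices with a fixed seed and carve off a `test_size`
    fraction as the test set; the rest is the train set."""
    indices = list(range(n_examples))
    random.Random(seed).shuffle(indices)
    n_test = int(n_examples * test_size)
    return indices[n_test:], indices[:n_test]

train_idx, test_idx = train_test_split_indices(100, test_size=0.1)
print(len(train_idx), len(test_idx))  # 90 10
```

The two index sets are disjoint and together cover every example, which is the property the real split guarantees as well.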
I tried to pretrain Longformer using transformers and datasets, but I got OOM issues when loading a large text file. My script is roughly:

```python
from datasets import load_dataset

@dataclass
class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling):
    """
    Data collator used for language modeling based on DataCollatorForLazyLanguageModeling
    - collates batches of tensors, honoring their tokenizer's pad_token
    - preprocesses batches for masked language modeling
    """
    block_size: int = 512

    def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]:
        examples = [example['text'] for example in examples]
        batch, attention_mask = self._tensorize_batch(examples)
        if self.mlm:
            inputs, labels = self.mask_tokens(batch)
            return {"input_ids": inputs, "labels": labels}
        else:
            labels = batch.clone().detach()
            if self.tokenizer.pad_token_id is not None:
                labels[labels == self.tokenizer.pad_token_id] = -100
            return {"input_ids": batch, "labels": labels}

    def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]:
        if self.tokenizer._pad_token is None:
            raise ValueError(
                "You are attempting to pad samples but the tokenizer you are using"
                f" ({self.tokenizer.__class__.__name__}) does not have one."
            )
        tensor_examples = self.tokenizer.batch_encode_plus(
            [ex for ex in examples if ex],
            max_length=self.block_size,
            return_tensors="pt",
            pad_to_max_length=True,
            return_attention_mask=True,
            truncation=True,
        )
        input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"]
        return input_ids, attention_mask

dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train')
data_collator = DataCollatorForDatasetsLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,
    mlm_probability=0.15,
    block_size=tokenizer.max_len,
)
trainer = Trainer(
    model=model,
    args=args,
    data_collator=data_collator,
    train_dataset=dataset,
    prediction_loss_only=True,
)
trainer.train(model_path=model_path)
```

This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased rapidly, as shown in the following graph, and caused an OOM before training finished.

![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png)

Could you please give me any suggestions on why this happened and how to fix it? Thanks.
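One general way to keep memory flat with a file this size is to consume lines as a stream and process them in fixed-size batches, so at most one batch is ever materialized. The sketch below uses a stub `encode` function as a stand-in for a real tokenizer call — it is an illustration of the batching pattern, not the collator above:

```python
from typing import Iterator, List

def batched_lines(lines: Iterator[str], batch_size: int) -> Iterator[List[str]]:
    """Yield fixed-size batches of lines, holding at most one batch in memory."""
    batch = []
    for line in lines:
        batch.append(line)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

def encode(batch: List[str]) -> List[List[int]]:
    # Stub "tokenizer": maps each word to its length. In practice this would
    # be something like tokenizer.batch_encode_plus(batch, ...).
    return [[len(word) for word in line.split()] for line in batch]

lines = (f"example line {i}" for i in range(10))  # generator: nothing loaded up front
n_batches = sum(1 for _ in map(encode, batched_lines(lines, batch_size=4)))
print(n_batches)  # 3 (batches of 4 + 4 + 2)
```

Because `lines` is a generator and each encoded batch is dropped after use, peak memory is bounded by the batch size rather than the file size.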
80
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Same here. I am pre-training on wikitext-103 to run some tests. At the end of training it takes 32GB of RAM + ~30GB of swap. I installed datasets==1.1.0, not built from source. I will try uninstalling and building from source when it finishes.
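To tell whether the growth comes from Python objects accumulated in the training loop rather than from the memory-mapped dataset itself, peak heap allocations can be tracked with the standard-library `tracemalloc`. A small sketch contrasting an accumulating list with a streaming loop (the 1 KiB strings are stand-ins for examples):

```python
import tracemalloc

def peak_kib(fn):
    """Run fn and return the peak Python heap allocation in KiB."""
    tracemalloc.start()
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / 1024

def accumulate(n=10_000, size=1024):
    # Keeps every "example" alive at once -> peak grows with n.
    kept = ["x" * size for _ in range(n)]
    return len(kept)

def stream(n=10_000, size=1024):
    # Processes one "example" at a time -> peak stays roughly constant.
    total = 0
    for _ in range(n):
        total += len("x" * size)
    return total

print(f"accumulate peak: {peak_kib(accumulate):.0f} KiB")
print(f"stream peak:     {peak_kib(stream):.0f} KiB")
```

Running the same measurement around a few training steps would show whether the collator or trainer is retaining references to past batches.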
42
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
This seems to be on the `transformers` library side. If you have more information (a pip environment dump) or, even better, a Colab notebook reproducing the error, we can investigate.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
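The memory growth described above is typically avoided by iterating the corpus lazily rather than materializing it all at once — `datasets` does this by memory-mapping Arrow files on disk. As a rough, self-contained analogy (the file path and sizes below are made up for illustration, not taken from the issue), a plain streaming read holds only one line in memory at a time:

```python
import os
import tempfile

# Build a small stand-in "corpus" on disk (sizes are illustrative only).
path = os.path.join(tempfile.mkdtemp(), "train.txt")
with open(path, "w") as f:
    for _ in range(10000):
        f.write("token " * 50 + "\n")

def count_lines(p):
    """Stream the file line by line; memory use stays flat regardless of file size."""
    n = 0
    with open(p) as f:
        for _ in f:
            n += 1
    return n

print(count_lines(path))  # 10000
```

The same principle is why `load_dataset('text', ...)` itself is not the culprit here: the text column lives on disk, and only the rows touched by the collator are pulled into RAM.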
27
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. This seems to be on the `transformers` library side. If you have more information (a pip environment dump) or, even better, a Colab notebook reproducing the error, we can investigate.
[ … (embedding vector values omitted) … ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
It seems to be solved with fresh versions of transformers. I tried to replicate the error by doing a fresh pip install of transformers & datasets on Colab, and the error no longer occurs. On Colab, memory stays stable at 5GB! (Y) Edit: **Thanks for your great work**. Have a good day.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
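One caveat in the script above: with `max_length=self.block_size` and `truncation=True`, each 4k-word line is cut down to 512 tokens and the rest of the line is silently discarded. A common alternative is to split each long sequence into consecutive fixed-size blocks. A minimal sketch — the helper `chunk_tokens` is hypothetical, not part of either library:

```python
def chunk_tokens(token_ids, block_size=512):
    """Split one long sequence of token ids into consecutive fixed-size blocks.

    The final block keeps the remainder instead of dropping it.
    """
    return [token_ids[i:i + block_size] for i in range(0, len(token_ids), block_size)]

ids = list(range(4000))   # stand-in for ~4k tokens from one line of train.txt
blocks = chunk_tokens(ids)
print(len(blocks), len(blocks[-1]))  # 8 416  (7 full 512-token blocks plus a 416-token remainder)
```

Each block can then be fed to the collator as a separate training example, so none of the corpus is thrown away by truncation.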
50
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. It seems to be solved with fresh versions of transformers. I tried to replicate the error by doing a fresh pip install of transformers & datasets on Colab, and the error no longer occurs. On Colab, memory stays stable at 5GB! (Y) Edit: **Thanks for your great work**. Have a good day.
[ … (embedding vector values omitted) … ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@gaceladri which versions of transformers and datasets are you using now? I want to try again. Thanks.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
16
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. @gaceladri which versions of transformers and datasets are you using now? I want to try again. Thanks.
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
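The record above reports OOM while loading and tokenizing a 1.1GB text file for LM pre-training. A minimal stdlib-only sketch of the underlying remedy: stream the file line by line and emit fixed-size token blocks, so peak memory is bounded by one block rather than the whole corpus. The toy corpus and whitespace "tokenizer" here are illustrative stand-ins, not the real `datasets`/`transformers` API.

```python
from typing import Iterator, List

def stream_blocks(lines: Iterator[str], block_size: int) -> Iterator[List[str]]:
    """Yield lists of whitespace tokens, each at most `block_size` long.

    Only the current buffer is ever held in memory, so a corpus of any
    size can be consumed without loading it all at once.
    """
    buffer: List[str] = []
    for line in lines:
        buffer.extend(line.split())
        while len(buffer) >= block_size:
            yield buffer[:block_size]
            buffer = buffer[block_size:]
    if buffer:  # flush the remainder as a final, shorter block
        yield buffer

# Toy corpus standing in for a large train.txt; in practice this would be
# a file handle iterated lazily, e.g. `with open("train.txt") as f: ...`
corpus = ["a b c d e", "f g h", "i j k l"]
blocks = list(stream_blocks(iter(corpus), block_size=4))
# blocks == [['a', 'b', 'c', 'd'], ['e', 'f', 'g', 'h'], ['i', 'j', 'k', 'l']]
```

This is the same principle the `datasets` library applies via Apache Arrow memory-mapping: data stays on disk and only the slice being processed occupies RAM.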
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
It's happening to me again. After 4 hours of pre-training, my RAM fills up and the kernel dies. I am using the latest transformers version as of today, 4.4.0, and the latest version of datasets, 1.2.1, both installed from master. Memory consumption keeps increasing.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
45
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. It's happening to me again. After 4 hours of pre-training, my RAM fills up and the kernel dies. I am using the latest transformers version as of today, 4.4.0, and the latest version of datasets, 1.2.1, both installed from master. Memory consumption keeps increasing.
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Thanks for the investigation @gaceladri Apparently this happens when `num_workers>0` and has to do with objects being copied-on-write. Did you try setting num_workers to 0 @gaceladri ? If the issue doesn't happen with `num_workers=0` then this would confirm that it's indeed related to this python/pytorch issue. Since a `Dataset` object is a wrapper of a pyarrow Table, we should investigate if the data being copied comes from the Table itself or from metadata in the `Dataset` object. If it comes from the metadata in the `Dataset` object, we should be able to implement a workaround. But if it comes from the Table, we'll need to see with the pyarrow team what we can do...
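The copy-on-write mechanism suspected above can be illustrated with a minimal, standard-library-only sketch (purely illustrative — no pyarrow or torch involved; the list of strings stands in for the inherited `Dataset` state):

```python
import multiprocessing as mp

# Module-level data, inherited by forked worker processes,
# the same way DataLoader workers inherit the parent's Dataset.
data = ["x" * 100 for _ in range(1000)]

def worker(_):
    # Merely *reading* inherited Python objects updates their refcounts,
    # which dirties the copy-on-write pages they live on -- so each
    # forked worker gradually "copies" memory it never mutates.
    return sum(len(s) for s in data)

if __name__ == "__main__":
    with mp.Pool(2) as pool:
        # Each worker touches the full list, dirtying its pages.
        print(pool.map(worker, range(2)))
```

With `num_workers=0` there is no fork at all, so this effect disappears — which is why that setting is a useful diagnostic.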
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased rapidly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
114
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased rapidly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. Thanks for the investigation @gaceladri Apparently this happens when `num_workers>0` and has to do with objects being copied-on-write. Did you try setting num_workers to 0 @gaceladri ? If the issue doesn't happen with `num_workers=0` then this would confirm that it's indeed related to this python/pytorch issue. Since a `Dataset` object is a wrapper of a pyarrow Table, we should investigate if the data being copied comes from the Table itself or from metadata in the `Dataset` object. If it comes from the metadata in the `Dataset` object, we should be able to implement a workaround. But if it comes from the Table, we'll need to see with the pyarrow team what we can do...
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Hmmm so this might come from another issue... Since it doesn't seem to be related to multiprocessing it should be easier to investigate though. Do you have some ideas @gaceladri ?
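Bracketing suspect sections of the training loop with a peak-RSS probe is one way to narrow the investigation down. A minimal standard-library sketch (the `bytearray` is just an illustrative stand-in for whatever the loop actually accumulates):

```python
import resource
import sys

def peak_rss_mb() -> float:
    """Peak resident set size of this process, in MB."""
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in kilobytes on Linux, but in bytes on macOS.
    return peak / 1024 if sys.platform != "darwin" else peak / (1024 * 1024)

before = peak_rss_mb()
blob = bytearray(50 * 1024 * 1024)  # stand-in for real allocations (~50 MB)
after = peak_rss_mb()
print(f"peak RSS grew by ~{after - before:.0f} MB")
```

Moving the probe between the dataset iteration, tokenization, and optimizer steps should reveal which stage the growth comes from.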
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased rapidly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
31
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased rapidly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. Hmmm so this might come from another issue... Since it doesn't seem to be related to multiprocessing it should be easier to investigate though. Do you have some ideas @gaceladri ?
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
…, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@lhoestq I took a quick look at a previously spotted bug in my env, wandb/sdk/interface/interface.py, because sometimes when I load the dataset I got a multiprocessing error at line 510 in wandb...interface.py. This bug is reported here https://github.com/huggingface/datasets/issues/847 ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) <timed eval> in <module> ~/anaconda3/envs/tfm/lib/python3.6/site-packages/transformers/trainer.py in train(self, model_path, trial) 877 print(len(epoch_iterator)) 878 --> 879 for step, inputs in enumerate(epoch_iterator): 880 881 start_step = time.time() ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in __next__(self) 433 if self._sampler_iter is None: 434 self._reset() --> 435 data = self._next_data() 436 self._num_yielded += 1 437 if self._dataset_kind == _DatasetKind.Iterable and \ ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _next_data(self) 1083 else: 1084 del self._task_info[idx] -> 1085 return self._process_data(data) 1086 1087 def _try_put_index(self): ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _process_data(self, data) 1109 self._try_put_index() 1110 if isinstance(data, ExceptionWrapper): -> 1111 data.reraise() 1112 return data 1113 ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/_utils.py in reraise(self) 426 # have message field 427 raise self.exc_type(message=msg) --> 428 raise self.exc_type(msg) 429 430 AssertionError: Caught AssertionError in DataLoader worker process 0. 
Original Traceback (most recent call last): File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 198, in _worker_loop data = fetcher.fetch(index) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch data = [self.dataset[idx] for idx in possibly_batched_index] File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp> data = [self.dataset[idx] for idx in possibly_batched_index] File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1083, in __getitem__ format_kwargs=self._format_kwargs, File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1070, in _getitem format_kwargs=format_kwargs, File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 886, in _convert_outputs v = map_nested(command, v, **map_nested_kwargs) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/utils/py_utils.py", line 216, in map_nested return function(data_struct) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 847, in command return torch.tensor(x, **format_kwargs) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/warnings.py", line 101, in _showwarnmsg _showwarnmsg_impl(msg) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/warnings.py", line 30, in _showwarnmsg_impl file.write(text) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/lib/redirect.py", line 100, in new_write cb(name, data) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/wandb_run.py", line 729, in _console_callback self._backend.interface.publish_output(name, data) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 186, in publish_output self._publish_output(o) File 
"/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 191, in _publish_output self._publish(rec) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 510, in _publish if self._process and not self._process.is_alive(): File "/home/ad/anaconda3/envs/tfm/lib/python3.6/multiprocessing/process.py", line 134, in is_alive assert self._parent_pid == os.getpid(), 'can only test a child process' AssertionError: can only test a child process ``` My workaround was to just comment out those lines without looking too much into the consequences: ``` def _publish(self, record: pb.Record, local: bool = None) -> None: #if self._process and not self._process.is_alive(): # raise Exception("The wandb backend process has shutdown") ``` It has worked so far... I need to try running without wandb and see if it could be causing something wrong with multiprocessing. I am going to try launching the training with wandb set to false and I will let you know again.
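The quickest sanity check may be to take wandb out of the loop entirely before `Trainer` starts, rather than patching `_publish`. A minimal sketch, assuming the `WANDB_DISABLED` environment variable is honored by the wandb/transformers versions in use (the helper `wandb_is_disabled` is purely illustrative, not part of either library):

```python
import os

# Assumption: WANDB_DISABLED is read at init time, so it must be set
# before the Trainer (or wandb itself) is first imported/used.
os.environ["WANDB_DISABLED"] = "true"

def wandb_is_disabled() -> bool:
    # Illustrative helper mirroring the kind of check wandb performs.
    return os.environ.get("WANDB_DISABLED", "").lower() == "true"

print(wandb_is_disabled())  # True once the variable is set
```

If the multiprocessing assertion disappears with wandb off, the commented-out check was only masking a wandb-side symptom rather than anything related to the OOM itself.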
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
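Independently of the collator, one way to keep a corpus this size out of RAM is to stream it: read `train.txt` line by line and cut it into fixed-size token blocks on the fly. A minimal stdlib sketch of the idea (the helper name `iter_blocks` and the whitespace tokenization are assumptions for illustration, not part of the original script):

```python
import os
import tempfile

def iter_blocks(path, block_size=4):
    """Lazily yield lists of up to `block_size` whitespace tokens.

    Reads the file one line at a time, so peak memory stays bounded by
    one line plus one block instead of the whole corpus.
    """
    buf = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            buf.extend(line.split())
            while len(buf) >= block_size:
                yield buf[:block_size]
                buf = buf[block_size:]
    if buf:
        yield buf

# Tiny demo on a temp file standing in for train.txt.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("a b c d e f g\nh i\n")
    tmp = f.name
blocks = list(iter_blocks(tmp, block_size=4))
os.unlink(tmp)
print(blocks)  # [['a', 'b', 'c', 'd'], ['e', 'f', 'g', 'h'], ['i']]
```

Peak memory is then bounded regardless of file size; the memory-mapped Arrow files that `datasets` produces are meant to give the same property for free, which is why growth during training is suspicious.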
396
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. 
@lhoestq I took a quick look at a previously spotted bug in my env, wandb/sdk/interface/interface.py, because sometimes when I load the dataset I got a multiprocessing error at line 510 in wandb...interface.py. This bug is reported here https://github.com/huggingface/datasets/issues/847 ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) <timed eval> in <module> ~/anaconda3/envs/tfm/lib/python3.6/site-packages/transformers/trainer.py in train(self, model_path, trial) 877 print(len(epoch_iterator)) 878 --> 879 for step, inputs in enumerate(epoch_iterator): 880 881 start_step = time.time() ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in __next__(self) 433 if self._sampler_iter is None: 434 self._reset() --> 435 data = self._next_data() 436 self._num_yielded += 1 437 if self._dataset_kind == _DatasetKind.Iterable and \ ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _next_data(self) 1083 else: 1084 del self._task_info[idx] -> 1085 return self._process_data(data) 1086 1087 def _try_put_index(self): ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _process_data(self, data) 1109 self._try_put_index() 1110 if isinstance(data, ExceptionWrapper): -> 1111 data.reraise() 1112 return data 1113 ~/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/_utils.py in reraise(self) 426 # have message field 427 raise self.exc_type(message=msg) --> 428 raise self.exc_type(msg) 429 430 AssertionError: Caught AssertionError in DataLoader worker process 0. 
Original Traceback (most recent call last): File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 198, in _worker_loop data = fetcher.fetch(index) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch data = [self.dataset[idx] for idx in possibly_batched_index] File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp> data = [self.dataset[idx] for idx in possibly_batched_index] File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1083, in __getitem__ format_kwargs=self._format_kwargs, File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1070, in _getitem format_kwargs=format_kwargs, File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 886, in _convert_outputs v = map_nested(command, v, **map_nested_kwargs) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/utils/py_utils.py", line 216, in map_nested return function(data_struct) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 847, in command return torch.tensor(x, **format_kwargs) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/warnings.py", line 101, in _showwarnmsg _showwarnmsg_impl(msg) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/warnings.py", line 30, in _showwarnmsg_impl file.write(text) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/lib/redirect.py", line 100, in new_write cb(name, data) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/wandb_run.py", line 729, in _console_callback self._backend.interface.publish_output(name, data) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 186, in publish_output self._publish_output(o) File 
"/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 191, in _publish_output self._publish(rec) File "/home/ad/anaconda3/envs/tfm/lib/python3.6/site-packages/wandb/sdk/interface/interface.py", line 510, in _publish if self._process and not self._process.is_alive(): File "/home/ad/anaconda3/envs/tfm/lib/python3.6/multiprocessing/process.py", line 134, in is_alive assert self._parent_pid == os.getpid(), 'can only test a child process' AssertionError: can only test a child process ``` My workaround was to just comment out those lines without looking too much into the consequences: ``` def _publish(self, record: pb.Record, local: bool = None) -> None: #if self._process and not self._process.is_alive(): # raise Exception("The wandb backend process has shutdown") ``` It has worked so far... I need to try running without wandb and see if it could be causing something wrong with multiprocessing. I am going to try launching the training with wandb set to false and I will let you know again.
[ -0.6339286566, -0.4775367379, …, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@lhoestq But despite this, I got lost in the [class Dataset()](https://huggingface.co/docs/datasets/_modules/datasets/arrow_dataset.html#Dataset) reading the pyarrow files. Edit: but you should be right that it does not have to be related to multiprocessing, since it keeps happening when `num_workers=0`
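One way to tell Python-side accumulation apart from memory-mapped pyarrow pages is to trace allocations while iterating. A rough stdlib sketch (the helper `traced_peak_mb` is hypothetical; note that `tracemalloc` only sees Python-level allocations, so mmap'd Arrow buffers and native tensor storage will not show up here):

```python
import tracemalloc

def traced_peak_mb(fn):
    """Run fn() and return the peak Python-level allocation in MiB."""
    tracemalloc.start()
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / (1024 * 1024)

# A generator keeps the peak tiny; materializing the same data does not.
lazy_peak = traced_peak_mb(lambda: sum(x * x for x in range(200_000)))
eager_peak = traced_peak_mb(lambda: sum([x * x for x in range(200_000)]))
print(lazy_peak < eager_peak)  # the list version allocates far more at once
```

If the process RSS keeps climbing while a trace like this stays flat, the growth is more likely in native buffers held by the training loop than in the dataset object itself.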
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
37
Load large text file for LM pre-training resulting in OOM

I tried to pretrain Longformer using transformers and datasets, but I got OOM issues when loading a large text file. My script is almost like this:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

import torch
from datasets import load_dataset
from transformers import DataCollatorForLanguageModeling, Trainer


@dataclass
class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling):
    """
    Data collator used for language modeling based on DataCollatorForLazyLanguageModeling
    - collates batches of tensors, honoring their tokenizer's pad_token
    - preprocesses batches for masked language modeling
    """

    block_size: int = 512

    def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]:
        examples = [example['text'] for example in examples]
        batch, attention_mask = self._tensorize_batch(examples)
        if self.mlm:
            inputs, labels = self.mask_tokens(batch)
            return {"input_ids": inputs, "labels": labels}
        else:
            labels = batch.clone().detach()
            if self.tokenizer.pad_token_id is not None:
                labels[labels == self.tokenizer.pad_token_id] = -100
            return {"input_ids": batch, "labels": labels}

    def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]:
        if self.tokenizer._pad_token is None:
            raise ValueError(
                "You are attempting to pad samples but the tokenizer you are using"
                f" ({self.tokenizer.__class__.__name__}) does not have one."
            )
        tensor_examples = self.tokenizer.batch_encode_plus(
            [ex for ex in examples if ex],
            max_length=self.block_size,
            return_tensors="pt",
            pad_to_max_length=True,
            return_attention_mask=True,
            truncation=True,
        )
        input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"]
        return input_ids, attention_mask


dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train')
data_collator = DataCollatorForDatasetsLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len
)
trainer = Trainer(
    model=model,
    args=args,
    data_collator=data_collator,
    train_dataset=dataset,
    prediction_loss_only=True,
)
trainer.train(model_path=model_path)
```

This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased rapidly, as shown in the following graph, and resulted in an OOM before training finished.

![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png)

Could you please give me any suggestions on why this happened and how to fix it? Thanks.

@lhoestq But despite this, I got lost in the [class Dataset()](https://huggingface.co/docs/datasets/_modules/datasets/arrow_dataset.html#Dataset) code while reading the pyarrow files.

Edit: but you should be right that it does not have to be related to multiprocessing, since it keeps happening when `num_workers=0`.
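The OOM pattern above is typical of materializing every line of the file as Python objects up front. A minimal, dependency-free sketch of the lazy alternative (assuming a plain line-per-example text file like `train.txt`): index the byte offset of each line once, then seek and read single lines on demand, so resident memory is proportional to the number of lines rather than the file size. The class name `LazyLineDataset` is illustrative only, not part of any library.

```python
import tempfile


class LazyLineDataset:
    """Read one line per __getitem__ instead of materializing the whole file.

    Only the byte offset where each line starts is kept in memory
    (one int per line), so a multi-GB text file never has to fit in RAM.
    """

    def __init__(self, path):
        self.path = path
        self.offsets = []  # byte offset where each line starts
        pos = 0
        with open(path, "rb") as f:
            for line in f:
                self.offsets.append(pos)
                pos += len(line)

    def __len__(self):
        return len(self.offsets)

    def __getitem__(self, i):
        # Re-open and seek; the OS page cache makes repeated reads cheap.
        with open(self.path, "rb") as f:
            f.seek(self.offsets[i])
            return f.readline().decode("utf-8").rstrip("\n")


# Tiny usage example with a throwaway file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("first line\nsecond line\nthird line\n")
    path = tmp.name

ds = LazyLineDataset(path)
n, second = len(ds), ds[1]
print(n, second)  # → 3 second line
```

A `torch.utils.data.Dataset` subclass built this way can be handed straight to a `DataLoader`; the `datasets` library achieves the same effect by memory-mapping Arrow files instead of seeking in raw text.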
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Or maybe wandb uses multiprocessing? One process for wandb logging and one for the actual training? If this is the case, then even setting `num_workers=0` would cause the process to be forked for wandb and therefore cause the memory issue.
@lhoestq could be, but if we set wandb to false this should not happen. I am going to try.
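One way to run that experiment: at the time, the wandb integration in `transformers` could be switched off with the `WANDB_DISABLED` environment variable, as long as it is set before the `Trainer` is instantiated. A minimal sketch (later `transformers` versions expose `report_to` on `TrainingArguments` as the supported switch):

```python
import os

# Set before creating the Trainer so the wandb callback is never
# registered and no wandb logging process is started.
os.environ["WANDB_DISABLED"] = "true"
print(os.environ["WANDB_DISABLED"])  # → true
```

If memory still grows with wandb disabled and `num_workers=0`, the leak is in the training loop itself rather than in a forked logging process.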
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt',cache_dir="./", , split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines where each line is a sequence of 4k words. During training, the memory usage increased fast as the following graph and resulted in OOM before the finish of training. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
19
Load large text file for LM pre-training resulting in OOM

I tried to pretrain Longformer using transformers and datasets, but I ran out of memory while training on a large text file. My script is essentially this:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

import torch
from transformers import DataCollatorForLanguageModeling, Trainer

from datasets import load_dataset


@dataclass
class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling):
    """
    Data collator used for language modeling based on DataCollatorForLazyLanguageModeling
    - collates batches of tensors, honoring their tokenizer's pad_token
    - preprocesses batches for masked language modeling
    """

    block_size: int = 512

    def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]:
        examples = [example['text'] for example in examples]
        batch, attention_mask = self._tensorize_batch(examples)
        if self.mlm:
            inputs, labels = self.mask_tokens(batch)
            return {"input_ids": inputs, "labels": labels}
        else:
            labels = batch.clone().detach()
            if self.tokenizer.pad_token_id is not None:
                labels[labels == self.tokenizer.pad_token_id] = -100
            return {"input_ids": batch, "labels": labels}

    def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]:
        if self.tokenizer._pad_token is None:
            raise ValueError(
                "You are attempting to pad samples but the tokenizer you are using"
                f" ({self.tokenizer.__class__.__name__}) does not have one."
            )
        tensor_examples = self.tokenizer.batch_encode_plus(
            [ex for ex in examples if ex],
            max_length=self.block_size,
            return_tensors="pt",
            pad_to_max_length=True,
            return_attention_mask=True,
            truncation=True,
        )
        return tensor_examples["input_ids"], tensor_examples["attention_mask"]


dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train')
data_collator = DataCollatorForDatasetsLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,
    mlm_probability=0.15,
    block_size=tokenizer.max_len,
)
trainer = Trainer(
    model=model,
    args=args,
    data_collator=data_collator,
    train_dataset=dataset,
    prediction_loss_only=True,
)
trainer.train(model_path=model_path)
```

This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of about 4k words. During training, memory usage grew quickly, as shown in the graph below, and the run hit OOM before training finished.

![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png)

Could you please give me any suggestions on why this happens and how to fix it? Thanks.

@lhoestq It could be, but if we set wandb to false this should not happen. I am going to try.
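A quick back-of-envelope check on the numbers in the report above (the byte and token ratios here are assumptions, not from the issue): at ~1.1GB over 90k lines of ~4k words each, a word costs roughly 3 bytes on disk, and with `block_size=512` plus `truncation=True` only about an eighth of each line ever reaches the model:

```python
# Rough arithmetic on the reported sizes; per-word byte and token
# ratios are assumptions for illustration.
lines = 90_000
words_per_line = 4_000

bytes_per_word = 1.1e9 / (lines * words_per_line)  # ~3 bytes incl. separators
print(round(bytes_per_word, 1))

# Assuming roughly one token per word, truncation at block_size=512
# keeps only a small fraction of every 4k-word line.
kept_fraction = 512 / words_per_line
print(round(kept_fraction, 3))
```

So whatever is accumulating, it is not the truncated-away text itself; the growth has to come from something retained across steps.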
https://github.com/huggingface/datasets/issues/633
@lhoestq It keeps happening. I have uninstalled wandb from my env, set `%env WANDB_DISABLED=true` in my notebook, and commented out this function:

```python
def get_available_reporting_integrations():
    integrations = []
    if is_azureml_available():
        integrations.append("azure_ml")
    if is_comet_available():
        integrations.append("comet_ml")
    if is_mlflow_available():
        integrations.append("mlflow")
    if is_tensorboard_available():
        integrations.append("tensorboard")
    # if is_wandb_available():
    #     integrations.append("wandb")
    return integrations
```

As a quick test, RAM usage keeps increasing, so wandb cannot be the culprit here.
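Two cheap checks make the same point without patching the library source (hedged: `WANDB_DISABLED` is the env var used in the comment above; how a given transformers release interprets it may differ):

```python
import os
import sys

# 1) Set the switch before transformers/Trainer are imported.
os.environ["WANDB_DISABLED"] = "true"

# 2) If wandb's code was never imported, it cannot be allocating memory
#    in this process; check sys.modules at any point during training.
print("wandb" in sys.modules)
```

If that check prints `False` while memory still climbs, the leak is elsewhere.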
Thanks for checking @gaceladri. Let's investigate the single-process setting then. If you have some sort of Colab notebook with a minimal code example that shows this behavior, feel free to share it so that we can play around with it to find what causes this. Otherwise I'll probably try to reproduce it on my side at some point.
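A stdlib-only skeleton for such a minimal repro (hypothetical stand-ins throughout: a small generated `train.txt`, a plain line iterator in place of `load_dataset`, and a no-op training step; swap those pieces for the real datasets/Trainer calls and watch whether the RSS log climbs with the step count):

```python
import os
import resource  # Unix-only; ru_maxrss is KB on Linux, bytes on macOS
import tempfile

# Generate a small stand-in for train.txt.
tmp = tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False)
for _ in range(1_000):
    tmp.write(("word " * 50).strip() + "\n")
tmp.close()

def batches(path, batch_size=32):
    """Stand-in for iterating the datasets-backed DataLoader."""
    batch = []
    with open(path) as f:
        for line in f:
            batch.append(line.rstrip("\n"))
            if len(batch) == batch_size:
                yield batch
                batch = []
    if batch:
        yield batch

rss_log = []
for step, batch in enumerate(batches(tmp.name)):
    # Stand-in for one training step on `batch`; log peak RSS per step.
    rss_log.append(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)

os.unlink(tmp.name)
print(len(rss_log))
```

With the real pipeline swapped in, a steadily climbing `rss_log` across many steps points at an accumulation per step rather than a one-off allocation.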
60
Load large text file for LM pre-training resulting in OOM

Thanks for checking @gaceladri. Let's investigate the single-process setting then. If you have some sort of Colab notebook with a minimal code example that shows this behavior, feel free to share it @gaceladri so that we can play around with it to find what causes this. Otherwise I'll probably try to reproduce on my side at one point.
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@lhoestq sure. Here you have https://colab.research.google.com/drive/1ba09ZOpyHGAOQLcsxiQAHRXl10qnMU5o?usp=sharing — let me know if the link works and whether it reproduces the issue. For me it does: once training starts, RAM usage keeps increasing. Let me know. Thanks!
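For anyone trying to reproduce this locally, a cheap first check is to watch Python-level allocations across steps with the stdlib `tracemalloc` module. This is only a rough sketch: it will not see RSS growth coming from C extensions (PyTorch tensors, Arrow buffers), so a flat `tracemalloc` trace alongside growing RSS points at native allocations rather than Python objects.

```python
import tracemalloc

tracemalloc.start()

data = []
for step in range(3):
    # Stand-in for one "training step" that allocates Python objects.
    data.append([0] * 10_000)
    current, peak = tracemalloc.get_traced_memory()
    print(f"step {step}: current={current} bytes, peak={peak} bytes")

tracemalloc.stop()
```

If `current` climbs monotonically here in your real loop, something is holding references to per-step Python objects; if it stays flat while the process RSS grows, look at native-side buffers instead.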
39
Could the bug be coming from tokenizers? I got this warning in the terminal from my Jupyter notebook:

```
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
```
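The warning itself names the knob: the `TOKENIZERS_PARALLELISM` environment variable, which must be set before the DataLoader workers fork. A minimal way to silence it at the top of a notebook (the warning is about deadlock avoidance, so this should not by itself explain the memory growth):

```python
import os

# Set before any tokenizer use so the Rust-backed tokenizers library
# does not re-enable thread parallelism in forked worker processes.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

print(os.environ["TOKENIZERS_PARALLELISM"])  # false
```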
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
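The `datasets` library stores data on disk in Arrow files and memory-maps them, so reading examples should not require the whole 1.1GB file to be resident in RAM. The same idea can be illustrated with the standard-library `mmap` module (this sketch is illustrative only, not the library's actual implementation; file sizes are scaled down):

```python
import mmap
import os
import tempfile

# Write a small stand-in for train.txt; in the real case this file is ~1.1GB.
path = os.path.join(tempfile.mkdtemp(), "train.txt")
with open(path, "w") as f:
    for i in range(1000):
        f.write(f"line {i} " + "word " * 10 + "\n")

# Memory-map the file: the OS pages data in on demand, so iterating over
# lines does not require holding the whole file in process memory.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first_line = mm.readline().decode()
    mm.seek(0)
    n_lines = sum(1 for _ in iter(mm.readline, b""))
    mm.close()

print(first_line.split()[:2])  # → ['line', '0']
print(n_lines)  # → 1000
```

If memory still grows steadily during training, the leak is more likely in objects accumulated by the training loop (raw strings kept by the collator, cached batches) than in the memory-mapped dataset itself.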
63
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. Could the bug be coming from tokenizers? I got this warning at the terminal from my Jupyter notebook: ``` huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ```
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
I've never experienced memory issues with tokenizers, so I don't know. Cc @n1t0: are you aware of any issue that would cause memory to keep increasing when the tokenizer is used in the data collator for language modeling?
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
39
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, the memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. I've never experienced memory issues with tokenizers, so I don't know. Cc @n1t0: are you aware of any issue that would cause memory to keep increasing when the tokenizer is used in the data collator for language modeling?
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
@lhoestq Thanks for pointing to @n1t0. Just to clarify: that warning appeared during fine-tuning, without a collator: ``` from datasets import load_dataset, load_metric import numpy as np GLUE_TASKS = [ "cola", "mnli", "mnli-mm", "mrpc", "qnli", "qqp", "rte", "sst2", "stsb", "wnli", ] task = "mnli" actual_task = "mnli" if task == "mnli-mm" else task dataset = load_dataset("glue", actual_task) metric = load_metric("glue", actual_task) batch_size = 16 attention_type = "linear" from transformers.models.mobilebert_mod import ( MobileBertForSequenceClassification, MobileBertTokenizerFast, ) from transformers.models.mobilebert_mod.configuration_mobilebert import ( MobileBertConfigMod, ) from transformers import TrainingArguments, Trainer num_labels = 3 if task.startswith("mnli") else 1 if task == "stsb" else 2 tokenizer = MobileBertTokenizerFast.from_pretrained( "/media/ad/00b5422b-9d54-4449-8b5d-08eab5cdac8c/training_trfm/big_linear_layerdrop_shared/checkpoint-23000/", max_len=512, ) model = MobileBertForSequenceClassification.from_pretrained( "/media/ad/00b5422b-9d54-4449-8b5d-08eab5cdac8c/training_trfm/big_linear_layerdrop_shared/checkpoint-23000/", num_labels=num_labels, ) print(model.num_parameters()) task_to_keys = { "cola": ("sentence", None), "mnli": ("premise", "hypothesis"), "mnli-mm": ("premise", "hypothesis"), "mrpc": ("sentence1", "sentence2"), "qnli": ("question", "sentence"), "qqp": ("question1", "question2"), "rte": ("sentence1", "sentence2"), "sst2": ("sentence", None), "stsb": ("sentence1", "sentence2"), "wnli": ("sentence1", "sentence2"), } sentence1_key, sentence2_key = task_to_keys[task] if sentence2_key is None: print(f"Sentence: {dataset['train'][0][sentence1_key]}") else: print(f"Sentence 1: {dataset['train'][0][sentence1_key]}") print(f"Sentence 2: {dataset['train'][0][sentence2_key]}") def preprocess_function(examples): if sentence2_key is None: return tokenizer(examples[sentence1_key], truncation=True) return tokenizer(examples[sentence1_key], 
examples[sentence2_key], truncation=True) encoded_dataset = dataset.map(preprocess_function, batched=True) metric_name = ( "pearson" if task == "stsb" else "matthews_correlation" if task == "cola" else "accuracy" ) args = TrainingArguments( f"test-glue/{task}_{attention_type}", evaluation_strategy="steps", learning_rate=1e-5, per_device_train_batch_size=batch_size, per_device_eval_batch_size=batch_size, logging_steps=200, num_train_epochs=5, gradient_accumulation_steps=1, warmup_steps=10000, fp16=True, dataloader_num_workers=10, weight_decay=0.1, load_best_model_at_end=True, metric_for_best_model=metric_name, ) def compute_metrics(eval_pred): predictions, labels = eval_pred if task != "stsb": predictions = np.argmax(predictions, axis=1) else: predictions = predictions[:, 0] return metric.compute(predictions=predictions, references=labels) validation_key = ( "validation_mismatched" if task == "mnli-mm" else "validation_matched" if task == "mnli" else "validation" ) trainer = Trainer( model, args, train_dataset=encoded_dataset["train"], eval_dataset=encoded_dataset[validation_key], tokenizer=tokenizer, compute_metrics=compute_metrics, ) trainer.train() ``` Now, I have come back to pre-training. The changes that I think I have done are: not formatting the dataset to torch: ~~`big_dataset.set_format(type='torch', columns=["text", "input_ids", "attention_mask", "token_type_ids"])`~~ so maybe some column is dropped and not freezed in memory and now I have not setted any validation dataset in the trainer. 
My validation dataset before: ``` book_corpus_eval = load_dataset( "bookcorpus", "plain_text", cache_dir="/home/ad/Desktop/bookcorpus", split="train[98:99%]", ) book_corpus_eval = book_corpus_eval.map(encode, batched=True) book_corpus_eval.set_format( type="torch", columns=["text", "input_ids", "attention_mask", "token_type_ids"] ) **book_corpus_eval = book_corpus_eval.select([i for i in range(1500)])** ``` Maybe _selecting_ or indexing the dataset before feeding it to the trainer, do something strange. My trainer now: ``` big_dataset = load_from_disk("/home/ad/Desktop/35percent_data.arrow/") from transformers import DataCollatorForWholeWordMask data_collator = DataCollatorForWholeWordMask( tokenizer=tokenizer, mlm=True, mlm_probability=0.15) from transformers import Trainer, TrainingArguments training_args = TrainingArguments( output_dir="./big_linear_layerdrop_shared_silu_secondtry", overwrite_output_dir=True, per_device_train_batch_size=60, per_device_eval_batch_size=60, save_steps=500, save_total_limit=10, logging_first_step=True, logging_steps=100, # evaluation_strategy='steps', # eval_steps=250, gradient_accumulation_steps=8, fp16=True, dataloader_num_workers=10, warmup_steps=15000, learning_rate=6e-4, adam_epsilon=1e-6, adam_beta2=0.98, weight_decay=0.01, max_grad_norm=1.0, max_steps=500000, ) trainer = Trainer( model=model, args=training_args, data_collator=data_collator, train_dataset=big_dataset, # eval_dataset=book_corpus_eval, tokenizer=tokenizer) import wandb wandb.login() trainer.train() ``` And surprisingly, the ram now keeps going up and down. The training is up now for 12h without collapse the ram. I don't know what could cause the leakage. :mag: Edit: I didn't see the swap memory, that keeps increasing. So the problem persist.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

import torch
from transformers import DataCollatorForLanguageModeling, Trainer

from datasets import load_dataset


@dataclass
class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling):
    """
    Data collator used for language modeling based on DataCollatorForLazyLanguageModeling
    - collates batches of tensors, honoring their tokenizer's pad_token
    - preprocesses batches for masked language modeling
    """

    block_size: int = 512

    def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]:
        examples = [example['text'] for example in examples]
        batch, attention_mask = self._tensorize_batch(examples)
        if self.mlm:
            inputs, labels = self.mask_tokens(batch)
            return {"input_ids": inputs, "labels": labels}
        else:
            labels = batch.clone().detach()
            if self.tokenizer.pad_token_id is not None:
                labels[labels == self.tokenizer.pad_token_id] = -100
            return {"input_ids": batch, "labels": labels}

    def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]:
        if self.tokenizer._pad_token is None:
            raise ValueError(
                "You are attempting to pad samples but the tokenizer you are using"
                f" ({self.tokenizer.__class__.__name__}) does not have one."
            )
        tensor_examples = self.tokenizer.batch_encode_plus(
            [ex for ex in examples if ex],
            max_length=self.block_size,
            return_tensors="pt",
            pad_to_max_length=True,
            return_attention_mask=True,
            truncation=True,
        )
        input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"]
        return input_ids, attention_mask


dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train')
data_collator = DataCollatorForDatasetsLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,
    mlm_probability=0.15,
    block_size=tokenizer.max_len,
)
trainer = Trainer(
    model=model,
    args=args,
    data_collator=data_collator,
    train_dataset=dataset,
    prediction_loss_only=True,
)
trainer.train(model_path=model_path)
```

This train.txt is about 1.1 GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the graph below, and resulted in OOM before training finished.

![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png)

Could you please give me any suggestions on why this happened and how to fix it? Thanks.
https://github.com/huggingface/datasets/issues/633
Thanks for sharing your results. So you still had the issue for fine-tuning? And the issue still appears with a bare-bone dataset from an arrow file...
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
https://github.com/huggingface/datasets/issues/633
Load large text file for LM pre-training resulting in OOM
Yes, in both cases. Fine-tuning a pre-trained model and pre-training from scratch with a local arrow file already pre-processed.
I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks.
19
Load large text file for LM pre-training resulting in OOM I tried to pretrain Longformer using transformers and datasets. But I got OOM issues with loading a large text file. My script is almost like this: ```python from datasets import load_dataset @dataclass class DataCollatorForDatasetsLanguageModeling(DataCollatorForLanguageModeling): """ Data collator used for language modeling based on DataCollatorForLazyLanguageModeling - collates batches of tensors, honoring their tokenizer's pad_token - preprocesses batches for masked language modeling """ block_size: int = 512 def __call__(self, examples: List[dict]) -> Dict[str, torch.Tensor]: examples = [example['text'] for example in examples] batch, attention_mask = self._tensorize_batch(examples) if self.mlm: inputs, labels = self.mask_tokens(batch) return {"input_ids": inputs, "labels": labels} else: labels = batch.clone().detach() if self.tokenizer.pad_token_id is not None: labels[labels == self.tokenizer.pad_token_id] = -100 return {"input_ids": batch, "labels": labels} def _tensorize_batch(self, examples: List[str]) -> Tuple[torch.Tensor, torch.Tensor]: if self.tokenizer._pad_token is None: raise ValueError( "You are attempting to pad samples but the tokenizer you are using" f" ({self.tokenizer.__class__.__name__}) does not have one." 
) tensor_examples = self.tokenizer.batch_encode_plus( [ex for ex in examples if ex], max_length=self.block_size, return_tensors="pt", pad_to_max_length=True, return_attention_mask=True, truncation=True, ) input_ids, attention_mask = tensor_examples["input_ids"], tensor_examples["attention_mask"] return input_ids, attention_mask dataset = load_dataset('text', data_files='train.txt', cache_dir="./", split='train') data_collator = DataCollatorForDatasetsLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15, block_size=tokenizer.max_len) trainer = Trainer(model=model, args=args, data_collator=data_collator, train_dataset=train_dataset, prediction_loss_only=True, ) trainer.train(model_path=model_path) ``` This train.txt is about 1.1GB and has 90k lines, where each line is a sequence of 4k words. During training, memory usage increased quickly, as shown in the following graph, and resulted in OOM before training finished. ![image](https://user-images.githubusercontent.com/29704017/93292112-5576b280-f817-11ea-8da2-b2db9bf35665.png) Could you please give me any suggestions on why this happened and how to fix it? Thanks. Yes, in both cases. Fine-tuning a pre-trained model and pre-training from scratch with a local arrow file already pre-processed.
[ -0.6339286566, -0.4775367379, 0.0106935017, 0.2986355722, 0.3600474596, -0.1518248618, 0.5567324758, 0.3738059402, 0.0108827576, 0.0107196309, -0.129592672, -0.1827997565, -0.2669844627, -0.1620898247, -0.0323721357, 0.0232065637, -0.0998215079, 0.1968753189, -0.2772667408, -0.0966669172, 0.0570117384, -0.0776291862, -0.2127603889, -0.0297442861, -0.3879216015, -0.0617137849, 0.3158174157, 0.1221210212, -0.1640810221, -0.0958083943, -0.2297017574, 0.1197900772, 0.439348489, 0.4340211749, -0.0001147721, 0.0066160336, 0.1890438497, -0.2363324165, -0.1672690213, 0.0192163214, 0.0775555521, -0.3022914231, -0.1064001918, -0.3036173582, -0.0272463262, -0.1061252505, -0.0156821907, -0.2243723869, 0.4987519681, 0.5414042473, 0.158762753, 0.2508125007, -0.2136608064, 0.0723968297, 0.2544375062, 0.0488999188, 0.1056828648, -0.2795148194, 0.4687834084, -0.1923182011, -0.4726887643, 0.147817716, -0.2247735411, -0.2016820759, -0.0950891525, 0.0382048525, 0.3660891652, 0.0315695107, 0.2641872764, 0.3474208415, 0.4272072911, -0.0008119822, -0.2896353602, -0.3209854364, -0.142623961, -0.0843160674, 0.1892364621, 0.2010776401, 0.0598267131, 0.1184588969, 0.1071273386, -0.1253313124, -0.1519896984, 0.0761606991, -0.2748077214, 0.3397959173, -0.2012510002, -0.0380433276, 0.1555755138, -0.0944792777, 0.1961691678, -0.0997751206, 0.0034044338, 0.256714642, -0.2454416156, -0.1123953462, -0.0716817975, -0.5194139481, 0.1627842039, -0.275945127, 0.1981878579, 0.2972391844, -0.0876616836, -0.0676169693, 0.0863933712, 0.4202023745, -0.2524377108, 0.2605276108, 0.2272560447, 0.1640110612, -0.1717063487, -0.044398699, -0.3305115998, -0.1962453574, 0.1015936062, -0.0773412734, -0.0110169947, -0.2546044588, -0.2404216379, 0.0585461818, -0.1407698691, -0.0308450628, 0.2748159468, 0.3899732232, -0.3540614843, 0.429497689, 0.1651095748, 0.0523289181, -0.4855332375, -0.3365268707, -0.1744022667, 0.1532573104, -0.2032445073, 0.0389754251, 0.1297925711, 0.1396334916, 0.0596540496, -0.0428831056, 
-0.062589474, -0.4402189851, -0.0094447248, -0.0241031349, 0.0207529441, -0.0425147153, 0.0630241409, -0.0233657025, 0.2347659618, -0.124994956, -0.0253818408, 0.347791791, -0.2762121558, -0.2276411355, 0.0509795845, 0.2101765722, -0.0282484777, 0.2091551125, 0.0285063665, 0.0768566355, 0.5359786153, 0.0066538528, -0.0471477136, -0.3782058656, 0.0945213884, 0.0920172632, 0.1619913876, 0.176630199, -0.0425881371, -0.0726022273, 0.0063893422, 0.1035867184, 0.2550407052, 0.3907325864, -0.2264701426, 0.2906734943, 0.0070044398, -0.0566552505, 0.5319920778, -0.2620993257, -0.4484443367, 0.2338527292, -0.3511552811, 0.1081818715, 0.2206701636, 0.1759027839, 0.1138891205, -0.1089850664, 0.2690153718, 0.3656853139, 0.0705155656, 0.1209142357, -0.2366243303, -0.0922287181, -0.1715156287, 0.4842028618, 0.3627239466, -0.0255859457, -0.073089242, 0.3563796282, 0.1833981574, -0.1992233247, 0.1312805116, 0.3693415821, -0.1744246632, -0.126158461, 0.0944814235, 0.065598011, -0.3000324965, -0.0709871948, -0.0908449665, 0.0828673989, -0.1136057153, -0.0894262716, 0.2825582922, 0.0228702873, -0.076665625, 0.1131795123, 0.0379412323, -0.1318216175, 0.0248667225, -0.0704166517, 0.1261966676, 0.0154216904, -0.0932069048, 0.0873603672, -0.218415916, -0.1239407063, 0.0018027006, -0.1370613873, 0.0769498348, 0.1752731055, -0.0511879697, 0.038401451, -0.1730400473, 0.1069573462, -0.1529778838, -0.2987068594, -0.5071570873, 0.435893029, 0.1864787787, -0.338109374, 0.3402595818, 0.104027614, -0.0295993853, -0.1062710211, -0.1198950633, 0.5749330521, 0.2730709612, 0.019617686, 0.2990927994, -0.2454116046, 0.2542809844, -0.2662388682, -0.1036153138, -0.0977649093, 0.4209234118, 0.0310557783, 0.2350629866, 0.148280412, -0.0869456232, -0.2320593894, 0.4703808725, -0.1788093746, -0.0138720572, 0.2665264308, -0.4247210622, -0.0115635023, -0.3480208218, -0.2494779825, 0.0067337304, 0.1732288599, -0.0944132358, -0.0554348305, 0.008538004, -0.2627931833, 0.0078360327, 0.153237164, -0.0718519539, 
0.3198346198, 0.1165152863, 0.1005953103, -0.1228921488, -0.1794068366, 0.1028476655, 0.1782900989, -0.275191009, 0.1844428182, -0.0228228644, -0.0134468302, -0.2862631679, 0.1904247552, -0.1268012226, 0.0388155058, -0.064743273, 0.2586891651, -0.0779029801, 0.1019122973, 0.3001608849, 0.3549228907, 0.462962091, -0.0780947804, 0.3263480961, -0.0826636776, -0.1796956658, 0.0329436511, 0.3345074356, -0.3179924488, -0.0248901304, 0.0053771734, 0.0511769727, -0.098929733, -0.1660347879, 0.0946891308, 0.1968245208, -0.1661896259, -0.2573056519, -0.0721762553, 0.1776912063, -0.0424321517, 0.1197286695, -0.2324413657, 0.0084067807, 0.1450569928, -0.0062898882, -0.1843237281, 0.0404452905, -0.1199223399, -0.0470516533, -0.4116415977, -0.053518068, 0.1070510522, 0.1367316544, 0.3593674898, 0.1067891791, 0.194451645, 0.2884450853, 0.1585922837, 0.0063672885, -0.0885991901, 0.5141718388, -0.0175727978, -0.2971983254, -0.2908624411, -0.2397544682, 0.0251131542, 0.3155243993, -0.6842602491, 0.2337829173, -0.1138520464, -0.0940265656, -0.3955192268, 0.0007257089, 0.252205044, -0.0761481375, 0.0163395926, 0.0954741985, 0.2302769721, 0.1546791494, -0.0387872718, 0.3857944906, -0.4130838811, 0.359408766, -0.1354579031, 0.4855529666, -0.1221182644, -0.2300207019, 0.0988750979, 0.1504482776, -0.0171156786, -0.0218024775, -0.0139686503, -0.2593793869, -0.0582374632, 0.0227312893, 0.5197216272, 0.0168899372, -0.1655897647, -0.0158990994, 0.0681643113, 0.2563194036, -0.2606988251, 0.4905276, -0.1907785088, 0.0184165761, -0.1996744126, 0.0187455565, -0.2049808204, -0.2150854766, -0.1641347706, 0.0934438407, -0.0303831622, -0.2240440995, -0.2116935551, 0.10314022, -0.3681344092, -0.0437650606, 0.1179903895, 0.2293324769, 0.1807705015, -0.1961435974, -0.086288482, -0.2018707544, 0.3574145138, 0.1934524179, 0.1124655455, 0.1632595956, -0.0568787605, -0.2155371606, -0.1774517596, -0.2904546559, 0.1419078708, 0.6505103111, 0.6102460623, -0.2162735164, -0.1585993618, 0.0791448355, 
0.0026035607, -0.1652731597, -0.5487569571, -0.1285093129, -0.1654994041, -0.4410517216, 0.0046067014, 0.3584984243, 0.3558533788, -0.2264456898, -0.0318751894, 0.1387300938, -0.0007012784, 0.3717512488, -0.1297643632, 0.0150536448, -0.066410847, 0.2258286923, -0.1074174196, 0.3673555255, 0.2488899529, 0.3563638031, 0.1917256266, -0.3175569773, 0.0706882849, -0.0298886281, 0.2314305753, 0.115161255, 0.1102285907, 0.3546282947, 0.1289709657, 0.3322556615, -0.1909662783, 0.6737231016, -0.0073441286, 0.2068271488, -0.8374903798, -0.2673701644, 0.3314557374, -0.0263577811, 0.1994024962, 0.215717867, -0.4038517177, -0.3216702938, 0.3766611218, 0.0990837663, 0.7675249577, -0.2972689867, 0.3962582052, -0.1552994698, 0.6611967087, 0.1280310154, -0.3813101053, 0.1585618854, -0.3987909555, -0.2663655877, 0.2130052447, -0.0923116431, 0.0365401059, 0.1605039835, -0.2905768156, -0.0298117511, 0.0862519294, 0.2249331921, 0.1306698024, 0.5134935379, 0.120926626, -0.6006980538, -0.0828129277, 0.0265690908, -0.0234003812, -0.1049019024, -0.0492794514, 0.0799459741, -0.0313268006, -0.2085639238, -0.0284662768, 0.2052582055, -0.2474462092, 0.0866158754, 0.0637520179, -0.0720534623, 0.5367990732, 0.1062302291, -0.0013607293, 0.4988959134, -0.0392867513, 0.1321583986, -0.2274820209, -0.1652866751, 0.2007084042, -0.0658488423, 0.4217028916, 0.0883791074, -0.00058797, 0.1141032875, 0.0632384419, -0.0466908254, 0.0947640389, -0.3643093109, 0.1054970175, -0.4112315476, -0.2154278457, -0.0884305984, -0.3007827401, -0.1009221599, 0.123109594, 0.1373566687, -0.464446038, 0.1151452065, 0.2652114034, -0.375010103, 0.0119969212, 0.3838247359, 0.1710590869, -0.1051390767, 0.499843061, 0.457587719, -0.0533619151, -0.2644975185, -0.1096296161, 0.2944480777, -0.2731434405, 0.0849161297, -0.2455591708, -0.3016008735, 0.2093508542, 0.0808756128, 0.1000995561, -0.0149994157, -0.0632621273, -0.2510291338, -0.6380496025, -0.1814441532, -0.0027227891, -0.0035484964, 0.1460835785, 0.2738982141, 
-0.3100468516, 0.5892775059, -0.2788560092, -0.0529419072, -0.2076085508, 0.2782230973, -0.0555623658, -0.3855459988, 0.2115361542, 0.2033527046, 0.1001042053, 0.0805963874, 0.0059093423, -0.2213539034, -0.1435202807, 0.112715289, 0.2364698499, -0.4196630418, -0.0159935355, -0.3129969835, -0.0924898088, -0.2700214386, 0.0660682172, 0.0170285404, 0.0139458058, -0.2261784673, 0.0093940832, 0.0475259274, -0.2552625537, 0.042060256, -0.2545849085, 0.1250909567, 0.246571213, 0.1331031621, 0.1178301424, 0.0006369948, -0.3931023479, 0.1796455681, 0.6661384702, -0.0175323728, 0.419082135, -0.4357323647, -0.0715063736, 0.247659564, 0.0498031005, 0.0480658486, -0.3416772485, -0.0107186846, 0.2935878038, 0.2268191278, 0.0152087137, -0.0147332661, 0.2263551354, -0.4112984538, 0.1301490664, 0.3453423977, -0.1119266823, -0.0674753264, 0.1559654474, 0.0913231522, 0.5066777468, 0.1149650663, 0.0057111867, 0.2489495873, -0.0012200177, 0.1548573822, 0.4856925309, 0.2611194551, 0.2386806309, 0.3016281128, -0.0326658972, -0.0406999141, -0.2312837243, 0.2294136882, -0.435616076, -0.415166378, -0.2674563825, 0.5044151545, -0.0193236805, 0.1242090911, -0.3350303173, -0.0662411451, -0.0406532101, 0.2541327477, -0.1192157567, 0.2744088173, -0.4530941248, -0.1705991924, -0.0894243345, -0.123961933, 0.0115341395, 0.1896333247, -0.0558586121, 0.2722765803, -0.5796659589, 0.2293862104, 0.0521361418, 0.1451499909, -0.1346786618, 0.1227787733, 0.1055012345, -0.5078569651, 0.3433353007, 0.2378563285, 0.2918590903, 0.115404956, -0.221906662, 0.3443704844, 0.4366974831, -0.1908073723, -0.0406121649, -0.1209841669, -0.0356197953, -0.1499129832, 0.1974350214, 0.3490367532, 0.1878990531, 0.2596160173, 0.0557278506, -0.265966177, 0.2785172462, 0.0650140941, -0.3240586519, 0.1786668301, -0.1329390854, 0.4019319415, -0.3285808265, -0.4429640174, 0.009867128, -0.3941243589, 0.2155868858, 0.6670196056, -0.2954224646, 0.2878553569, -0.0254881512, 0.0485503711, 0.0018406976, 0.4760496616, 0.4697353244, 
0.2841596603, -0.3783711195, -0.0256674215, -0.3288833499, 0.2765586376, -0.2024302334, 0.0669976398, 0.0222478993, 0.1707946956, 0.1645776778, -0.0033023204, -0.0796225369, -0.0136221573, 0.218803376, -0.1950690895, -0.1675592959, -0.1280471683, 0.0020943638, 0.3573175073, -0.0259005763, -0.2975495458, 0.1588650346, 0.1150627136, 0.0991571695, -0.3214533329, 0.3626994789, -0.1984633654, -0.0203625765, 0.0691283345, -0.1504710466, 0.3792098761, -0.2204388827, -0.0330470726, -0.0818270966, 0.1281416565, -0.1670770645, -0.2209998518, 0.3402220011, 0.3884243667, -0.1076362282, 0.1082137078, -0.255192548, -0.0864622518, 0.0403043032, -0.369690299, -0.3087161779, -0.0074875243, -0.0002530813, 0.0085244328, 0.1030983254, 0.2458303273, 0.1053374782, -0.0278403312, -0.2292289734, -0.1855804026, 0.2094751894, -0.2946497798, -0.3130407035, -0.0353798121, 0.055624567, -0.1636292785, 0.2511017621, -0.4624068141, 0.1457485557, 0.2297439128, -0.0255144164, -0.4540739357, 0.2232972682, 0.2540614009, 0.0164720193, -0.2076969594, 0.0534572713, 0.152623266, -0.1369921267, -0.2016401291, -0.1757197082 ]
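The OOM report above (issue 633) comes down to holding the whole corpus in Python memory at once instead of streaming it. Below is a stdlib-only sketch of the lazy-iteration idea — this is NOT how `datasets`/`load_dataset` works internally, just an illustration of yielding fixed-size token blocks from a large text file without reading it fully into memory; `block_size=512` mirrors the collator default in the quoted script.

```python
import os
import tempfile

def iter_examples(path, block_size=512):
    """Yield fixed-size whitespace-token blocks lazily instead of loading the whole file."""
    with open(path, encoding="utf-8") as f:
        buffer = []
        for line in f:                      # the file object streams line by line
            buffer.extend(line.split())
            while len(buffer) >= block_size:
                yield buffer[:block_size]   # emit one full block
                buffer = buffer[block_size:]
        if buffer:
            yield buffer                    # trailing partial block

# tiny self-contained demo: 1000 tokens -> one full block of 512 and a tail of 488
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False, encoding="utf-8") as tmp:
    tmp.write(("tok " * 1000).strip() + "\n")
    path = tmp.name

blocks = list(iter_examples(path, block_size=512))
print([len(b) for b in blocks])  # → [512, 488]
os.remove(path)
```

At no point does more than roughly one line plus one block of tokens live in memory, which is the property the training script above is missing when it materializes every example.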
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
Basically ~600MB txt files (UTF-8) * 59. Contents like ```안녕하세요, 이것은 예제로 한번 말해보는 텍스트입니다. 그냥 이렇다고요.<|endoftext|>\n``` Also, it gets stuck for a loooong time at ```Testing the mapped function outputs```, for more than 12 hours (currently ongoing)
``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB and 10GB .txt files, but not for a 700MB file. Can't upload due to size & copyright problem. sorry.
36
Text dataset not working with large files ``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB and 10GB .txt files, but not for a 700MB file. Can't upload due to size & copyright problem. 
sorry. Basically ~600MB txt files (UTF-8) * 59. Contents like ```안녕하세요, 이것은 예제로 한번 말해보는 텍스트입니다. 그냥 이렇다고요.<|endoftext|>\n``` Also, it gets stuck for a loooong time at ```Testing the mapped function outputs```, for more than 12 hours (currently ongoing)
[ -0.4925667048, -0.231024459, -0.1198643669, 0.2836016417, 0.4663564563, -0.0735015422, 0.305367887, 0.5961021185, -0.1138257831, 0.0461650416, -0.0624195635, -0.0304050855, -0.1033422872, 0.3101792336, -0.1062135994, -0.038950067, -0.2278844714, 0.1130641326, -0.0946918353, 0.0499164239, -0.1125508845, 0.1110999957, -0.1209789515, -0.0956027284, -0.4804550707, -0.0829954073, 0.1185314134, 0.2535656095, -0.292010814, -0.3053141534, -0.1657070965, 0.0965851247, 0.1122045517, 0.6015895605, -0.0001037621, 0.129462257, 0.2131116986, -0.1573347449, -0.2265825868, -0.0314554796, 0.1559487879, -0.4493691325, -0.1823891997, -0.2179644406, 0.0261512175, -0.0627443939, -0.1082876548, -0.1175597459, 0.2653357387, 0.5598000884, 0.3412034512, 0.1707586795, 0.132169202, -0.112961188, 0.250754118, -0.0872496516, -0.0887441784, 0.0848499984, 0.3108182251, 0.0265772715, -0.2564009428, 0.3199440539, 0.0101718828, 0.0377003662, 0.000314123, -0.0240627509, -0.0827938169, -0.2940051258, 0.3711952865, 0.2321465015, 0.5224131942, -0.3169360459, -0.1641274989, -0.3288432062, -0.182441175, -0.3512557149, 0.2266326845, 0.1845538318, -0.111331135, 0.1161270589, -0.1436177939, -0.1014851034, -0.2063972205, -0.0293160938, -0.2615340948, 0.1022735834, -0.2105868161, -0.1004619896, 0.2575801313, -0.27024737, 0.0108299255, -0.1056218594, -0.1307831407, -0.0068053361, -0.247213304, -0.0840229169, 0.246850729, -0.199078083, 0.288942188, -0.0489984453, -0.0504209101, 0.2247608304, -0.0796671361, -0.0302130803, 0.0054933541, 0.3368761539, 0.0041130576, 0.0638572425, 0.408585608, 0.2356286943, -0.379357487, -0.2320609689, -0.1496266723, -0.5145173073, -0.0293559097, -0.1177168339, 0.0168719739, -0.0452367477, -0.1210253313, 0.1521078646, 0.1642947644, 0.1547950506, 0.1059662178, 0.4816168845, 0.1049142778, 0.2937653065, -0.1649461985, 0.1770391017, -0.1443264484, -0.1722711325, -0.2492275834, -0.0456117094, -0.0579166152, -0.0926889852, 0.2676634789, 0.1338512301, 0.2967100441, -0.1395805478, 
0.1908533573, -0.1090159342, 0.1168144345, -0.5029691458, 0.1476962417, 0.1410788596, -0.0354774296, 0.1686937064, 0.2287830114, -0.1788697988, -0.102391243, 0.0502354205, -0.1035604477, -0.2971286774, 0.1772404313, 0.3213211298, 0.1347745359, -0.0030029803, -0.0189048126, 0.2770770788, 0.3757717609, -0.2176395059, 0.0281008035, -0.1773215979, -0.2569281459, -0.1258029342, 0.2650002539, 0.4393814206, -0.4250603914, 0.2090141028, 0.0690165535, 0.1223086491, 0.0890246034, 0.3739469349, -0.0650729612, 0.3143593669, -0.1145713925, 0.2009434551, 0.3094798923, -0.2855856121, -0.5090754032, 0.5406070948, -0.1859300882, -0.1181869656, 0.0992573202, -0.0444511101, 0.1034917086, 0.1358727366, 0.3047404289, 0.2793403864, 0.011319682, 0.2667298913, -0.260450542, -0.0832505971, -0.0501142181, 0.1559396088, 0.312137723, -0.2080694735, 0.195168823, -0.0172313079, 0.1943764687, -0.0444458202, 0.2128831893, 0.5059322715, 0.1271579862, -0.0078409538, -0.0123688877, -0.2239850014, 0.0316569805, 0.1751383543, 0.1210088432, 0.0843076557, -0.2666846514, -0.0822331682, -0.309245348, -0.1341455579, -0.2703467011, -0.1933863014, 0.2836053371, 0.0813238025, -0.0114755929, 0.2551390529, -0.0167011097, 0.0253123995, -0.1602132767, -0.0655840337, -0.0120966481, -0.0018739775, -0.091083914, -0.2782086134, 0.122949332, 0.1102732196, -0.0411426574, -0.0616680533, -0.1109170541, 0.3928825557, -0.0020204186, -0.0810619891, -0.0360697247, -0.0300613195, 0.1020017415, -0.0952063128, 0.0719456449, 0.0007553585, 0.1388050616, -0.105313383, -0.1801619083, 0.300614208, -0.1398147196, 0.2031629086, 0.3438570797, -0.2090739459, 0.2556621432, -0.0663510486, 0.2276657224, -0.1381174475, 0.2406678498, 0.0831319094, 0.1363214701, -0.0013490645, -0.2719839215, -0.0127428509, 0.6895840168, 0.1710654795, 0.0928928256, 0.1747953296, -0.2453399599, -0.0179575719, -0.2034239471, 0.1299687773, 0.3976371586, 0.274443984, 0.2679816484, 0.0357754976, 0.1870049238, -0.3740029037, 0.1397105753, 0.0032483395, 0.0111855678, 
0.5398925543, 0.2758922875, -0.1295804679, -0.4534206986, -0.1563688219, 0.1057757214, 0.3134025633, -0.0895872042, 0.019882448, -0.2155626714, -0.2920288444, -0.2959759235, 0.210327521, -0.2627325058, -0.2253176868, 0.0694883168, 0.0018544411, -0.0096534938, 0.09686625, 0.0594208688, 0.1454249024, 0.3997492492, -0.1988250166, -0.0615104325, -0.2715907991, -0.1730494499, 0.1567442566, 0.2421877533, -0.0123151951, 0.1066426486, -0.1534678638, -0.1015961096, 0.0038625095, -0.2853767574, 0.0006908299, -0.0291902125, 0.1006610468, -0.0571643263, 0.2626276016, 0.166275382, 0.0769830346, 0.3128012419, -0.1297490746, -0.0807751566, 0.2233642489, 0.0906753093, -0.0652650967, -0.1705733389, -0.4048936069, -0.1530369967, -0.6100061536, 0.3575110734, 0.0955665559, 0.0380034968, 0.4022582173, 0.3181683421, 0.2154985219, -0.0212013423, 0.2124927938, 0.1002641469, -0.1950187981, 0.2925724685, -0.2248303741, -0.3832402229, -0.0998250395, 0.1606035233, 0.1626158357, 0.0072623119, -0.5234435201, 0.0302408189, -0.4368820786, 0.1623681039, -0.0637962669, 0.1686918736, 0.0814084336, -0.0684889853, -0.252260834, -0.0842750147, 0.0830213949, 0.0467449389, -0.0724339634, 0.0489530414, -0.1854186505, 0.4121848941, 0.2402960658, 0.4626803994, 0.1637824029, 0.081450291, 0.3938984275, -0.1217066795, 0.4271224439, -0.3532854915, -0.3265112042, 0.1239757389, -0.0912450999, -0.0329021886, 0.2716791034, 0.0862798318, 0.2699358165, -0.0800968781, -0.2192068994, 0.0848961473, -0.2793531418, 0.2020108402, -0.1586164832, 0.1581824124, -0.1357392967, -0.0241032168, -0.076738596, -0.1311157644, -0.0338695832, 0.1800868362, -0.2406114936, -0.0250575282, -0.3433841467, 0.0648814738, -0.4402942955, 0.1162604541, -0.0008667819, 0.3317792118, -0.1660779417, -0.1548695266, -0.1084448248, -0.1574717164, 0.5587422848, -0.0700286552, 0.0500068627, 0.062843658, -0.0919483528, -0.3191089332, 0.1609367728, -0.273501277, 0.1588301361, 0.3647182286, 0.5645073056, -0.3012344837, -0.0509773977, -0.0399971828, 
0.3442743123, 0.0167831257, -0.0334482007, -0.2761998773, -0.1711503714, -0.4797544777, 0.13883394, 0.2979475558, 0.2846295238, -0.2693780661, 0.0718104541, 0.0067695789, -0.2416712642, 0.2142867446, 0.0040951446, 0.1445521116, -0.0965575501, 0.2255923599, -0.11081478, 0.1980495751, 0.1853598803, 0.571210742, -0.0331978723, -0.1837669462, -0.1003365591, -0.1977580041, 0.3502211571, 0.3210822642, -0.1226088405, 0.2475553453, -0.0468921177, 0.2047811002, 0.0162530318, 0.3714406788, 0.2987755835, -0.0353544578, -0.2595378458, -0.2792067826, 0.1115558743, -0.0845497698, 0.186700657, 0.1408400685, -0.3193203807, -0.255785048, 0.0803830326, -0.3155543804, 0.5819581747, 0.0936629474, 0.3404354155, -0.0566669777, 0.1961902827, 0.0629075021, -0.5295382738, 0.2434756607, -0.3393684626, -0.2687769532, 0.0840471908, -0.0019826964, 0.0826097876, 0.1538456231, -0.2705693841, 0.099735558, 0.1199408472, 0.10925477, -0.2874219716, 0.3836100101, -0.4040072858, -0.1202326417, -0.3170889616, 0.1808964312, 0.03846246, -0.1584917009, -0.0123264939, -0.1195178777, 0.0203618258, -0.2523367405, -0.1452476829, -0.0493158996, -0.4198037088, 0.1822547764, 0.0395060033, -0.3351991177, 0.2435081601, 0.1777709424, 0.071756199, 0.450784713, -0.2375143468, 0.1786590517, 0.0119801173, -0.027082352, 0.1566384882, -0.0020523407, 0.3159703612, -0.0124907866, -0.226741448, 0.0910505056, -0.0267283153, 0.1002251059, -0.0211735368, 0.0054353997, -0.0417034775, -0.3602175117, -0.2473488748, -0.1221572012, -0.2448607236, -0.3007043302, 0.225500524, -0.0533377901, -0.0544012934, 0.1680142581, 0.070989497, -0.3310468793, -0.1633546203, 0.3258527517, -0.1954340786, 0.1789702475, 0.5283613205, 0.3944673538, -0.2252093703, -0.3289530873, 0.3111716211, -0.0288909152, -0.3540940285, 0.2759121656, 0.0270213559, -0.0696799457, 0.1265555322, 0.2427482009, -0.1405766606, -0.3056273758, -0.1277887672, -0.2381087393, -0.4304107428, 0.0736414492, -0.0146860834, 0.1428981125, -0.01201839, 0.2134836614, 0.1132788509, 
0.2139742076, -0.4020756483, 0.1423479915, -0.1771860868, 0.0996007621, 0.0931536108, -0.1355484426, 0.1392203569, 0.0518150181, 0.1112697199, 0.2546561956, -0.0731490254, -0.349837631, -0.1104321927, 0.0802938268, 0.0429647863, -0.1213977784, -0.018652847, -0.37576285, -0.257121861, -0.2619501054, 0.2055384815, -0.059557002, -0.1060269549, 0.1788901836, -0.1484445781, -0.0594713204, 0.116279766, -0.0307371747, -0.2113559842, 0.1135026142, 0.1723379791, 0.03541122, 0.0073987618, 0.0013163909, -0.315947175, 0.0826842189, 0.1583469957, -0.074155882, 0.3244627714, -0.3757268488, -0.0566545688, 0.1769374162, 0.3609240651, 0.4115838408, -0.2069201022, -0.0283431634, 0.242904827, 0.2981537879, -0.2532998025, -0.1001687646, 0.0949342027, -0.274805218, 0.3003486991, 0.1226504594, 0.1055467576, -0.0864749253, -0.1073916331, 0.2233813852, 0.1748344898, -0.0378284156, -0.1495877504, 0.3407847881, 0.1178210974, -0.0499307029, 0.3016183078, 0.1821384281, 0.3013968468, 0.7971208692, 0.0785120055, 0.2538470328, -0.2110732347, -0.0938018262, -0.1365757138, -0.4474599957, 0.0228370614, -0.080156602, -0.0352987014, -0.1246765777, -0.2056945413, 0.2049505264, -0.0154316071, 0.2201548517, -0.149097085, 0.0436852798, -0.2486524582, -0.1612087339, 0.0436247662, -0.3107735515, -0.0689435601, 0.1061922908, -0.1068827659, 0.3263955414, 0.1242940426, 0.062380448, 0.1022238582, -0.4294127822, 0.2035559118, -0.0626899004, 0.0876706094, -0.3625389636, 0.283985883, 0.2613128424, 0.1896339655, 0.1571779847, 0.1909477562, 0.5729640126, 0.3944715559, -0.136331588, -0.0809937119, -0.2755988538, -0.1103980243, -0.4154099226, 0.2859198749, 0.1979575902, 0.3553818166, 0.4643360972, 0.2075482011, -0.2680901289, -0.0660787076, 0.1920684576, -0.2637647986, -0.2234544754, 0.1047904044, 0.06799905, -0.2863511443, -0.3225551844, -0.0183769017, -0.4338229895, -0.004564492, 0.5934920311, -0.113588199, 0.3091259897, -0.3395168185, 0.1448054761, 0.0052000768, 0.6057995558, 0.295388788, -0.0178113356, 
-0.2954340577, -0.0195162073, -0.6181396842, 0.1262813359, -0.1511760056, 0.0123335719, -0.0116539802, 0.142536521, 0.1738923341, -0.0330507942, -0.00510104, 0.0362966694, -0.1498090327, -0.1097010374, -0.3313085437, -0.267660886, -0.2250266969, 0.1569494605, -0.001462914, -0.3760662079, 0.1550828815, -0.095508799, 0.2300338149, -0.1793869138, 0.0802164599, -0.0206276886, -0.0355978087, 0.1578442454, 0.0340669006, 0.3657599092, -0.2838610411, 0.0340453833, -0.1346195787, -0.0631755218, -0.1700665653, 0.1941982657, 0.2527388334, 0.2654083371, -0.0962214097, -0.1707081348, -0.1380879581, 0.1567031145, 0.0587183684, 0.0371346846, 0.1429637074, -0.0507430211, 0.1262889057, 0.0587170273, 0.1863670945, 0.4029747844, -0.1113760322, -0.034302935, -0.3506343663, -0.5414937139, 0.2185380459, -0.1180816963, -0.515848279, 0.0103076659, 0.1300401539, 0.1811081767, 0.0267860945, -0.5836284161, 0.3067768812, 0.3096932471, 0.1339993179, -0.3529008031, 0.258033663, -0.0505171418, 0.0583431274, 0.0119165555, 0.1201328412, 0.0242640786, -0.2702243924, -0.2056573778, -0.0322017111 ]
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
It gets stuck while doing `.map()` ? Are you using multiprocessing ? If you could provide a code snippet it could be very useful
``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. sorry.
24
Text dataset not working with large files ``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. 
sorry. It gets stuck while doing `.map()` ? Are you using multiprocessing ? If you could provide a code snippet it could be very useful
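The pyarrow error quoted above ("straddling object straddles two block boundaries (try to increase block size?)") occurs when a single line of the input file is longer than the fixed-size block that pyarrow's CSV reader uses. One pragmatic workaround, sketched below under the assumption that the error really is caused by an oversized line (this is not the `datasets` library's own fix), is to scan the file for its longest line and choose a block size with comfortable headroom before handing it to the reader:

```python
def safe_block_size(path, default=1 << 20):
    """Pick a block size comfortably larger than the longest line in
    the file, since a single line straddling two pyarrow blocks
    triggers the ArrowInvalid error above.

    Returns a power of two that is at least `default` (1 MiB) and at
    least twice the longest line length in bytes.
    """
    longest = 0
    with open(path, "rb") as f:
        for line in f:
            longest = max(longest, len(line))
    # Round twice the longest line up to the next power of two,
    # then never go below the default.
    size = max(default, 1 << (2 * longest - 1).bit_length())
    return size
```

The resulting value could then be passed to something like `pyarrow.csv.ReadOptions(block_size=...)` when reading the file; the doubling factor is an arbitrary safety margin, not a documented requirement.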
[ -0.4925667048, -0.231024459, -0.1198643669, 0.2836016417, 0.4663564563, -0.0735015422, 0.305367887, 0.5961021185, -0.1138257831, 0.0461650416, -0.0624195635, -0.0304050855, -0.1033422872, 0.3101792336, -0.1062135994, -0.038950067, -0.2278844714, 0.1130641326, -0.0946918353, 0.0499164239, -0.1125508845, 0.1110999957, -0.1209789515, -0.0956027284, -0.4804550707, -0.0829954073, 0.1185314134, 0.2535656095, -0.292010814, -0.3053141534, -0.1657070965, 0.0965851247, 0.1122045517, 0.6015895605, -0.0001037621, 0.129462257, 0.2131116986, -0.1573347449, -0.2265825868, -0.0314554796, 0.1559487879, -0.4493691325, -0.1823891997, -0.2179644406, 0.0261512175, -0.0627443939, -0.1082876548, -0.1175597459, 0.2653357387, 0.5598000884, 0.3412034512, 0.1707586795, 0.132169202, -0.112961188, 0.250754118, -0.0872496516, -0.0887441784, 0.0848499984, 0.3108182251, 0.0265772715, -0.2564009428, 0.3199440539, 0.0101718828, 0.0377003662, 0.000314123, -0.0240627509, -0.0827938169, -0.2940051258, 0.3711952865, 0.2321465015, 0.5224131942, -0.3169360459, -0.1641274989, -0.3288432062, -0.182441175, -0.3512557149, 0.2266326845, 0.1845538318, -0.111331135, 0.1161270589, -0.1436177939, -0.1014851034, -0.2063972205, -0.0293160938, -0.2615340948, 0.1022735834, -0.2105868161, -0.1004619896, 0.2575801313, -0.27024737, 0.0108299255, -0.1056218594, -0.1307831407, -0.0068053361, -0.247213304, -0.0840229169, 0.246850729, -0.199078083, 0.288942188, -0.0489984453, -0.0504209101, 0.2247608304, -0.0796671361, -0.0302130803, 0.0054933541, 0.3368761539, 0.0041130576, 0.0638572425, 0.408585608, 0.2356286943, -0.379357487, -0.2320609689, -0.1496266723, -0.5145173073, -0.0293559097, -0.1177168339, 0.0168719739, -0.0452367477, -0.1210253313, 0.1521078646, 0.1642947644, 0.1547950506, 0.1059662178, 0.4816168845, 0.1049142778, 0.2937653065, -0.1649461985, 0.1770391017, -0.1443264484, -0.1722711325, -0.2492275834, -0.0456117094, -0.0579166152, -0.0926889852, 0.2676634789, 0.1338512301, 0.2967100441, -0.1395805478, 
0.1908533573, -0.1090159342, 0.1168144345, -0.5029691458, 0.1476962417, 0.1410788596, -0.0354774296, 0.1686937064, 0.2287830114, -0.1788697988, -0.102391243, 0.0502354205, -0.1035604477, -0.2971286774, 0.1772404313, 0.3213211298, 0.1347745359, -0.0030029803, -0.0189048126, 0.2770770788, 0.3757717609, -0.2176395059, 0.0281008035, -0.1773215979, -0.2569281459, -0.1258029342, 0.2650002539, 0.4393814206, -0.4250603914, 0.2090141028, 0.0690165535, 0.1223086491, 0.0890246034, 0.3739469349, -0.0650729612, 0.3143593669, -0.1145713925, 0.2009434551, 0.3094798923, -0.2855856121, -0.5090754032, 0.5406070948, -0.1859300882, -0.1181869656, 0.0992573202, -0.0444511101, 0.1034917086, 0.1358727366, 0.3047404289, 0.2793403864, 0.011319682, 0.2667298913, -0.260450542, -0.0832505971, -0.0501142181, 0.1559396088, 0.312137723, -0.2080694735, 0.195168823, -0.0172313079, 0.1943764687, -0.0444458202, 0.2128831893, 0.5059322715, 0.1271579862, -0.0078409538, -0.0123688877, -0.2239850014, 0.0316569805, 0.1751383543, 0.1210088432, 0.0843076557, -0.2666846514, -0.0822331682, -0.309245348, -0.1341455579, -0.2703467011, -0.1933863014, 0.2836053371, 0.0813238025, -0.0114755929, 0.2551390529, -0.0167011097, 0.0253123995, -0.1602132767, -0.0655840337, -0.0120966481, -0.0018739775, -0.091083914, -0.2782086134, 0.122949332, 0.1102732196, -0.0411426574, -0.0616680533, -0.1109170541, 0.3928825557, -0.0020204186, -0.0810619891, -0.0360697247, -0.0300613195, 0.1020017415, -0.0952063128, 0.0719456449, 0.0007553585, 0.1388050616, -0.105313383, -0.1801619083, 0.300614208, -0.1398147196, 0.2031629086, 0.3438570797, -0.2090739459, 0.2556621432, -0.0663510486, 0.2276657224, -0.1381174475, 0.2406678498, 0.0831319094, 0.1363214701, -0.0013490645, -0.2719839215, -0.0127428509, 0.6895840168, 0.1710654795, 0.0928928256, 0.1747953296, -0.2453399599, -0.0179575719, -0.2034239471, 0.1299687773, 0.3976371586, 0.274443984, 0.2679816484, 0.0357754976, 0.1870049238, -0.3740029037, 0.1397105753, 0.0032483395, 0.0111855678, 
0.5398925543, 0.2758922875, -0.1295804679, -0.4534206986, -0.1563688219, 0.1057757214, 0.3134025633, -0.0895872042, 0.019882448, -0.2155626714, -0.2920288444, -0.2959759235, 0.210327521, -0.2627325058, -0.2253176868, 0.0694883168, 0.0018544411, -0.0096534938, 0.09686625, 0.0594208688, 0.1454249024, 0.3997492492, -0.1988250166, -0.0615104325, -0.2715907991, -0.1730494499, 0.1567442566, 0.2421877533, -0.0123151951, 0.1066426486, -0.1534678638, -0.1015961096, 0.0038625095, -0.2853767574, 0.0006908299, -0.0291902125, 0.1006610468, -0.0571643263, 0.2626276016, 0.166275382, 0.0769830346, 0.3128012419, -0.1297490746, -0.0807751566, 0.2233642489, 0.0906753093, -0.0652650967, -0.1705733389, -0.4048936069, -0.1530369967, -0.6100061536, 0.3575110734, 0.0955665559, 0.0380034968, 0.4022582173, 0.3181683421, 0.2154985219, -0.0212013423, 0.2124927938, 0.1002641469, -0.1950187981, 0.2925724685, -0.2248303741, -0.3832402229, -0.0998250395, 0.1606035233, 0.1626158357, 0.0072623119, -0.5234435201, 0.0302408189, -0.4368820786, 0.1623681039, -0.0637962669, 0.1686918736, 0.0814084336, -0.0684889853, -0.252260834, -0.0842750147, 0.0830213949, 0.0467449389, -0.0724339634, 0.0489530414, -0.1854186505, 0.4121848941, 0.2402960658, 0.4626803994, 0.1637824029, 0.081450291, 0.3938984275, -0.1217066795, 0.4271224439, -0.3532854915, -0.3265112042, 0.1239757389, -0.0912450999, -0.0329021886, 0.2716791034, 0.0862798318, 0.2699358165, -0.0800968781, -0.2192068994, 0.0848961473, -0.2793531418, 0.2020108402, -0.1586164832, 0.1581824124, -0.1357392967, -0.0241032168, -0.076738596, -0.1311157644, -0.0338695832, 0.1800868362, -0.2406114936, -0.0250575282, -0.3433841467, 0.0648814738, -0.4402942955, 0.1162604541, -0.0008667819, 0.3317792118, -0.1660779417, -0.1548695266, -0.1084448248, -0.1574717164, 0.5587422848, -0.0700286552, 0.0500068627, 0.062843658, -0.0919483528, -0.3191089332, 0.1609367728, -0.273501277, 0.1588301361, 0.3647182286, 0.5645073056, -0.3012344837, -0.0509773977, -0.0399971828, 
0.3442743123, 0.0167831257, -0.0334482007, -0.2761998773, -0.1711503714, -0.4797544777, 0.13883394, 0.2979475558, 0.2846295238, -0.2693780661, 0.0718104541, 0.0067695789, -0.2416712642, 0.2142867446, 0.0040951446, 0.1445521116, -0.0965575501, 0.2255923599, -0.11081478, 0.1980495751, 0.1853598803, 0.571210742, -0.0331978723, -0.1837669462, -0.1003365591, -0.1977580041, 0.3502211571, 0.3210822642, -0.1226088405, 0.2475553453, -0.0468921177, 0.2047811002, 0.0162530318, 0.3714406788, 0.2987755835, -0.0353544578, -0.2595378458, -0.2792067826, 0.1115558743, -0.0845497698, 0.186700657, 0.1408400685, -0.3193203807, -0.255785048, 0.0803830326, -0.3155543804, 0.5819581747, 0.0936629474, 0.3404354155, -0.0566669777, 0.1961902827, 0.0629075021, -0.5295382738, 0.2434756607, -0.3393684626, -0.2687769532, 0.0840471908, -0.0019826964, 0.0826097876, 0.1538456231, -0.2705693841, 0.099735558, 0.1199408472, 0.10925477, -0.2874219716, 0.3836100101, -0.4040072858, -0.1202326417, -0.3170889616, 0.1808964312, 0.03846246, -0.1584917009, -0.0123264939, -0.1195178777, 0.0203618258, -0.2523367405, -0.1452476829, -0.0493158996, -0.4198037088, 0.1822547764, 0.0395060033, -0.3351991177, 0.2435081601, 0.1777709424, 0.071756199, 0.450784713, -0.2375143468, 0.1786590517, 0.0119801173, -0.027082352, 0.1566384882, -0.0020523407, 0.3159703612, -0.0124907866, -0.226741448, 0.0910505056, -0.0267283153, 0.1002251059, -0.0211735368, 0.0054353997, -0.0417034775, -0.3602175117, -0.2473488748, -0.1221572012, -0.2448607236, -0.3007043302, 0.225500524, -0.0533377901, -0.0544012934, 0.1680142581, 0.070989497, -0.3310468793, -0.1633546203, 0.3258527517, -0.1954340786, 0.1789702475, 0.5283613205, 0.3944673538, -0.2252093703, -0.3289530873, 0.3111716211, -0.0288909152, -0.3540940285, 0.2759121656, 0.0270213559, -0.0696799457, 0.1265555322, 0.2427482009, -0.1405766606, -0.3056273758, -0.1277887672, -0.2381087393, -0.4304107428, 0.0736414492, -0.0146860834, 0.1428981125, -0.01201839, 0.2134836614, 0.1132788509, 
0.2139742076, -0.4020756483, 0.1423479915, -0.1771860868, 0.0996007621, 0.0931536108, -0.1355484426, 0.1392203569, 0.0518150181, 0.1112697199, 0.2546561956, -0.0731490254, -0.349837631, -0.1104321927, 0.0802938268, 0.0429647863, -0.1213977784, -0.018652847, -0.37576285, -0.257121861, -0.2619501054, 0.2055384815, -0.059557002, -0.1060269549, 0.1788901836, -0.1484445781, -0.0594713204, 0.116279766, -0.0307371747, -0.2113559842, 0.1135026142, 0.1723379791, 0.03541122, 0.0073987618, 0.0013163909, -0.315947175, 0.0826842189, 0.1583469957, -0.074155882, 0.3244627714, -0.3757268488, -0.0566545688, 0.1769374162, 0.3609240651, 0.4115838408, -0.2069201022, -0.0283431634, 0.242904827, 0.2981537879, -0.2532998025, -0.1001687646, 0.0949342027, -0.274805218, 0.3003486991, 0.1226504594, 0.1055467576, -0.0864749253, -0.1073916331, 0.2233813852, 0.1748344898, -0.0378284156, -0.1495877504, 0.3407847881, 0.1178210974, -0.0499307029, 0.3016183078, 0.1821384281, 0.3013968468, 0.7971208692, 0.0785120055, 0.2538470328, -0.2110732347, -0.0938018262, -0.1365757138, -0.4474599957, 0.0228370614, -0.080156602, -0.0352987014, -0.1246765777, -0.2056945413, 0.2049505264, -0.0154316071, 0.2201548517, -0.149097085, 0.0436852798, -0.2486524582, -0.1612087339, 0.0436247662, -0.3107735515, -0.0689435601, 0.1061922908, -0.1068827659, 0.3263955414, 0.1242940426, 0.062380448, 0.1022238582, -0.4294127822, 0.2035559118, -0.0626899004, 0.0876706094, -0.3625389636, 0.283985883, 0.2613128424, 0.1896339655, 0.1571779847, 0.1909477562, 0.5729640126, 0.3944715559, -0.136331588, -0.0809937119, -0.2755988538, -0.1103980243, -0.4154099226, 0.2859198749, 0.1979575902, 0.3553818166, 0.4643360972, 0.2075482011, -0.2680901289, -0.0660787076, 0.1920684576, -0.2637647986, -0.2234544754, 0.1047904044, 0.06799905, -0.2863511443, -0.3225551844, -0.0183769017, -0.4338229895, -0.004564492, 0.5934920311, -0.113588199, 0.3091259897, -0.3395168185, 0.1448054761, 0.0052000768, 0.6057995558, 0.295388788, -0.0178113356, 
-0.2954340577, -0.0195162073, -0.6181396842, 0.1262813359, -0.1511760056, 0.0123335719, -0.0116539802, 0.142536521, 0.1738923341, -0.0330507942, -0.00510104, 0.0362966694, -0.1498090327, -0.1097010374, -0.3313085437, -0.267660886, -0.2250266969, 0.1569494605, -0.001462914, -0.3760662079, 0.1550828815, -0.095508799, 0.2300338149, -0.1793869138, 0.0802164599, -0.0206276886, -0.0355978087, 0.1578442454, 0.0340669006, 0.3657599092, -0.2838610411, 0.0340453833, -0.1346195787, -0.0631755218, -0.1700665653, 0.1941982657, 0.2527388334, 0.2654083371, -0.0962214097, -0.1707081348, -0.1380879581, 0.1567031145, 0.0587183684, 0.0371346846, 0.1429637074, -0.0507430211, 0.1262889057, 0.0587170273, 0.1863670945, 0.4029747844, -0.1113760322, -0.034302935, -0.3506343663, -0.5414937139, 0.2185380459, -0.1180816963, -0.515848279, 0.0103076659, 0.1300401539, 0.1811081767, 0.0267860945, -0.5836284161, 0.3067768812, 0.3096932471, 0.1339993179, -0.3529008031, 0.258033663, -0.0505171418, 0.0583431274, 0.0119165555, 0.1201328412, 0.0242640786, -0.2702243924, -0.2056573778, -0.0322017111 ]
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
From transformers/examples/language-modeling/run-language-modeling.py : ``` def get_dataset( args: DataTrainingArguments, tokenizer: PreTrainedTokenizer, evaluate: bool = False, cache_dir: Optional[str] = None, ): file_path = args.eval_data_file if evaluate else args.train_data_file if True: dataset = load_dataset("text", data_files=glob.glob(file_path), split='train', use_threads=True, ignore_verifications=True, save_infos=True, block_size=104857600) dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True) dataset.set_format(type='torch', columns=['input_ids']) return dataset if args.line_by_line: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size) else: return TextDataset( tokenizer=tokenizer, file_path=file_path, block_size=args.block_size, overwrite_cache=args.overwrite_cache, cache_dir=cache_dir, ) ``` No, I'm not using multiprocessing.
``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. sorry.
71
Text dataset not working with large files ``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. 
sorry. From transformers/examples/language-modeling/run-language-modeling.py : ``` def get_dataset( args: DataTrainingArguments, tokenizer: PreTrainedTokenizer, evaluate: bool = False, cache_dir: Optional[str] = None, ): file_path = args.eval_data_file if evaluate else args.train_data_file if True: dataset = load_dataset("text", data_files=glob.glob(file_path), split='train', use_threads=True, ignore_verifications=True, save_infos=True, block_size=104857600) dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True) dataset.set_format(type='torch', columns=['input_ids']) return dataset if args.line_by_line: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size) else: return TextDataset( tokenizer=tokenizer, file_path=file_path, block_size=args.block_size, overwrite_cache=args.overwrite_cache, cache_dir=cache_dir, ) ``` No, I'm not using multiprocessing.
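An alternative way to avoid the block-boundary failure entirely, independent of pyarrow's block reader, is to read the text file line by line and emit fixed-size batches yourself. The sketch below is a minimal stand-in for what a text loader's table generator might do (the function name and batch size are illustrative, not part of the `datasets` API); since Python's file iteration never splits a line, no record can straddle a buffer boundary:

```python
def iter_line_batches(path, batch_size=10_000):
    """Yield lists of stripped lines from `path` in batches of
    `batch_size`. Reading line by line sidesteps fixed-size block
    readers, which can fail when one line spans two blocks."""
    batch = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            batch.append(line.rstrip("\n"))
            if len(batch) == batch_size:
                yield batch
                batch = []
    # Flush the final, possibly short, batch.
    if batch:
        yield batch
```

Each yielded batch could then be converted to an Arrow table (e.g. via `pyarrow.Table.from_pydict({"text": batch})`) and tokenized with `.map(..., batched=True)` as in the snippet above; memory use stays bounded by the batch size rather than the file size.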
[ -0.4925667048, -0.231024459, -0.1198643669, 0.2836016417, 0.4663564563, -0.0735015422, 0.305367887, 0.5961021185, -0.1138257831, 0.0461650416, -0.0624195635, -0.0304050855, -0.1033422872, 0.3101792336, -0.1062135994, -0.038950067, -0.2278844714, 0.1130641326, -0.0946918353, 0.0499164239, -0.1125508845, 0.1110999957, -0.1209789515, -0.0956027284, -0.4804550707, -0.0829954073, 0.1185314134, 0.2535656095, -0.292010814, -0.3053141534, -0.1657070965, 0.0965851247, 0.1122045517, 0.6015895605, -0.0001037621, 0.129462257, 0.2131116986, -0.1573347449, -0.2265825868, -0.0314554796, 0.1559487879, -0.4493691325, -0.1823891997, -0.2179644406, 0.0261512175, -0.0627443939, -0.1082876548, -0.1175597459, 0.2653357387, 0.5598000884, 0.3412034512, 0.1707586795, 0.132169202, -0.112961188, 0.250754118, -0.0872496516, -0.0887441784, 0.0848499984, 0.3108182251, 0.0265772715, -0.2564009428, 0.3199440539, 0.0101718828, 0.0377003662, 0.000314123, -0.0240627509, -0.0827938169, -0.2940051258, 0.3711952865, 0.2321465015, 0.5224131942, -0.3169360459, -0.1641274989, -0.3288432062, -0.182441175, -0.3512557149, 0.2266326845, 0.1845538318, -0.111331135, 0.1161270589, -0.1436177939, -0.1014851034, -0.2063972205, -0.0293160938, -0.2615340948, 0.1022735834, -0.2105868161, -0.1004619896, 0.2575801313, -0.27024737, 0.0108299255, -0.1056218594, -0.1307831407, -0.0068053361, -0.247213304, -0.0840229169, 0.246850729, -0.199078083, 0.288942188, -0.0489984453, -0.0504209101, 0.2247608304, -0.0796671361, -0.0302130803, 0.0054933541, 0.3368761539, 0.0041130576, 0.0638572425, 0.408585608, 0.2356286943, -0.379357487, -0.2320609689, -0.1496266723, -0.5145173073, -0.0293559097, -0.1177168339, 0.0168719739, -0.0452367477, -0.1210253313, 0.1521078646, 0.1642947644, 0.1547950506, 0.1059662178, 0.4816168845, 0.1049142778, 0.2937653065, -0.1649461985, 0.1770391017, -0.1443264484, -0.1722711325, -0.2492275834, -0.0456117094, -0.0579166152, -0.0926889852, 0.2676634789, 0.1338512301, 0.2967100441, -0.1395805478, 
…remaining embedding values omitted… ]
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
I am not able to reproduce on my side :/ Could you send the versions of `datasets` and `pyarrow` you're using? Could you try to update the lib and try again? Or do you think you could try to reproduce it on Google Colab?
``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. sorry.
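The `straddling object straddles two block boundaries` error comes from pyarrow's block-based reader: the file is consumed in fixed-size blocks, and a single row longer than one block cannot be parsed. The check can be sketched in pure Python (the block size and rows below are illustrative numbers, not pyarrow's actual defaults):

```python
# Sketch of why a too-small block size fails: pyarrow-style readers consume the
# file in fixed-size blocks, so any single row longer than one block "straddles"
# a block boundary. Block sizes and rows here are made up for illustration.

def fits_in_blocks(rows, block_size):
    """Return True if every row fits inside a single read block."""
    return all(len(row) <= block_size for row in rows)

rows = ["short line", "x" * 50, "x" * 300]  # the last row is longer than a small block
print(fits_in_blocks(rows, block_size=256))   # the 300-char row straddles two blocks
print(fits_in_blocks(rows, block_size=1024))  # a larger block size accommodates it
```

This is why the traceback's hint says "try to increase block size?": pyarrow's CSV reader exposes a `block_size` option in `pyarrow.csv.ReadOptions` for exactly this situation.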
47
[ …embedding vector omitted… ]
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
Huh, weird. It's fixed on my side too. But now ```Caching processed dataset``` is taking forever - how can I disable it? Any flags?
``` Traceback (most recent call last): File "examples/language-modeling/run_language_modeling.py", line 333, in <module> main() File "examples/language-modeling/run_language_modeling.py", line 262, in main get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset dataset = load_dataset("text", data_files=file_path, split='train+test') File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__ for obj in iterable: File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status ``` **pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)** It gives the same message for both 200MB, 10GB .tx files but not for 700MB file. Can't upload due to size & copyright problem. sorry.
24
[ …embedding vector omitted… ]
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
Right after `Caching processed dataset`, your function is applied to the dataset and there's a progress bar that shows how much time is left. How much time does it take for you? Also, caching isn't supposed to slow down your processing. But if you still want to disable it you can do `.map(..., load_from_cache_file=False)`
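The cache lookup described in this comment can be illustrated with a toy memoizer: the library decides whether to reuse a previously computed result based on a fingerprint of the inputs and the identity of the mapped function. This is a simplified pure-Python sketch of the idea, not the actual `datasets` implementation; all names here are made up.

```python
import hashlib
import pickle

_cache = {}  # maps (data fingerprint, function name) -> processed result

def fingerprint(obj) -> str:
    """Hash a picklable object into a short hex digest."""
    return hashlib.sha256(pickle.dumps(obj)).hexdigest()[:16]

def cached_map(data, fn_name, fn, load_from_cache=True):
    """Apply `fn` to every item, reusing a cached result when the
    inputs and the function identity are unchanged."""
    key = (fingerprint(data), fn_name)
    if load_from_cache and key in _cache:
        return _cache[key]          # cache hit: skip recomputation
    result = [fn(x) for x in data]  # cache miss: actually run the map
    _cache[key] = result
    return result
```

Passing `load_from_cache=False` mirrors `load_from_cache_file=False`: the map runs again even if a matching cache entry exists.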
```
Traceback (most recent call last):
  File "examples/language-modeling/run_language_modeling.py", line 333, in <module>
    main()
  File "examples/language-modeling/run_language_modeling.py", line 262, in main
    get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
  File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset
    dataset = load_dataset("text", data_files=file_path, split='train+test')
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset
    ignore_verifications=ignore_verifications,
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare
    dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__
    for obj in iterable:
  File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables
    convert_options=self.config.convert_options,
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
```

**pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)**

It gives the same message for both the 200MB and 10GB .txt files but not for the 700MB file. Can't upload due to size & copyright problems, sorry.
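The "straddling object" error arises because pyarrow parses the input in fixed-size blocks, and a single row that spans more than one block cannot be held by the parser — which is why the failure depends on the longest line in the file rather than on file size. The following toy parser sketches that failure mode in pure Python; it makes no claims about pyarrow's real internals.

```python
import io

def read_rows(stream, block_size):
    """Split a character stream into newline-terminated rows, reading
    fixed-size blocks. Mimics a strict block parser: a row longer than
    one whole block is rejected, loosely analogous to pyarrow's
    'straddling object' error."""
    carry = ""
    while True:
        block = stream.read(block_size)
        if not block:
            if carry:
                yield carry
            return
        rows = (carry + block).split("\n")
        carry = rows.pop()           # possibly incomplete last row
        if len(carry) > block_size:  # row cannot fit in a single block
            raise ValueError("row straddles two block boundaries "
                             "(try to increase block size?)")
        yield from rows
```

With a large enough `block_size` the same input parses fine, which matches the error message's suggestion to increase the block size.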
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
Ah, it's much faster now (takes around 15~20 min). BTW, is there any way to set the default tensor output as plain tensors with distributed training? The ragged tensors are incompatible with TPUStrategy :(
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
> Ah, it’s much faster now (takes around 15–20 min).

Glad to see that it's faster now. What did you change exactly?

> BTW, any way to set the default tensor output as plain tensors with distributed training? The ragged tensors are incompatible with TPUStrategy :(

Oh, I didn't know about that. Feel free to open an issue to mention it. I guess what you can do for now is set the dataset format to numpy instead of tensorflow, and use a wrapper around the dataset that converts the numpy arrays to tf tensors.
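The numpy-plus-wrapper workaround could be sketched roughly as below. `TensorWrapper`, the stand-in dataset, and the `convert` argument are all hypothetical names for illustration; in practice `convert` would be `tf.constant`, but tensorflow itself is left out here to keep the sketch self-contained:

```python
import numpy as np

class TensorWrapper:
    """Wrap a numpy-formatted dataset and convert each example's arrays
    on access. In practice `convert` would be tf.constant (assumption);
    np.asarray is used here as a stand-in."""

    def __init__(self, dataset, convert):
        self.dataset = dataset
        self.convert = convert

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, i):
        # Convert every column of the example lazily, on access
        example = self.dataset[i]
        return {k: self.convert(v) for k, v in example.items()}

# Stand-in for a dataset with format set to "numpy"
fake_dataset = [{"input_ids": np.array([1, 2, 3])}]
wrapped = TensorWrapper(fake_dataset, convert=np.asarray)
print(wrapped[0]["input_ids"])
```

With tensorflow available, `TensorWrapper(dataset, convert=tf.constant)` would hand out plain dense tensors instead of ragged ones.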
```
Traceback (most recent call last):
  File "examples/language-modeling/run_language_modeling.py", line 333, in <module>
    main()
  File "examples/language-modeling/run_language_modeling.py", line 262, in main
    get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
  File "examples/language-modeling/run_language_modeling.py", line 144, in get_dataset
    dataset = load_dataset("text", data_files=file_path, split='train+test')
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset
    ignore_verifications=ignore_verifications,
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 469, in download_and_prepare
    dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/ksjae/.local/lib/python3.7/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/ksjae/.local/lib/python3.7/site-packages/tqdm/std.py", line 1129, in __iter__
    for obj in iterable:
  File "/home/ksjae/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 104, in _generate_tables
    convert_options=self.config.convert_options,
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
```

**pyarrow.lib.ArrowInvalid: straddling object straddles two block boundaries (try to increase block size?)**

It gives the same message for both 200 MB and 10 GB .txt files, but not for a 700 MB file. Can't upload the files due to size & copyright problems, sorry.
https://github.com/huggingface/datasets/issues/630
Text dataset not working with large files
> Glad to see that it's faster now. What did you change exactly?

I don't know, it just worked...? Sorry I couldn't be more helpful. Setting the format to numpy arrays is a great idea! Thanks.
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
Indeed we convert tensors to lists to be able to write in the arrow format. Because of this conversion we lose the dtype information. We should add dtype detection when we do type inference. However it would require a bit of refactoring since currently the conversion happens before the type inference. And then for your information, when reading from the arrow format we have to cast from arrow to numpy (which is fast since pyarrow has a numpy integration), and then to torch.

However there's one thing that can help you: we make sure that the dtypes correspond to what is defined in `features`. Therefore what you can do is provide `features` in `.map(preprocess, features=...)` to specify the output types. For example in your case:

```python
from datasets import Features, Value, Sequence

features = Features({
    "input_ids": Sequence(Value("int32")),
    "sembedding": Sequence(Value("float32"))
})
preprocessed_dataset = dataset.map(preprocess, features=features)

preprocessed_dataset.set_format("torch", columns=["input_ids", "sembedding"])
print(preprocessed_dataset[0]["sembedding"].dtype)
# "torch.float32"
```

Let me know if it helps
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug.

I have a `map` function that I call on the Dataset that looks like this:

```python
def preprocess(sentences: List[str]):
    token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences]
    sembeddings = stransformer.encode(sentences)
    print(sembeddings.dtype)
    return {"input_ids": token_ids, "sembedding": sembeddings}
```

Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32.

It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data_type or tensor casting yourself in a mapping function!

Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double precision float64.

```python
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
```

This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below). As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64.

```python
import torch
import numpy as np

l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]
torch_tensor = torch.tensor(l)
np_array = np.array(l)
np_to_torch = torch.from_numpy(np_array)

print(torch_tensor.dtype)  # torch.float32
print(np_array.dtype)      # float64
print(np_to_torch.dtype)   # torch.float64
```

This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
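The manual workaround mentioned above (casting after loading) can be sketched with plain NumPy, since the Arrow-to-NumPy step is where the float64 default appears. This is a minimal illustration, not the library's internals; the column values reuse the numbers from the snippet above:

```python
import numpy as np

# What comes back from Arrow is effectively a Python list of floats;
# NumPy defaults to float64 when converting it, regardless of the
# original tensor's float32 dtype.
sembedding = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]
arr = np.array(sembedding)
assert arr.dtype == np.float64  # dtype info was lost in the list round-trip

# Manual cast back to the precision the model expects.
arr32 = arr.astype(np.float32)
assert arr32.dtype == np.float32
```

In practice this cast would live in whatever step feeds the model (a dataloader or collate function), which is exactly the kind of boilerplate the issue argues the library should make unnecessary.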
156
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
If the arrow format is basically lists, why is the intermediate step to numpy necessary? I am a bit confused about that part. Thanks for your suggestion. As I have currently implemented this, I cast to torch.Tensor in my collate_fn to save disk space (so I do not have to save padded tensors to max_len but can pad up to max batch len in collate_fn) at the cost of a bit slower processing. So for me this is not relevant anymore, but I am sure it is for others!
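The batch-time padding described in this comment can be sketched roughly as below. This is a hypothetical reconstruction (the field names mirror the `preprocess` example earlier in the thread), and it uses NumPy in place of torch tensors so it stays self-contained; in a real PyTorch pipeline one would build `torch.tensor`s instead:

```python
import numpy as np

def collate_fn(batch):
    # Pad variable-length "input_ids" only up to the longest sequence in
    # *this batch*, rather than storing globally max_len-padded tensors
    # on disk. Pad value 0 is an assumption here.
    max_len = max(len(ex["input_ids"]) for ex in batch)
    input_ids = np.zeros((len(batch), max_len), dtype=np.int64)
    for i, ex in enumerate(batch):
        input_ids[i, : len(ex["input_ids"])] = ex["input_ids"]
    # Cast the embeddings to float32 explicitly so no float64 sneaks in.
    sembedding = np.stack([ex["sembedding"] for ex in batch]).astype(np.float32)
    return {"input_ids": input_ids, "sembedding": sembedding}

batch = [
    {"input_ids": [1, 2, 3], "sembedding": [0.1, 0.2]},
    {"input_ids": [4, 5], "sembedding": [0.3, 0.4]},
]
out = collate_fn(batch)
# out["input_ids"] has shape (2, 3) with the short row zero-padded,
# and out["sembedding"] is float32.
```

The trade-off is as the comment says: a little extra work per batch in exchange for compact on-disk storage.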
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug.

I have a `map` function that I call on the Dataset that looks like this:

```python
def preprocess(sentences: List[str]):
    token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences]
    sembeddings = stransformer.encode(sentences)
    print(sembeddings.dtype)
    return {"input_ids": token_ids, "sembedding": sembeddings}
```

Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32.

It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data_type or tensor casting yourself in a mapping function!

Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double precision float64.

```python
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
```

This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below). As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64.

```python
import torch
import numpy as np

l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]
torch_tensor = torch.tensor(l)
np_array = np.array(l)
np_to_torch = torch.from_numpy(np_array)

print(torch_tensor.dtype)  # torch.float32
print(np_array.dtype)      # float64
print(np_to_torch.dtype)   # torch.float64
```

This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
89
-0.2788595259, -0.1454714239, 0.2015165091, 0.1607854664, -0.0480513833, -0.0622588359, 0.215378046, -0.2433128953, 0.0800852478, 0.0955788642, 0.0832241625, -0.1688763201, -0.1338000298, 0.0844032094, -0.0721536726, -0.1631307006, -0.0209845863, -0.1953461468, -0.1867281199, -0.1678042412, 0.1126663238, 0.2799354494, 0.1018424034, 0.2763626873, 0.0417338498, 0.1626680046, 0.3086151481, 0.0054950528, -0.1270997971, 0.2509423196, 0.063558951, 0.3367977738, 0.0193025991, -0.0522144139, -0.1994487792, 0.2303505838, 0.0502555668, -0.0898418948, 0.4385260344, -0.1862916201, -0.1380280405, 0.0113268159, 0.1839900762, 0.2505348027, -0.312196672, -0.1436043233, -0.0271206591, 0.108014293, -0.1846054196, 0.1011112928, 0.1920876205, -0.1086048707, 0.126740098, 0.4136994183, -0.0296261907, 0.5634446144, -0.0397619195, -0.0074641779, 0.1605867147, -0.2002462596, 0.2797859609, 0.263199091, -0.0898274779, 0.1576704383, 0.2914582789, 0.0799151585, 0.1357218623, 0.143338576, 0.0475201681, 0.2089090943, 0.059794981, 0.0278488994, 0.4162576199, -0.1557864845, 0.1227426454, 0.0908497199, -0.4397898018, 0.2601423264, 0.3826687336, 0.0751200765, -0.305226177, -0.2359906286, -0.1305059344, 0.1503946781, -0.4149085879, 0.0025797896, -0.2539524436, 0.0859751105, -0.0619615614, -0.2450638115, -0.2087459266, -0.0473562516, 0.2073627561, 0.2033633888, -0.1458738595, 0.0916258693, -0.2127765268, -0.1652701348, 0.08959952, 0.0469890982, 0.3118065298, 0.3976687193, -0.190330416, -0.1640623808, 0.2795585692, 0.2608897388, 0.1900887787, -0.2551001608, 0.3188129663, 0.1368777901, -0.2822489142, 0.0019034073, 0.259434849, 0.0223127957, -0.0181629397, 0.2152704597, 0.1037105396, -0.0287231989, 0.1587678939, -0.0829132944, 0.2346436232, 0.0197712034, 0.6455356479, -0.0429135337, -0.1467235833, -0.0784748271, -0.207385391, -0.2123956978, -0.2224058956, 0.7710805535, -0.1710174084, 0.2005573809, -0.0775000826, 0.0657909364, 0.2825327218, 0.3724639416, 0.0875459909, -0.0523023382, -0.2187705934, 
0.3671227694, -0.27810359, 0.1055852175, 0.2141847312, -0.0478555821, -0.0218304284, 0.2236845195, 0.4392274618, 0.2588707805, 0.0447612181, -0.4173417985, 0.2164114714, 0.0331137255, -0.1054255143, -0.2507251501, -0.4146488607, 0.1806094348, 0.0294061229, -0.3498019576, 0.3316437304, 0.1330267787, -0.0241066515, -0.4745330513, 0.0894951522, -0.0053971633, 0.2187124789, 0.2378495634, 0.4377464056, 0.1268993616, -0.1296064258, -0.255592823, 0.1482959688, -0.0524923131, -0.4803517461, 0.4116392136, -0.1601350904, 0.3657150269, -0.3011513948, -0.2907636166, -0.1951593459, 0.0258601829, 0.0851696581, -0.4364663959, -0.5503332615, 0.3894612789, -0.1322183162, -0.0845067501, 0.2017910331, 0.4802950025, 0.0986787528, 0.3818565309, -0.2019928843, -0.3419241309, 0.6405502558, -0.2690971494, -0.0412026197, -0.0623358898, -0.0725637078, -0.0648595989, 0.0509312525, -0.8520144224, -0.2399327308, 0.2974298, -0.2665521801, -0.2482232749, 0.309158653, 0.1196828038, 0.134145543, -0.3145253956, 0.5315724015, -0.045614209, -0.152200073, 0.094045639, -0.1882256418 ]
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
I'm glad you managed to figure something out :) Casting from arrow to numpy can be 100x faster than casting from arrow to list. This is because arrow has an integration with numpy that allows it to instantiate numpy arrays with zero-copy from arrow. Creating python lists, on the other hand, is slow, since the list object has to be rebuilt by iterating through each element in python.
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug. I have a `map` function that I call on the Dataset that looks like this: ```python def preprocess(sentences: List[str]): token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences] sembeddings = stransformer.encode(sentences) print(sembeddings.dtype) return {"input_ids": token_ids, "sembedding": sembeddings} ``` Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32. It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data_type or tensor casting yourself in a mapping function! Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double precision float64. ```python dataset.set_format(type="torch", columns=["input_ids", "sembedding"]) ``` This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below). 
As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64. ```python import torch import numpy as np l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055] torch_tensor = torch.tensor(l) np_array = np.array(l) np_to_torch = torch.from_numpy(np_array) print(torch_tensor.dtype) # torch.float32 print(np_array.dtype) # float64 print(np_to_torch.dtype) # torch.float64 ``` This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
70
[embedding vector values omitted]
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
I encountered a similar issue: `datasets` converted my float numpy array to `torch.float64` tensors, while many pytorch operations require `torch.float32` inputs, which is very troublesome. I tried @lhoestq 's solution, but since it's mixed into the preprocess function, it's not very intuitive. I just want to share another, possibly simpler, solution: directly cast the dtype of the processed dataset. If I want to change the type of `labels` in `train_dataset` from float64 to float32, I can do this. ``` from datasets import Value, Sequence, Features feats = train_dataset.features.copy() feats['labels'].feature = Value(dtype='float32') feats = Features(feats) train_dataset.cast_(feats) ```
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug. I have a `map` function that I call on the Dataset that looks like this: ```python def preprocess(sentences: List[str]): token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences] sembeddings = stransformer.encode(sentences) print(sembeddings.dtype) return {"input_ids": token_ids, "sembedding": sembeddings} ``` Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32. It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data_type or tensor casting yourself in a mapping function! Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double precision float64. ```python dataset.set_format(type="torch", columns=["input_ids", "sembedding"]) ``` This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below). 
As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64. ```python import torch import numpy as np l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055] torch_tensor = torch.tensor(l) np_array = np.array(l) np_to_torch = torch.from_numpy(np_array) print(torch_tensor.dtype) # torch.float32 print(np_array.dtype) # float64 print(np_to_torch.dtype) # torch.float64 ``` This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
96
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
Reopening since @bhavitvyamalik started looking into it ! Also I'm posting here a function that could be helpful to support preserving the dtype of tensors. It's used to build a pyarrow array out of a numpy array and:

- it doesn't convert the numpy array to a python list
- it keeps the precision of the numpy array for the pyarrow array
- it works with multidimensional arrays (while `pa.array` can only take a 1D array as input)
- it builds the pyarrow ListArray from offsets created on-the-fly and values that come from the flattened numpy array

```python
from functools import reduce
from operator import mul

import numpy as np
import pyarrow as pa

def pa_ndarray(a):
    """Build a PyArrow ListArray from a multidimensional NumPy array"""
    values = pa.array(a.flatten())
    for i in range(a.ndim - 1):
        n_offsets = reduce(mul, a.shape[:a.ndim - i - 1], 1)
        step_offsets = a.shape[a.ndim - i - 1]
        offsets = pa.array(np.arange(n_offsets + 1) * step_offsets, type=pa.int32())
        values = pa.ListArray.from_arrays(offsets, values)
    return values

narr = np.arange(42).reshape(7, 2, 3).astype(np.uint8)
parr = pa_ndarray(narr)
assert isinstance(parr, pa.Array)
assert parr.type == pa.list_(pa.list_(pa.uint8()))
assert narr.tolist() == parr.to_pylist()
```

The only costly operation is the offsets computations. Since it doesn't iterate on the numpy array values this function is pretty fast.
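To make the offsets computation concrete, here is the same loop traced in plain Python for the `(7, 2, 3)` example shape used above; this is a sketch for illustration only, without any PyArrow calls.

```python
from functools import reduce
from operator import mul

shape = (7, 2, 3)  # shape of the example array above
ndim = len(shape)

for i in range(ndim - 1):
    # Number of sub-arrays at this nesting level, and the length of each one.
    n_offsets = reduce(mul, shape[:ndim - i - 1], 1)
    step_offsets = shape[ndim - i - 1]
    offsets = [j * step_offsets for j in range(n_offsets + 1)]
    print(i, n_offsets, step_offsets, offsets[:4])

# i=0: 14 innermost rows of length 3 -> offsets [0, 3, 6, 9, ...]
# i=1: 7 matrices of length 2       -> offsets [0, 2, 4, 6, ...]
```

Each pass wraps the flat values in one more level of list nesting, which is why only `ndim - 1` offset arrays are needed.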
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug.

I have a `map` function that I call on the Dataset that looks like this:

```python
def preprocess(sentences: List[str]):
    token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences]
    sembeddings = stransformer.encode(sentences)
    print(sembeddings.dtype)
    return {"input_ids": token_ids, "sembedding": sembeddings}
```

Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32.

It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data_type or tensor casting yourself in a mapping function!

Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double precision float64.

```python
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
```

This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below).
As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64.

```python
import torch
import numpy as np

l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]
torch_tensor = torch.tensor(l)
np_array = np.array(l)
np_to_torch = torch.from_numpy(np_array)

print(torch_tensor.dtype)  # torch.float32
print(np_array.dtype)  # float64
print(np_to_torch.dtype)  # torch.float64
```

This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
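One hedged workaround sketch, until dtypes are preserved natively: force the dtype on the NumPy side before any framework conversion, since `torch.from_numpy` keeps whatever dtype the array already has. The variable names below are illustrative, and the torch half of the round trip is omitted so the snippet stays dependency-light.

```python
import numpy as np

l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]

inferred = np.array(l)                    # float64: NumPy's default for Python floats
explicit = np.array(l, dtype=np.float32)  # single precision, preserved by torch.from_numpy

print(inferred.dtype)  # float64
print(explicit.dtype)  # float32
```

This only helps when you control the array creation; it does not fix the list round trip inside `map` itself.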
@lhoestq Have you thought about this further?

We have a use case where we're attempting to load data containing numpy arrays using the `datasets` library. When using one of the "standard" methods (`[Value(...)]` or `Sequence()`) we see ~200 samples processed per second during the call to `_prepare_split`. This slowdown is caused by the vast number of calls to `encode_nested_example` (each sequence is converted to a list, and each element in the sequence...).

Using the `Feature` `ArrayND` improves this somewhat to ~500/s as it now uses numpy's `tolist()` rather than iterating over each value in the array and converting them individually. However, it's still pretty slow and in theory it should be possible to avoid the `numpy -> python -> arrow` dance altogether. To demonstrate this, if you keep the `Feature` set to an `ArrayND` but instead return a `pa_ndarray(...)` in `_generate_examples` it skips the conversion (`return obj, False`) and hits ~11_000/s. Two orders of magnitude speed up! The problem is this then fails later on when the `ArrowWriter` tries to write the examples to disk :-(

It would be nice to have first-class support for user-defined PyArrow objects. Is this a possibility? We have _large_ datasets where even an order of magnitude difference is important so settling on the middle ~500/s is less than ideal! Is there a workaround for this or another method that should be used instead that gets near-to or equal performance to returning PyArrow arrays?
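The `tolist()` vs per-element conversion gap mentioned above can be sketched with a small timing comparison. The array size is arbitrary and the absolute numbers will vary by machine, so no speedup figure is claimed here; both paths produce the same nested lists.

```python
import timeit
import numpy as np

arr = np.random.rand(1000, 128)

# Converting element by element goes through a Python-level loop per value...
per_element = timeit.timeit(
    lambda: [[float(x) for x in row] for row in arr], number=10
)
# ...while tolist() performs the whole conversion inside NumPy.
bulk = timeit.timeit(lambda: arr.tolist(), number=10)

print(f"per-element: {per_element:.4f}s  tolist(): {bulk:.4f}s")
```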
After switching to `datasets` my model just broke. After a weekend of debugging, the issue was that my model could not handle the double that the Dataset provided, as it expected a float (but didn't give a warning, which seems a [PyTorch issue](https://discuss.pytorch.org/t/is-it-required-that-input-and-hidden-for-gru-have-the-same-dtype-float32/96221)). As a user I did not expect this bug.

I have a `map` function that I call on the Dataset that looks like this:

```python
def preprocess(sentences: List[str]):
    token_ids = [[vocab.to_index(t) for t in s.split()] for s in sentences]
    sembeddings = stransformer.encode(sentences)
    print(sembeddings.dtype)
    return {"input_ids": token_ids, "sembedding": sembeddings}
```

Given a list of `sentences` (`List[str]`), it converts those into token_ids on the one hand (a list of lists of ints; `List[List[int]]`) and into sentence embeddings on the other (a Tensor of dtype `torch.float32`). That means that I actually set the column "sembedding" to a tensor that I as a user expect to be a float32.

It appears though that behind the scenes, this tensor is converted into a **list**. I did not find this documented anywhere, but I might have missed it. From a user's perspective this is incredibly important though, because it means you cannot do any data-type or tensor casting yourself in a mapping function!

Furthermore, this can lead to issues, as was my case. My model expected float32 precision, which I thought `sembedding` was because that is what `stransformer.encode` outputs. But behind the scenes this tensor is first cast to a list, and when we then set its format, as below, this column is cast not to float32 but to double-precision float64.

```python
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
```

This happens because apparently there is an intermediate step of casting to a **numpy** array (?) **whose dtype creation/deduction is different from torch dtypes** (see the snippet below). As you can see, this means that the dtype is not preserved: if I got it right, the dataset goes from torch.float32 -> list -> float64 (numpy) -> torch.float64.

```python
import torch
import numpy as np

l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]
torch_tensor = torch.tensor(l)
np_array = np.array(l)
np_to_torch = torch.from_numpy(np_array)

print(torch_tensor.dtype)  # torch.float32
print(np_array.dtype)      # float64
print(np_to_torch.dtype)   # torch.float64
```

This might lead to unwanted behaviour. I understand that the whole library is probably built around casting from numpy to other frameworks, so this might be difficult to solve. Perhaps `set_format` should include a `dtypes` option where for each input column the user can specify the wanted precision. The alternative is that the user needs to cast manually after loading data from the dataset, but that does not seem user-friendly, makes the dataset less portable, and might use more space in memory as well as on disk than is actually needed.
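A minimal sketch of a workaround for the precision loss described above, assuming the column really does round-trip through a python list and numpy: request the dtype explicitly when rebuilding the array instead of relying on numpy's default float64 inference (the variable names here are illustrative).

```python
import numpy as np

# The python list that ends up stored behind the scenes for the column.
l = [-0.03010837361216545, -0.035979013890028, -0.016949838027358055]

# Default inference from a list of python floats yields float64 ...
inferred = np.array(l)
assert inferred.dtype == np.float64

# ... while requesting the dtype explicitly preserves float32 precision,
# e.g. before handing the array to torch.from_numpy(...).
explicit = np.array(l, dtype=np.float32)
assert explicit.dtype == np.float32
```

The same idea applies on the torch side: calling `.float()` on the loaded tensor after `set_format` would also restore float32, at the cost of a copy.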
https://github.com/huggingface/datasets/issues/625
dtype of tensors should be preserved
Hi ! It would be awesome to achieve this speed for numpy arrays ! For now we have to use `encode_nested_example` to convert numpy arrays to python lists since pyarrow doesn't support multidimensional numpy arrays (only 1D).

Maybe let's start a new PR from your PR @bhavitvyamalik (idk why we didn't answer your PR at that time, sorry about that). Basically the idea is to allow `TypedSequence` to support numpy arrays as you did, and remove the numpy->python casting in `_cast_to_python_objects`. This is really important since we are starting to have a focus on other modalities than text as well (audio, images).

Though until then @samgd, there is another feature that may interest you and that may give you the speed you want: in a dataset script you can subclass either a `GeneratorBasedBuilder` (with the `_generate_examples` method) or an `ArrowBasedBuilder` if you want. The `ArrowBasedBuilder` allows you to yield arrow data by implementing the `_generate_tables` method (it's the same as `_generate_examples` except you must yield arrow tables). Since the data are already in arrow format, it doesn't call `encode_nested_example`.

Let me know if that helps.
https://github.com/huggingface/datasets/issues/623
Custom feature types in `load_dataset` from CSV
Currently `csv` doesn't support the `features` attribute (unlike `json`). What you can do for now is cast the features using the in-place transform `cast_` ```python from datasets import load_dataset dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label']) dataset.cast_(emotion_features) ```
I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)} ``` **Things I've tried:** - 
deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction.
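A common workaround for this kind of problem (an assumption on my part, not something confirmed in this thread) is to map the string labels to integer ids first, since `ClassLabel` stores labels as integers under the hood. A minimal, dependency-free sketch of that encoding step, mirroring what `ClassLabel.str2int` would do per example before a cast:

```python
class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]
str2int = {name: i for i, name in enumerate(class_names)}

def encode_label(example):
    # Mirrors what datasets' ClassLabel.str2int would do for one example,
    # so the 'label' column holds integer ids before casting to ClassLabel.
    # This function could be passed to Dataset.map.
    return {"label": str2int[example["label"]]}

print(encode_label({"text": "i am feeling grouchy", "label": "anger"}))
# → {'label': 3}
```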
38
https://github.com/huggingface/datasets/issues/623
Custom feature types in `load_dataset` from CSV
Hi @lhoestq we've tried out your suggestion but are now running into the following error: ``` --------------------------------------------------------------------------- ValueError Traceback (most recent call last) <ipython-input-163-81ffd5ac18c9> in <module> ----> 1 dataset.cast_(emotion_features) /usr/local/lib/python3.6/dist-packages/datasets/dataset_dict.py in cast_(self, features) 125 self._check_values_type() 126 for dataset in self.values(): --> 127 dataset.cast_(features=features) 128 129 def remove_columns_(self, column_names: Union[str, List[str]]): /usr/local/lib/python3.6/dist-packages/datasets/fingerprint.py in wrapper(*args, **kwargs) 161 # Call actual function 162 --> 163 out = func(self, *args, **kwargs) 164 165 # Update fingerprint of in-place transforms + update in-place history of transforms /usr/local/lib/python3.6/dist-packages/datasets/arrow_dataset.py in cast_(self, features) 602 self._info.features = features 603 schema = pa.schema(features.type) --> 604 self._data = self._data.cast(schema) 605 606 @fingerprint(inplace=True) /usr/local/lib/python3.6/dist-packages/pyarrow/table.pxi in pyarrow.lib.Table.cast() ValueError: Target schema's field names are not matching the table's field names: ['text', 'label'], ['label', 'text'] ``` Looking at the types in `emotion_features` we see that `label` and `text` appear to be swapped in the Arrow table: ``` emotion_features.type StructType(struct<label: int64, text: string>) ``` Did we define the `emotion_features` incorrectly? We just followed the instructions from the [docs](https://huggingface.co/docs/datasets/features.html?highlight=features#dataset-features), but perhaps we misunderstood something 😬
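The error message suggests the two schemas differ only in field order (`['text', 'label']` vs `['label', 'text']`). A minimal sketch of one possible fix, assuming the mismatch really is just ordering: rebuild the features mapping so its keys follow the table's column order before calling `cast_` (the `reorder_features` helper below is hypothetical, not part of the `datasets` API):

```python
def reorder_features(features, column_order):
    """Return a new mapping whose keys follow the table's column order."""
    missing = set(column_order) - set(features)
    if missing:
        raise ValueError(f"features is missing columns: {sorted(missing)}")
    return {name: features[name] for name in column_order}

# 'label' and 'text' declared in the opposite order to the Arrow table:
emotion_features = {"label": "ClassLabel(...)", "text": "Value('string')"}
fixed = reorder_features(emotion_features, ["text", "label"])
print(list(fixed))
# → ['text', 'label']
```

With real `datasets` objects, the idea would be `dataset.cast_(Features(reorder_features(emotion_features, dataset.column_names)))`, so the target schema's field names line up with the table's.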
I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)} ``` **Things I've tried:** - 
deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction.
168
0.2992790043, -0.7319203019, -0.0987627059, 0.2255544513, -0.1211582422, 0.1216320246, 0.2576980293, 0.0417748876, 0.118810311, -0.1557261944, -0.0532571152, 0.3186097443, 0.0825062543, -0.130445987, -0.2680418789, -0.1397157907, 0.1819899678, 0.0978387147, -0.3734225929, -0.0446045436, 0.0930458978, 0.0255389065, -0.0383037329, -0.2225392312, 0.0113509074, 0.0509980135, 0.2269797325, 0.0419545323, 0.1949174106, -0.0109650046, 0.1166932881, -0.2026651502, -0.0816122591, -0.1552953273, 0.0546442531, -0.24560453, 0.5321160555, -0.3866824806, 0.5693586469, -0.4407523274, 0.1738170236, 0.2310235202, -0.4541362822, 0.0961409435, -0.0945512056, -0.1327077448, -0.1259728074, 0.0628547817, 0.3869293332, -0.1089812666, 0.1471090019, 0.1429864764, -0.3117980957, 0.3475706577, -0.0603824519, -0.2231801748, -0.0993921012, 0.2244296074, 0.1183506399, 0.3281213641, -0.3093108535, -0.037179336, 0.495208919, -0.1033606604, 0.2260008156, 0.1840551049, -0.3150680363, 0.0563165694, -0.1813355386, 0.142545566, 0.0503481925, 0.2422545999, -0.239048481, -0.477943033 ]
https://github.com/huggingface/datasets/issues/623
Custom feature types in `load_dataset` from CSV
In general, I don't think there is any hard reason we don't allow to use `features` in the csv script, right @lhoestq? Should I add it?
I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)} ``` **Things I've tried:** - 
deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction.
26
Custom feature types in `load_dataset` from CSV I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], 
names_file=None, id=None)} ``` **Things I've tried:** - deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction. In general, I don't think there is any hard reason we don't allow to use `features` in the csv script, right @lhoestq? Should I add it?
[ 0.0802029371, -0.2782895565, -0.0531791002, 0.3509228528, 0.3172290921, -0.1943103671, 0.5701336265, 0.1113843173, 0.446125567, 0.0253307968, 0.094743818, 0.3161779642, -0.0919102058, 0.3901149631, -0.0581879392, 0.0267207623, -0.1612750292, 0.3348048329, -0.0091219395, -0.3497941494, -0.2713248134, 0.1799537241, -0.0930470228, -0.013486281, -0.3072173595, 0.3395892084, 0.2221690118, 0.143438682, 0.0583121479, -0.3765773177, 0.4237108827, 0.1486945301, 0.3030694425, 0.0801386833, -0.0001151109, 0.076882951, 0.0293593034, -0.1740052551, -0.0040853135, -0.3799196482, -0.0387843475, -0.202906549, 0.4247084856, -0.3651529551, -0.2385733575, -0.4564034939, -0.2810540497, -0.1292716712, 0.2869226635, 0.3292907178, 0.1512023211, -0.0502676442, -0.2750738859, 0.2184948921, 0.3587994277, 0.6755331755, -0.2523880601, 0.1870210171, -0.0406346805, -0.1123656034, 0.0939736664, -0.1022861525, -0.1438640356, 0.353423357, 0.4926100373, 0.1710617542, -0.0643914789, -0.1303499043, -0.1733283252, 0.2814249694, 0.3474104702, 0.0562938415, -0.0464358665, -0.3231964707, -0.1238626838, -0.2973287404, 0.382417351, 0.0848604888, -0.1818488538, 0.1604423374, -0.0789410546, 0.4355747104, -0.0581504069, 0.325290978, 0.0186878741, -0.0715494901, -0.2392115891, 0.1055341363, 0.0308264177, -0.2173109949, 0.0974365473, -0.349162221, 0.2218383104, 0.1172011793, -0.0551007129, 0.0995839685, 0.003935298, -0.0604649857, -0.1210798174, 0.0352323502, 0.2292757034, 0.1932828128, -0.1932462752, 0.0317142308, 0.2777897716, 0.2441823781, 0.2002111226, -0.2317834347, 0.1527379453, 0.104715839, -0.423279196, 0.0183334798, 0.0277890079, -0.3080183268, 0.3964600563, 0.0869552791, 0.5022854805, -0.2398283482, 0.057662189, -0.0980123505, 0.0428094119, -0.0321919359, 0.1175668389, 0.1835573018, -0.0262655485, 0.580987215, -0.0656144395, 0.1488876641, -0.3211800456, 0.1283437163, -0.0581272393, -0.2226472348, -0.0374680161, -0.0493203774, 0.4030345678, 0.0941050053, 0.2478335947, 0.199026376, -0.0237025656, 
-0.3186685145, -0.1270062923, -0.095956482, 0.1245452017, 0.0443357415, -0.441672802, 0.3375269473, 0.2412355989, -0.2255857289, -0.2444109023, 0.0952860862, -0.2509181201, -0.2328005731, 0.1970371008, 0.155583322, -0.1784608662, -0.020963341, -0.1068923622, 0.0988962054, 0.1280108392, 0.258272469, 0.0263721272, -0.4864162505, -0.3809948564, -0.3249777257, -0.0254870169, 0.395431757, -0.4154388011, -0.2039044201, 0.0394422971, -0.0463084355, 0.1385591328, 0.2530902922, -0.3661009073, 0.1451618969, -0.2393054813, 0.2845746875, 0.5265449286, 0.0433983952, -0.2177204043, 0.4294439852, 0.1565900892, 0.3267546892, 0.186067909, 0.0742609054, 0.0775938481, 0.1260918379, 0.1594995856, 0.4206011593, 0.1657207608, 0.067381829, -0.1837767363, -0.1553835273, 0.1656042188, -0.0030713007, -0.3288176954, 0.3090257049, 0.2257002145, -0.5133963823, 0.2171115279, -0.1282680333, -0.2264052331, -0.0468893833, 0.4030795097, 0.5202860236, 0.0057925545, -0.0192435011, -0.4824975431, 0.2246333212, -0.1493486017, -0.0522528514, -0.0067957081, -0.3252949715, -0.4934935868, -0.0567670278, -0.2471701205, 0.1995168179, 0.0646642521, 0.3136295676, -0.3358725309, 0.158852458, -0.0870927498, 0.0709818974, -0.2088879943, -0.2521633804, -0.0801552907, -0.0217761993, 0.1720780581, -0.0900507942, 0.0157924816, 0.0775025189, 0.2781896293, 0.0767479241, -0.3883282542, 0.1277149916, 0.2795607746, 0.161347121, -0.1129958481, 0.2116078734, -0.003028132, -0.0095146522, -0.0278554689, 0.1732269526, 0.1935627162, -0.0937050357, -0.0290848613, 0.6565464735, 0.1539309025, 0.248978585, -0.267847687, -0.1128531545, 0.2808354199, -0.0006699339, -0.1841064841, -0.0670059621, -0.2146350145, -0.0485210344, 0.0014962032, 0.4216480851, -0.3299654424, -0.1157782227, 0.4788911045, 0.0954587162, 0.1291624606, -0.080403775, -0.0172786154, -0.029092107, 0.0167229306, -0.2370506525, 0.4729243517, -0.0610825047, -0.0803651214, -0.008282477, 0.0192122981, -0.2243368626, 0.0599996336, -0.1394981444, -0.1910201311, 
0.2915715873, 0.0343267135, -0.1579158157, -0.3376893699, 0.2412527055, -0.1466074288, -0.2058868557, -0.6342836618, -0.0990627632, -0.5823050737, 0.1683030576, -0.4637621641, -0.0822290182, -0.0697951391, 0.0163359642, -0.2120193094, 0.1033788174, -0.0353965759, 0.0900433883, -0.1784473062, 0.3625626564, 0.0643080026, -0.8123733997, 0.2222117037, 0.1188195944, -0.4499817193, -0.0581740029, 0.1749914736, 0.2886118591, 0.035560213, 0.0289640799, -0.2870357931, 0.0080760773, -0.013293393, -0.1399184018, 0.1707452089, 0.5402771831, 0.1732706428, 0.1756926477, 0.1394028962, -0.1260670424, 0.398373872, -0.013117075, 0.0591222718, -0.125023067, -0.0263379999, 0.1114315242, -0.2428704202, -0.6793350577, -0.1501974165, -0.2003139257, 0.2844244242, 0.2139618099, 0.0511261038, 0.1971264333, 0.3683050275, -0.2396503687, 0.154696852, -0.0263568908, -0.2073857188, -0.0725122914, 0.4438351989, -0.1748046577, -0.2186149955, 0.0093016699, -0.3093701601, -0.3122235239, 0.1340721399, -0.311917603, 0.0612176731, -0.3492911756, 0.5237388611, 0.0541131832, 0.0240740459, 0.1082263291, -0.1130075902, 0.0396240093, -0.1758636534, -0.3381878734, 0.4521702528, 0.3127171993, 0.0991491079, 0.4245264828, -0.0252545495, -0.4129968882, 0.4985539615, -0.2855617702, -0.1919133514, 0.5953598619, -0.1800899059, 0.1787190437, -0.0728683174, -0.2670247853, -0.1126029119, -0.0425648987, -0.1116564125, 0.2236914933, -0.0085569602, -0.2700244188, -0.1810297817, 0.2123802006, -0.217617467, -0.2608530819, 0.2379056811, 0.1473992616, -0.0105134472, -0.2122009695, -0.0562984794, -0.3047746718, -0.0011162423, -0.0551236011, 0.4722041786, -0.1199495792, -0.0018577576, -0.3208261132, 0.2846589684, 0.0782866552, 0.1430095732, 0.2052050978, 0.0887331888, 0.051434949, -0.2681925297, -0.0241439305, 0.1990233064, 0.2873642147, 0.0269077923, 0.1157548428, 0.2087188661, -0.1225695312, -0.0485270135, -0.3683555722, 0.1046849564, -0.0980246589, -0.1456140876, 0.4273835123, -0.0174688399, -0.2047405243, 0.0689943209, 
0.1473540962, -0.3237197995, -0.3247002661, -0.0766783357, -0.1038025171, 0.0663045123, -0.1572972089, 0.0566512868, 0.2637952864, -0.4567133784, -0.2949299812, -0.3002608418, -0.066938892, 0.4277196527, 0.0272210501, 0.3608856797, 0.0342603996, -0.1682980806, -0.0919502601, 0.3646996617, -0.2676113546, 0.5217867494, -0.0273074023, -0.3972116113, -0.1356137842, -0.2784286141, 0.1692483723, 0.2122795284, -0.3540075123, 0.0106415674, 0.0537487306, -0.1274204701, -0.3462891877, 0.4716668129, 0.468101114, -0.1359972954, 0.1705083102, -0.7704422474, 0.3951099813, -0.2586807311, -0.1935913265, -0.0233424045, -0.2719199955, -0.2325712293, 0.3815222383, 0.0811238885, 0.603358686, -0.1063162163, 0.1100441441, 0.2331689447, 0.2517562509, 0.0767837614, 0.0193489939, -0.1155477762, -0.225033462, 0.1044473052, -0.0204280987, -0.0284995362, 0.1987363398, 0.4217339456, 0.0612300932, 0.0763242468, -0.2039116919, 0.4627034068, -0.1451128274, 0.1012344435, 0.0826516822, -0.2062902749, -0.1408655196, 0.0782317147, -0.0795416087, 0.2758697867, 0.0676954389, -0.1689649671, 0.002944693, -0.2168521732, 0.0900498629, -0.1180128157, -0.3544569314, 0.1029872224, -0.0903590173, -0.1665311605, 0.049425222, 0.4377089143, 0.0453950651, 0.0724744201, 0.0026920699, -0.0892433226, 0.3455891609, 0.072393015, -0.3652020097, 0.0003103316, 0.1250232458, 0.1094889566, -0.4165462554, -0.2229630053, 0.0246598013, -0.2354820073, -0.1175188422, -0.1140048504, -0.0075319745, -0.591217041, -0.5814695358, -0.2086255103, 0.1575357318, 0.0230599046, 0.0865535364, -0.1536920965, 0.0544936135, 0.4086313546, 0.0256567709, -0.2936863303, -0.241415754, 0.1097035408, 0.0934434906, -0.1074568778, 0.4937072992, -0.1585173011, 0.0675675794, -0.233401373, 0.1656764448, 0.029134158, -0.2786064744, 0.1651809216, 0.2246357054, -0.2610379457, -0.1285685748, 0.6697030067, 0.099218294, 0.3524081111, 0.0855976492, -0.3124064803, -0.3024981618, -0.1660951823, 0.1188770458, 0.3957430422, 0.0464489907, 0.165283531, 0.1416230798, 
-0.2059973329, -0.2483381033, 0.0333888754, 0.1151636913, 0.0171486028, 0.1608875841, 0.1793146431, 0.0733209848, 0.2779511809, 0.1558929086, -0.0744741112, -0.054942064, -0.1906138361, -0.2437295616, 0.142841056, -0.101410687, 0.207324177, -0.1902380735, -0.2917286158, 0.1587088406, -0.2286087275, 0.1630496234, 0.2692561746, -0.0584335811, 0.5097593069, -0.1074661165, 0.2579496205, -0.039613951, 0.3279567361, -0.1831431091, 0.0911630243, 0.2839573622, -0.106813699, -0.0950942636, 0.0866204947, -0.138792783, -0.2092009187, -0.2366227657, 0.3477989435, -0.058427982, -0.14881666, -0.0404731855, 0.2657979131, 0.1655934155, 0.4483645856, -0.0675903261, 0.0337995067, 0.1700199097, 0.201385662, -0.137362197, 0.0610731207, 0.248262018, 0.0578231066, 0.1259520501, 0.5288022757, 0.4163002372, -0.0609546229, -0.1263371706, -0.2910684943, 0.1425026357, 0.1937829107, 0.1487296224, 0.3334578872, 0.0122656673, -0.0583514944, 0.0439712405, 0.1607937813, 0.1520165652, 0.3418112397, 0.0292780921, -0.2028440088, 0.000846345, 0.1873262376, 0.0220549256, -0.6729874015, 0.3362361491, 0.1451907903, -0.2824925184, 0.398786217, 0.0277415514, 0.3040840924, -0.2217609584, -0.2226588875, -0.1894534081, 0.0515003093, -0.0737111345, -0.2347967029, -0.1169681549, -0.1961243153, -0.3086970448, -0.120080322, 0.0403751805, 0.1298108846, 0.0409613103, 0.067796886, -0.0571396239, -0.1646540314, 0.1864990592, -0.2584417462, 0.2858180106, 0.0312477648, 0.0454407483, -0.0801776499, 0.1768497825, 0.2635410726, -0.0922868699, -0.0588778071, 0.6529514194, -0.0708786845, -0.2127463818, -0.0827961937, 0.0492597297, -0.1479175687, -0.1888008714, -0.0592486486, 0.5324171782, 0.2739832997, 0.1294229925, -0.1248222515, 0.226760447, -0.2451611161, 0.2631815672, -0.6641113758, 0.3068289161, 0.246119678, 0.0778738633, -0.1524757147, 0.0674320608, -0.1703533083, -0.1933409125, 0.4078934193, 0.3887724876, 0.538659811, 0.0947165489, 0.0588995665, -0.2939715087, 0.2791739404, -0.2840396464, 0.086139977, -0.057496123, 
0.2992790043, -0.7319203019, -0.0987627059, 0.2255544513, -0.1211582422, 0.1216320246, 0.2576980293, 0.0417748876, 0.118810311, -0.1557261944, -0.0532571152, 0.3186097443, 0.0825062543, -0.130445987, -0.2680418789, -0.1397157907, 0.1819899678, 0.0978387147, -0.3734225929, -0.0446045436, 0.0930458978, 0.0255389065, -0.0383037329, -0.2225392312, 0.0113509074, 0.0509980135, 0.2269797325, 0.0419545323, 0.1949174106, -0.0109650046, 0.1166932881, -0.2026651502, -0.0816122591, -0.1552953273, 0.0546442531, -0.24560453, 0.5321160555, -0.3866824806, 0.5693586469, -0.4407523274, 0.1738170236, 0.2310235202, -0.4541362822, 0.0961409435, -0.0945512056, -0.1327077448, -0.1259728074, 0.0628547817, 0.3869293332, -0.1089812666, 0.1471090019, 0.1429864764, -0.3117980957, 0.3475706577, -0.0603824519, -0.2231801748, -0.0993921012, 0.2244296074, 0.1183506399, 0.3281213641, -0.3093108535, -0.037179336, 0.495208919, -0.1033606604, 0.2260008156, 0.1840551049, -0.3150680363, 0.0563165694, -0.1813355386, 0.142545566, 0.0503481925, 0.2422545999, -0.239048481, -0.477943033 ]
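The issue above hinges on what a `ClassLabel` feature is supposed to do: map each label string (e.g. `"anger"`) to a stable integer class id and back. Independent of the csv loading script, that core behavior can be sketched in plain Python. This is a toy stand-in, not the actual `datasets.ClassLabel` implementation, though the real class does expose `str2int`/`int2str` methods with this meaning:

```python
class SimpleClassLabel:
    """Toy stand-in for datasets.ClassLabel: maps label names <-> integer ids."""

    def __init__(self, names):
        self.names = list(names)
        # ids are assigned by position in the names list
        self._str2int = {name: i for i, name in enumerate(self.names)}

    def str2int(self, name):
        # convert a label string (e.g. "anger") to its class id
        return self._str2int[name]

    def int2str(self, idx):
        # convert a class id back to its label string
        return self.names[idx]


class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]
label = SimpleClassLabel(names=class_names)
print(label.str2int("anger"))   # 3
print(label.int2str(0))         # sadness
```

With a feature type like this wired into the csv loader, the `label` column of the emotion dataset would come back as integer ids instead of raw strings, which is the behavior the issue author expected from passing `features` to `load_dataset`.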
https://github.com/huggingface/datasets/issues/623
Custom feature types in `load_dataset` from CSV
> In general, I don't think there is any hard reason we don't allow to use `features` in the csv script, right @lhoestq? > > Should I add it? Sure let's add it. Setting the convert options should do the job > Hi @lhoestq we've tried out your suggestion but are now running into the following error: > > ``` > --------------------------------------------------------------------------- > ValueError Traceback (most recent call last) > <ipython-input-163-81ffd5ac18c9> in <module> > ----> 1 dataset.cast_(emotion_features) > > /usr/local/lib/python3.6/dist-packages/pyarrow/table.pxi in pyarrow.lib.Table.cast() > > ValueError: Target schema's field names are not matching the table's field names: ['text', 'label'], ['label', 'text'] > ``` > > Did we define the `emotion_features` incorrectly? We just followed the instructions from the [docs](https://huggingface.co/docs/datasets/features.html?highlight=features#dataset-features), but perhaps we misunderstood something 😬 Thanks for reporting, that's a bug :) I'm fixing it right now
I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)} ``` **Things I've tried:** - 
deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction.
136
Custom feature types in `load_dataset` from CSV I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same independent of the value of `features`. I am working with the local files from the emotion dataset. To get the data you can use the following code: ```Python from pathlib import Path import wget EMOTION_PATH = Path("./data/emotion") DOWNLOAD_URLS = [ "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1", "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1", "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1", ] if not Path.is_dir(EMOTION_PATH): Path.mkdir(EMOTION_PATH) for url in DOWNLOAD_URLS: wget.download(url, str(EMOTION_PATH)) ``` The first five lines of the train set are: ``` i didnt feel humiliated;sadness i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness im grabbing a minute to post i feel greedy wrong;anger i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love i am feeling grouchy;anger ``` Here the code to reproduce the issue: ```Python from datasets import Features, Value, ClassLabel, load_dataset class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"] emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)}) file_dict = {'train': EMOTION_PATH/'train.txt'} dataset = load_dataset('csv', data_files=file_dict, delimiter=';', column_names=['text', 'label'], features=emotion_features) ``` **Observed behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': Value(dtype='string', id=None)} ``` **Expected behaviour:** ```Python dataset['train'].features ``` ```Python {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], 
names_file=None, id=None)} ``` **Things I've tried:** - deleting the cache - trying other types such as `int64` Am I missing anything? Thanks for any pointer in the right direction. > In general, I don't think there is any hard reason we don't allow to use `features` in the csv script, right @lhoestq? > > Should I add it? Sure let's add it. Setting the convert options should do the job > Hi @lhoestq we've tried out your suggestion but are now running into the following error: > > ``` > --------------------------------------------------------------------------- > ValueError Traceback (most recent call last) > <ipython-input-163-81ffd5ac18c9> in <module> > ----> 1 dataset.cast_(emotion_features) > > /usr/local/lib/python3.6/dist-packages/pyarrow/table.pxi in pyarrow.lib.Table.cast() > > ValueError: Target schema's field names are not matching the table's field names: ['text', 'label'], ['label', 'text'] > ``` > > Did we define the `emotion_features` incorrectly? We just followed the instructions from the [docs](https://huggingface.co/docs/datasets/features.html?highlight=features#dataset-features), but perhaps we misunderstood something 😬 Thanks for reporting, that's a bug :) I'm fixing it right now
[ 0.0802029371, -0.2782895565, -0.0531791002, 0.3509228528, 0.3172290921, -0.1943103671, 0.5701336265, 0.1113843173, 0.446125567, 0.0253307968, 0.094743818, 0.3161779642, -0.0919102058, 0.3901149631, -0.0581879392, 0.0267207623, -0.1612750292, 0.3348048329, -0.0091219395, -0.3497941494, -0.2713248134, 0.1799537241, -0.0930470228, -0.013486281, -0.3072173595, 0.3395892084, 0.2221690118, 0.143438682, 0.0583121479, -0.3765773177, 0.4237108827, 0.1486945301, 0.3030694425, 0.0801386833, -0.0001151109, 0.076882951, 0.0293593034, -0.1740052551, -0.0040853135, -0.3799196482, -0.0387843475, -0.202906549, 0.4247084856, -0.3651529551, -0.2385733575, -0.4564034939, -0.2810540497, -0.1292716712, 0.2869226635, 0.3292907178, 0.1512023211, -0.0502676442, -0.2750738859, 0.2184948921, 0.3587994277, 0.6755331755, -0.2523880601, 0.1870210171, -0.0406346805, -0.1123656034, 0.0939736664, -0.1022861525, -0.1438640356, 0.353423357, 0.4926100373, 0.1710617542, -0.0643914789, -0.1303499043, -0.1733283252, 0.2814249694, 0.3474104702, 0.0562938415, -0.0464358665, -0.3231964707, -0.1238626838, -0.2973287404, 0.382417351, 0.0848604888, -0.1818488538, 0.1604423374, -0.0789410546, 0.4355747104, -0.0581504069, 0.325290978, 0.0186878741, -0.0715494901, -0.2392115891, 0.1055341363, 0.0308264177, -0.2173109949, 0.0974365473, -0.349162221, 0.2218383104, 0.1172011793, -0.0551007129, 0.0995839685, 0.003935298, -0.0604649857, -0.1210798174, 0.0352323502, 0.2292757034, 0.1932828128, -0.1932462752, 0.0317142308, 0.2777897716, 0.2441823781, 0.2002111226, -0.2317834347, 0.1527379453, 0.104715839, -0.423279196, 0.0183334798, 0.0277890079, -0.3080183268, 0.3964600563, 0.0869552791, 0.5022854805, -0.2398283482, 0.057662189, -0.0980123505, 0.0428094119, -0.0321919359, 0.1175668389, 0.1835573018, -0.0262655485, 0.580987215, -0.0656144395, 0.1488876641, -0.3211800456, 0.1283437163, -0.0581272393, -0.2226472348, -0.0374680161, -0.0493203774, 0.4030345678, 0.0941050053, 0.2478335947, 0.199026376, -0.0237025656, 
-0.3186685145, -0.1270062923, -0.095956482, 0.1245452017, 0.0443357415, -0.441672802, 0.3375269473, 0.2412355989, -0.2255857289, -0.2444109023, 0.0952860862, -0.2509181201, -0.2328005731, 0.1970371008, 0.155583322, -0.1784608662, -0.020963341, -0.1068923622, 0.0988962054, 0.1280108392, 0.258272469, 0.0263721272, -0.4864162505, -0.3809948564, -0.3249777257, -0.0254870169, 0.395431757, -0.4154388011, -0.2039044201, 0.0394422971, -0.0463084355, 0.1385591328, 0.2530902922, -0.3661009073, 0.1451618969, -0.2393054813, 0.2845746875, 0.5265449286, 0.0433983952, -0.2177204043, 0.4294439852, 0.1565900892, 0.3267546892, 0.186067909, 0.0742609054, 0.0775938481, 0.1260918379, 0.1594995856, 0.4206011593, 0.1657207608, 0.067381829, -0.1837767363, -0.1553835273, 0.1656042188, -0.0030713007, -0.3288176954, 0.3090257049, 0.2257002145, -0.5133963823, 0.2171115279, -0.1282680333, -0.2264052331, -0.0468893833, 0.4030795097, 0.5202860236, 0.0057925545, -0.0192435011, -0.4824975431, 0.2246333212, -0.1493486017, -0.0522528514, -0.0067957081, -0.3252949715, -0.4934935868, -0.0567670278, -0.2471701205, 0.1995168179, 0.0646642521, 0.3136295676, -0.3358725309, 0.158852458, -0.0870927498, 0.0709818974, -0.2088879943, -0.2521633804, -0.0801552907, -0.0217761993, 0.1720780581, -0.0900507942, 0.0157924816, 0.0775025189, 0.2781896293, 0.0767479241, -0.3883282542, 0.1277149916, 0.2795607746, 0.161347121, -0.1129958481, 0.2116078734, -0.003028132, -0.0095146522, -0.0278554689, 0.1732269526, 0.1935627162, -0.0937050357, -0.0290848613, 0.6565464735, 0.1539309025, 0.248978585, -0.267847687, -0.1128531545, 0.2808354199, -0.0006699339, -0.1841064841, -0.0670059621, -0.2146350145, -0.0485210344, 0.0014962032, 0.4216480851, -0.3299654424, -0.1157782227, 0.4788911045, 0.0954587162, 0.1291624606, -0.080403775, -0.0172786154, -0.029092107, 0.0167229306, -0.2370506525, 0.4729243517, -0.0610825047, -0.0803651214, -0.008282477, 0.0192122981, -0.2243368626, 0.0599996336, -0.1394981444, -0.1910201311, 
0.2915715873, 0.0343267135, -0.1579158157, -0.3376893699, 0.2412527055, -0.1466074288, -0.2058868557, -0.6342836618, -0.0990627632, -0.5823050737, 0.1683030576, -0.4637621641, -0.0822290182, -0.0697951391, 0.0163359642, -0.2120193094, 0.1033788174, -0.0353965759, 0.0900433883, -0.1784473062, 0.3625626564, 0.0643080026, -0.8123733997, 0.2222117037, 0.1188195944, -0.4499817193, -0.0581740029, 0.1749914736, 0.2886118591, 0.035560213, 0.0289640799, -0.2870357931, 0.0080760773, -0.013293393, -0.1399184018, 0.1707452089, 0.5402771831, 0.1732706428, 0.1756926477, 0.1394028962, -0.1260670424, 0.398373872, -0.013117075, 0.0591222718, -0.125023067, -0.0263379999, 0.1114315242, -0.2428704202, -0.6793350577, -0.1501974165, -0.2003139257, 0.2844244242, 0.2139618099, 0.0511261038, 0.1971264333, 0.3683050275, -0.2396503687, 0.154696852, -0.0263568908, -0.2073857188, -0.0725122914, 0.4438351989, -0.1748046577, -0.2186149955, 0.0093016699, -0.3093701601, -0.3122235239, 0.1340721399, -0.311917603, 0.0612176731, -0.3492911756, 0.5237388611, 0.0541131832, 0.0240740459, 0.1082263291, -0.1130075902, 0.0396240093, -0.1758636534, -0.3381878734, 0.4521702528, 0.3127171993, 0.0991491079, 0.4245264828, -0.0252545495, -0.4129968882, 0.4985539615, -0.2855617702, -0.1919133514, 0.5953598619, -0.1800899059, 0.1787190437, -0.0728683174, -0.2670247853, -0.1126029119, -0.0425648987, -0.1116564125, 0.2236914933, -0.0085569602, -0.2700244188, -0.1810297817, 0.2123802006, -0.217617467, -0.2608530819, 0.2379056811, 0.1473992616, -0.0105134472, -0.2122009695, -0.0562984794, -0.3047746718, -0.0011162423, -0.0551236011, 0.4722041786, -0.1199495792, -0.0018577576, -0.3208261132, 0.2846589684, 0.0782866552, 0.1430095732, 0.2052050978, 0.0887331888, 0.051434949, -0.2681925297, -0.0241439305, 0.1990233064, 0.2873642147, 0.0269077923, 0.1157548428, 0.2087188661, -0.1225695312, -0.0485270135, -0.3683555722, 0.1046849564, -0.0980246589, -0.1456140876, 0.4273835123, -0.0174688399, -0.2047405243, 0.0689943209, 
0.1473540962, -0.3237197995, -0.3247002661, -0.0766783357, -0.1038025171, 0.0663045123, -0.1572972089, 0.0566512868, 0.2637952864, -0.4567133784, -0.2949299812, -0.3002608418, -0.066938892, 0.4277196527, 0.0272210501, 0.3608856797, 0.0342603996, -0.1682980806, -0.0919502601, 0.3646996617, -0.2676113546, 0.5217867494, -0.0273074023, -0.3972116113, -0.1356137842, -0.2784286141, 0.1692483723, 0.2122795284, -0.3540075123, 0.0106415674, 0.0537487306, -0.1274204701, -0.3462891877, 0.4716668129, 0.468101114, -0.1359972954, 0.1705083102, -0.7704422474, 0.3951099813, -0.2586807311, -0.1935913265, -0.0233424045, -0.2719199955, -0.2325712293, 0.3815222383, 0.0811238885, 0.603358686, -0.1063162163, 0.1100441441, 0.2331689447, 0.2517562509, 0.0767837614, 0.0193489939, -0.1155477762, -0.225033462, 0.1044473052, -0.0204280987, -0.0284995362, 0.1987363398, 0.4217339456, 0.0612300932, 0.0763242468, -0.2039116919, 0.4627034068, -0.1451128274, 0.1012344435, 0.0826516822, -0.2062902749, -0.1408655196, 0.0782317147, -0.0795416087, 0.2758697867, 0.0676954389, -0.1689649671, 0.002944693, -0.2168521732, 0.0900498629, -0.1180128157, -0.3544569314, 0.1029872224, -0.0903590173, -0.1665311605, 0.049425222, 0.4377089143, 0.0453950651, 0.0724744201, 0.0026920699, -0.0892433226, 0.3455891609, 0.072393015, -0.3652020097, 0.0003103316, 0.1250232458, 0.1094889566, -0.4165462554, -0.2229630053, 0.0246598013, -0.2354820073, -0.1175188422, -0.1140048504, -0.0075319745, -0.591217041, -0.5814695358, -0.2086255103, 0.1575357318, 0.0230599046, 0.0865535364, -0.1536920965, 0.0544936135, 0.4086313546, 0.0256567709, -0.2936863303, -0.241415754, 0.1097035408, 0.0934434906, -0.1074568778, 0.4937072992, -0.1585173011, 0.0675675794, -0.233401373, 0.1656764448, 0.029134158, -0.2786064744, 0.1651809216, 0.2246357054, -0.2610379457, -0.1285685748, 0.6697030067, 0.099218294, 0.3524081111, 0.0855976492, -0.3124064803, -0.3024981618, -0.1660951823, 0.1188770458, 0.3957430422, 0.0464489907, 0.165283531, 0.1416230798, 
-0.2059973329, -0.2483381033, 0.0333888754, 0.1151636913, 0.0171486028, 0.1608875841, 0.1793146431, 0.0733209848, 0.2779511809, 0.1558929086, -0.0744741112, -0.054942064, -0.1906138361, -0.2437295616, 0.142841056, -0.101410687, 0.207324177, -0.1902380735, -0.2917286158, 0.1587088406, -0.2286087275, 0.1630496234, 0.2692561746, -0.0584335811, 0.5097593069, -0.1074661165, 0.2579496205, -0.039613951, 0.3279567361, -0.1831431091, 0.0911630243, 0.2839573622, -0.106813699, -0.0950942636, 0.0866204947, -0.138792783, -0.2092009187, -0.2366227657, 0.3477989435, -0.058427982, -0.14881666, -0.0404731855, 0.2657979131, 0.1655934155, 0.4483645856, -0.0675903261, 0.0337995067, 0.1700199097, 0.201385662, -0.137362197, 0.0610731207, 0.248262018, 0.0578231066, 0.1259520501, 0.5288022757, 0.4163002372, -0.0609546229, -0.1263371706, -0.2910684943, 0.1425026357, 0.1937829107, 0.1487296224, 0.3334578872, 0.0122656673, -0.0583514944, 0.0439712405, 0.1607937813, 0.1520165652, 0.3418112397, 0.0292780921, -0.2028440088, 0.000846345, 0.1873262376, 0.0220549256, -0.6729874015, 0.3362361491, 0.1451907903, -0.2824925184, 0.398786217, 0.0277415514, 0.3040840924, -0.2217609584, -0.2226588875, -0.1894534081, 0.0515003093, -0.0737111345, -0.2347967029, -0.1169681549, -0.1961243153, -0.3086970448, -0.120080322, 0.0403751805, 0.1298108846, 0.0409613103, 0.067796886, -0.0571396239, -0.1646540314, 0.1864990592, -0.2584417462, 0.2858180106, 0.0312477648, 0.0454407483, -0.0801776499, 0.1768497825, 0.2635410726, -0.0922868699, -0.0588778071, 0.6529514194, -0.0708786845, -0.2127463818, -0.0827961937, 0.0492597297, -0.1479175687, -0.1888008714, -0.0592486486, 0.5324171782, 0.2739832997, 0.1294229925, -0.1248222515, 0.226760447, -0.2451611161, 0.2631815672, -0.6641113758, 0.3068289161, 0.246119678, 0.0778738633, -0.1524757147, 0.0674320608, -0.1703533083, -0.1933409125, 0.4078934193, 0.3887724876, 0.538659811, 0.0947165489, 0.0588995665, -0.2939715087, 0.2791739404, -0.2840396464, 0.086139977, -0.057496123, 
0.2992790043, -0.7319203019, -0.0987627059, 0.2255544513, -0.1211582422, 0.1216320246, 0.2576980293, 0.0417748876, 0.118810311, -0.1557261944, -0.0532571152, 0.3186097443, 0.0825062543, -0.130445987, -0.2680418789, -0.1397157907, 0.1819899678, 0.0978387147, -0.3734225929, -0.0446045436, 0.0930458978, 0.0255389065, -0.0383037329, -0.2225392312, 0.0113509074, 0.0509980135, 0.2269797325, 0.0419545323, 0.1949174106, -0.0109650046, 0.1166932881, -0.2026651502, -0.0816122591, -0.1552953273, 0.0546442531, -0.24560453, 0.5321160555, -0.3866824806, 0.5693586469, -0.4407523274, 0.1738170236, 0.2310235202, -0.4541362822, 0.0961409435, -0.0945512056, -0.1327077448, -0.1259728074, 0.0628547817, 0.3869293332, -0.1089812666, 0.1471090019, 0.1429864764, -0.3117980957, 0.3475706577, -0.0603824519, -0.2231801748, -0.0993921012, 0.2244296074, 0.1183506399, 0.3281213641, -0.3093108535, -0.037179336, 0.495208919, -0.1033606604, 0.2260008156, 0.1840551049, -0.3150680363, 0.0563165694, -0.1813355386, 0.142545566, 0.0503481925, 0.2422545999, -0.239048481, -0.477943033 ]
https://github.com/huggingface/datasets/issues/623
Custom feature types in `load_dataset` from CSV
A PR is open for the `ValueError: Target schema's field names are not matching the table's field names` error. I'm adding the `features` parameter to the csv script.
I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same regardless of the value of `features`.

I am working with the local files from the emotion dataset. To get the data you can use the following code:

```Python
from pathlib import Path
import wget

EMOTION_PATH = Path("./data/emotion")
DOWNLOAD_URLS = [
    "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1",
    "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1",
    "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1",
]

if not Path.is_dir(EMOTION_PATH):
    Path.mkdir(EMOTION_PATH)
for url in DOWNLOAD_URLS:
    wget.download(url, str(EMOTION_PATH))
```

The first five lines of the train set are:

```
i didnt feel humiliated;sadness
i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness
im grabbing a minute to post i feel greedy wrong;anger
i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love
i am feeling grouchy;anger
```

Here is the code to reproduce the issue:

```Python
from datasets import Features, Value, ClassLabel, load_dataset

class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]
emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)})
file_dict = {'train': EMOTION_PATH/'train.txt'}

dataset = load_dataset('csv', data_files=file_dict, delimiter=';',
                       column_names=['text', 'label'], features=emotion_features)
```

**Observed behaviour:**

```Python
dataset['train'].features
```

```Python
{'text': Value(dtype='string', id=None),
 'label': Value(dtype='string', id=None)}
```

**Expected behaviour:**

```Python
dataset['train'].features
```

```Python
{'text': Value(dtype='string', id=None),
 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)}
```

**Things I've tried:**

- deleting the cache
- trying other types such as `int64`

Am I missing anything? Thanks for any pointer in the right direction.
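Until the csv loader honors the `features` argument, one workaround is to encode the labels manually after loading. The snippet below is a minimal pure-Python sketch of the string-to-index mapping that `ClassLabel(names=...)` performs; it deliberately avoids depending on the `datasets` library, and the class names are the ones from the issue. In practice you would apply the same mapping with `dataset.map(...)` and then cast the column to the intended `ClassLabel` feature.

```python
# Minimal sketch of the label encoding that ClassLabel(names=...) performs.
# Plain Python on purpose: in practice you would run this mapping through
# dataset.map(...) and then cast the column to the ClassLabel feature.
class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]
str2int = {name: idx for idx, name in enumerate(class_names)}

def encode_label(example):
    """Replace the string label with its ClassLabel-style integer index."""
    return {"text": example["text"], "label": str2int[example["label"]]}

example = {"text": "i didnt feel humiliated", "label": "sadness"}
print(encode_label(example))  # {'text': 'i didnt feel humiliated', 'label': 0}
```

This reproduces the expected `label` values (`sadness -> 0`, ..., `surprise -> 5`) even while the loader still returns the column as plain strings.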
25
Custom feature types in `load_dataset` from CSV

I am trying to load a local file with the `load_dataset` function and I want to predefine the feature types with the `features` argument. However, the types are always the same regardless of the value of `features`.

I am working with the local files from the emotion dataset. To get the data you can use the following code:

```Python
from pathlib import Path
import wget

EMOTION_PATH = Path("./data/emotion")
DOWNLOAD_URLS = [
    "https://www.dropbox.com/s/1pzkadrvffbqw6o/train.txt?dl=1",
    "https://www.dropbox.com/s/2mzialpsgf9k5l3/val.txt?dl=1",
    "https://www.dropbox.com/s/ikkqxfdbdec3fuj/test.txt?dl=1",
]

if not Path.is_dir(EMOTION_PATH):
    Path.mkdir(EMOTION_PATH)
for url in DOWNLOAD_URLS:
    wget.download(url, str(EMOTION_PATH))
```

The first five lines of the train set are:

```
i didnt feel humiliated;sadness
i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake;sadness
im grabbing a minute to post i feel greedy wrong;anger
i am ever feeling nostalgic about the fireplace i will know that it is still on the property;love
i am feeling grouchy;anger
```

Here is the code to reproduce the issue:

```Python
from datasets import Features, Value, ClassLabel, load_dataset

class_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]
emotion_features = Features({'text': Value('string'), 'label': ClassLabel(names=class_names)})
file_dict = {'train': EMOTION_PATH/'train.txt'}

dataset = load_dataset('csv', data_files=file_dict, delimiter=';',
                       column_names=['text', 'label'], features=emotion_features)
```

**Observed behaviour:**

```Python
dataset['train'].features
```

```Python
{'text': Value(dtype='string', id=None),
 'label': Value(dtype='string', id=None)}
```

**Expected behaviour:**

```Python
dataset['train'].features
```

```Python
{'text': Value(dtype='string', id=None),
 'label': ClassLabel(num_classes=6, names=['sadness', 'joy', 'love', 'anger', 'fear', 'surprise'], names_file=None, id=None)}
```

**Things I've tried:**

- deleting the cache
- trying other types such as `int64`

Am I missing anything? Thanks for any pointer in the right direction.

PR is open for the `ValueError: Target schema's field names are not matching the table's field names` error. I'm adding the `features` parameter to the csv script.
[ 0.0802029371, -0.2782895565, -0.0531791002, 0.3509228528, 0.3172290921, -0.1943103671, 0.5701336265, 0.1113843173, 0.446125567, 0.0253307968, 0.094743818, 0.3161779642, -0.0919102058, 0.3901149631, -0.0581879392, 0.0267207623, -0.1612750292, 0.3348048329, -0.0091219395, -0.3497941494, -0.2713248134, 0.1799537241, -0.0930470228, -0.013486281, -0.3072173595, 0.3395892084, 0.2221690118, 0.143438682, 0.0583121479, -0.3765773177, 0.4237108827, 0.1486945301, 0.3030694425, 0.0801386833, -0.0001151109, 0.076882951, 0.0293593034, -0.1740052551, -0.0040853135, -0.3799196482, -0.0387843475, -0.202906549, 0.4247084856, -0.3651529551, -0.2385733575, -0.4564034939, -0.2810540497, -0.1292716712, 0.2869226635, 0.3292907178, 0.1512023211, -0.0502676442, -0.2750738859, 0.2184948921, 0.3587994277, 0.6755331755, -0.2523880601, 0.1870210171, -0.0406346805, -0.1123656034, 0.0939736664, -0.1022861525, -0.1438640356, 0.353423357, 0.4926100373, 0.1710617542, -0.0643914789, -0.1303499043, -0.1733283252, 0.2814249694, 0.3474104702, 0.0562938415, -0.0464358665, -0.3231964707, -0.1238626838, -0.2973287404, 0.382417351, 0.0848604888, -0.1818488538, 0.1604423374, -0.0789410546, 0.4355747104, -0.0581504069, 0.325290978, 0.0186878741, -0.0715494901, -0.2392115891, 0.1055341363, 0.0308264177, -0.2173109949, 0.0974365473, -0.349162221, 0.2218383104, 0.1172011793, -0.0551007129, 0.0995839685, 0.003935298, -0.0604649857, -0.1210798174, 0.0352323502, 0.2292757034, 0.1932828128, -0.1932462752, 0.0317142308, 0.2777897716, 0.2441823781, 0.2002111226, -0.2317834347, 0.1527379453, 0.104715839, -0.423279196, 0.0183334798, 0.0277890079, -0.3080183268, 0.3964600563, 0.0869552791, 0.5022854805, -0.2398283482, 0.057662189, -0.0980123505, 0.0428094119, -0.0321919359, 0.1175668389, 0.1835573018, -0.0262655485, 0.580987215, -0.0656144395, 0.1488876641, -0.3211800456, 0.1283437163, -0.0581272393, -0.2226472348, -0.0374680161, -0.0493203774, 0.4030345678, 0.0941050053, 0.2478335947, 0.199026376, -0.0237025656, 
-0.3186685145, -0.1270062923, -0.095956482, 0.1245452017, 0.0443357415, -0.441672802, 0.3375269473, 0.2412355989, -0.2255857289, -0.2444109023, 0.0952860862, -0.2509181201, -0.2328005731, 0.1970371008, 0.155583322, -0.1784608662, -0.020963341, -0.1068923622, 0.0988962054, 0.1280108392, 0.258272469, 0.0263721272, -0.4864162505, -0.3809948564, -0.3249777257, -0.0254870169, 0.395431757, -0.4154388011, -0.2039044201, 0.0394422971, -0.0463084355, 0.1385591328, 0.2530902922, -0.3661009073, 0.1451618969, -0.2393054813, 0.2845746875, 0.5265449286, 0.0433983952, -0.2177204043, 0.4294439852, 0.1565900892, 0.3267546892, 0.186067909, 0.0742609054, 0.0775938481, 0.1260918379, 0.1594995856, 0.4206011593, 0.1657207608, 0.067381829, -0.1837767363, -0.1553835273, 0.1656042188, -0.0030713007, -0.3288176954, 0.3090257049, 0.2257002145, -0.5133963823, 0.2171115279, -0.1282680333, -0.2264052331, -0.0468893833, 0.4030795097, 0.5202860236, 0.0057925545, -0.0192435011, -0.4824975431, 0.2246333212, -0.1493486017, -0.0522528514, -0.0067957081, -0.3252949715, -0.4934935868, -0.0567670278, -0.2471701205, 0.1995168179, 0.0646642521, 0.3136295676, -0.3358725309, 0.158852458, -0.0870927498, 0.0709818974, -0.2088879943, -0.2521633804, -0.0801552907, -0.0217761993, 0.1720780581, -0.0900507942, 0.0157924816, 0.0775025189, 0.2781896293, 0.0767479241, -0.3883282542, 0.1277149916, 0.2795607746, 0.161347121, -0.1129958481, 0.2116078734, -0.003028132, -0.0095146522, -0.0278554689, 0.1732269526, 0.1935627162, -0.0937050357, -0.0290848613, 0.6565464735, 0.1539309025, 0.248978585, -0.267847687, -0.1128531545, 0.2808354199, -0.0006699339, -0.1841064841, -0.0670059621, -0.2146350145, -0.0485210344, 0.0014962032, 0.4216480851, -0.3299654424, -0.1157782227, 0.4788911045, 0.0954587162, 0.1291624606, -0.080403775, -0.0172786154, -0.029092107, 0.0167229306, -0.2370506525, 0.4729243517, -0.0610825047, -0.0803651214, -0.008282477, 0.0192122981, -0.2243368626, 0.0599996336, -0.1394981444, -0.1910201311, 
0.2915715873, 0.0343267135, -0.1579158157, -0.3376893699, 0.2412527055, -0.1466074288, -0.2058868557, -0.6342836618, -0.0990627632, -0.5823050737, 0.1683030576, -0.4637621641, -0.0822290182, -0.0697951391, 0.0163359642, -0.2120193094, 0.1033788174, -0.0353965759, 0.0900433883, -0.1784473062, 0.3625626564, 0.0643080026, -0.8123733997, 0.2222117037, 0.1188195944, -0.4499817193, -0.0581740029, 0.1749914736, 0.2886118591, 0.035560213, 0.0289640799, -0.2870357931, 0.0080760773, -0.013293393, -0.1399184018, 0.1707452089, 0.5402771831, 0.1732706428, 0.1756926477, 0.1394028962, -0.1260670424, 0.398373872, -0.013117075, 0.0591222718, -0.125023067, -0.0263379999, 0.1114315242, -0.2428704202, -0.6793350577, -0.1501974165, -0.2003139257, 0.2844244242, 0.2139618099, 0.0511261038, 0.1971264333, 0.3683050275, -0.2396503687, 0.154696852, -0.0263568908, -0.2073857188, -0.0725122914, 0.4438351989, -0.1748046577, -0.2186149955, 0.0093016699, -0.3093701601, -0.3122235239, 0.1340721399, -0.311917603, 0.0612176731, -0.3492911756, 0.5237388611, 0.0541131832, 0.0240740459, 0.1082263291, -0.1130075902, 0.0396240093, -0.1758636534, -0.3381878734, 0.4521702528, 0.3127171993, 0.0991491079, 0.4245264828, -0.0252545495, -0.4129968882, 0.4985539615, -0.2855617702, -0.1919133514, 0.5953598619, -0.1800899059, 0.1787190437, -0.0728683174, -0.2670247853, -0.1126029119, -0.0425648987, -0.1116564125, 0.2236914933, -0.0085569602, -0.2700244188, -0.1810297817, 0.2123802006, -0.217617467, -0.2608530819, 0.2379056811, 0.1473992616, -0.0105134472, -0.2122009695, -0.0562984794, -0.3047746718, -0.0011162423, -0.0551236011, 0.4722041786, -0.1199495792, -0.0018577576, -0.3208261132, 0.2846589684, 0.0782866552, 0.1430095732, 0.2052050978, 0.0887331888, 0.051434949, -0.2681925297, -0.0241439305, 0.1990233064, 0.2873642147, 0.0269077923, 0.1157548428, 0.2087188661, -0.1225695312, -0.0485270135, -0.3683555722, 0.1046849564, -0.0980246589, -0.1456140876, 0.4273835123, -0.0174688399, -0.2047405243, 0.0689943209, 
0.1473540962, -0.3237197995, -0.3247002661, -0.0766783357, -0.1038025171, 0.0663045123, -0.1572972089, 0.0566512868, 0.2637952864, -0.4567133784, -0.2949299812, -0.3002608418, -0.066938892, 0.4277196527, 0.0272210501, 0.3608856797, 0.0342603996, -0.1682980806, -0.0919502601, 0.3646996617, -0.2676113546, 0.5217867494, -0.0273074023, -0.3972116113, -0.1356137842, -0.2784286141, 0.1692483723, 0.2122795284, -0.3540075123, 0.0106415674, 0.0537487306, -0.1274204701, -0.3462891877, 0.4716668129, 0.468101114, -0.1359972954, 0.1705083102, -0.7704422474, 0.3951099813, -0.2586807311, -0.1935913265, -0.0233424045, -0.2719199955, -0.2325712293, 0.3815222383, 0.0811238885, 0.603358686, -0.1063162163, 0.1100441441, 0.2331689447, 0.2517562509, 0.0767837614, 0.0193489939, -0.1155477762, -0.225033462, 0.1044473052, -0.0204280987, -0.0284995362, 0.1987363398, 0.4217339456, 0.0612300932, 0.0763242468, -0.2039116919, 0.4627034068, -0.1451128274, 0.1012344435, 0.0826516822, -0.2062902749, -0.1408655196, 0.0782317147, -0.0795416087, 0.2758697867, 0.0676954389, -0.1689649671, 0.002944693, -0.2168521732, 0.0900498629, -0.1180128157, -0.3544569314, 0.1029872224, -0.0903590173, -0.1665311605, 0.049425222, 0.4377089143, 0.0453950651, 0.0724744201, 0.0026920699, -0.0892433226, 0.3455891609, 0.072393015, -0.3652020097, 0.0003103316, 0.1250232458, 0.1094889566, -0.4165462554, -0.2229630053, 0.0246598013, -0.2354820073, -0.1175188422, -0.1140048504, -0.0075319745, -0.591217041, -0.5814695358, -0.2086255103, 0.1575357318, 0.0230599046, 0.0865535364, -0.1536920965, 0.0544936135, 0.4086313546, 0.0256567709, -0.2936863303, -0.241415754, 0.1097035408, 0.0934434906, -0.1074568778, 0.4937072992, -0.1585173011, 0.0675675794, -0.233401373, 0.1656764448, 0.029134158, -0.2786064744, 0.1651809216, 0.2246357054, -0.2610379457, -0.1285685748, 0.6697030067, 0.099218294, 0.3524081111, 0.0855976492, -0.3124064803, -0.3024981618, -0.1660951823, 0.1188770458, 0.3957430422, 0.0464489907, 0.165283531, 0.1416230798, 
-0.2059973329, -0.2483381033, 0.0333888754, 0.1151636913, 0.0171486028, 0.1608875841, 0.1793146431, 0.0733209848, 0.2779511809, 0.1558929086, -0.0744741112, -0.054942064, -0.1906138361, -0.2437295616, 0.142841056, -0.101410687, 0.207324177, -0.1902380735, -0.2917286158, 0.1587088406, -0.2286087275, 0.1630496234, 0.2692561746, -0.0584335811, 0.5097593069, -0.1074661165, 0.2579496205, -0.039613951, 0.3279567361, -0.1831431091, 0.0911630243, 0.2839573622, -0.106813699, -0.0950942636, 0.0866204947, -0.138792783, -0.2092009187, -0.2366227657, 0.3477989435, -0.058427982, -0.14881666, -0.0404731855, 0.2657979131, 0.1655934155, 0.4483645856, -0.0675903261, 0.0337995067, 0.1700199097, 0.201385662, -0.137362197, 0.0610731207, 0.248262018, 0.0578231066, 0.1259520501, 0.5288022757, 0.4163002372, -0.0609546229, -0.1263371706, -0.2910684943, 0.1425026357, 0.1937829107, 0.1487296224, 0.3334578872, 0.0122656673, -0.0583514944, 0.0439712405, 0.1607937813, 0.1520165652, 0.3418112397, 0.0292780921, -0.2028440088, 0.000846345, 0.1873262376, 0.0220549256, -0.6729874015, 0.3362361491, 0.1451907903, -0.2824925184, 0.398786217, 0.0277415514, 0.3040840924, -0.2217609584, -0.2226588875, -0.1894534081, 0.0515003093, -0.0737111345, -0.2347967029, -0.1169681549, -0.1961243153, -0.3086970448, -0.120080322, 0.0403751805, 0.1298108846, 0.0409613103, 0.067796886, -0.0571396239, -0.1646540314, 0.1864990592, -0.2584417462, 0.2858180106, 0.0312477648, 0.0454407483, -0.0801776499, 0.1768497825, 0.2635410726, -0.0922868699, -0.0588778071, 0.6529514194, -0.0708786845, -0.2127463818, -0.0827961937, 0.0492597297, -0.1479175687, -0.1888008714, -0.0592486486, 0.5324171782, 0.2739832997, 0.1294229925, -0.1248222515, 0.226760447, -0.2451611161, 0.2631815672, -0.6641113758, 0.3068289161, 0.246119678, 0.0778738633, -0.1524757147, 0.0674320608, -0.1703533083, -0.1933409125, 0.4078934193, 0.3887724876, 0.538659811, 0.0947165489, 0.0588995665, -0.2939715087, 0.2791739404, -0.2840396464, 0.086139977, -0.057496123, 
0.2992790043, -0.7319203019, -0.0987627059, 0.2255544513, -0.1211582422, 0.1216320246, 0.2576980293, 0.0417748876, 0.118810311, -0.1557261944, -0.0532571152, 0.3186097443, 0.0825062543, -0.130445987, -0.2680418789, -0.1397157907, 0.1819899678, 0.0978387147, -0.3734225929, -0.0446045436, 0.0930458978, 0.0255389065, -0.0383037329, -0.2225392312, 0.0113509074, 0.0509980135, 0.2269797325, 0.0419545323, 0.1949174106, -0.0109650046, 0.1166932881, -0.2026651502, -0.0816122591, -0.1552953273, 0.0546442531, -0.24560453, 0.5321160555, -0.3866824806, 0.5693586469, -0.4407523274, 0.1738170236, 0.2310235202, -0.4541362822, 0.0961409435, -0.0945512056, -0.1327077448, -0.1259728074, 0.0628547817, 0.3869293332, -0.1089812666, 0.1471090019, 0.1429864764, -0.3117980957, 0.3475706577, -0.0603824519, -0.2231801748, -0.0993921012, 0.2244296074, 0.1183506399, 0.3281213641, -0.3093108535, -0.037179336, 0.495208919, -0.1033606604, 0.2260008156, 0.1840551049, -0.3150680363, 0.0563165694, -0.1813355386, 0.142545566, 0.0503481925, 0.2422545999, -0.239048481, -0.477943033 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
@thomwolf Sure. I'll try downgrading to 3.7 now even though Arrow says they support >=3.5.

Linux (Ubuntu 18.04) - Python 3.8
=================================

```
Package                Version
---------------------- ---------
certifi                2020.6.20
chardet                3.0.4
click                  7.1.2
datasets               1.0.1
dill                   0.3.2
fasttext               0.9.2
filelock               3.0.12
future                 0.18.2
idna                   2.10
joblib                 0.16.0
nltk                   3.5
numpy                  1.19.1
packaging              20.4
pandas                 1.1.2
pip                    20.0.2
protobuf               3.13.0
pyarrow                1.0.1
pybind11               2.5.0
pyparsing              2.4.7
python-dateutil        2.8.1
pytz                   2020.1
regex                  2020.7.14
requests               2.24.0
sacremoses             0.0.43
scikit-learn           0.23.2
scipy                  1.5.2
sentence-transformers  0.3.6
sentencepiece          0.1.91
setuptools             46.1.3
six                    1.15.0
stanza                 1.1.1
threadpoolctl          2.1.0
tokenizers             0.8.1rc2
torch                  1.6.0+cu101
tqdm                   4.48.2
transformers           3.1.0
urllib3                1.25.10
wheel                  0.34.2
xxhash                 2.0.0
```

Windows 10 - Python 3.8
=======================

```
Package                Version
---------------------- ---------
certifi                2020.6.20
chardet                3.0.4
click                  7.1.2
datasets               1.0.1
dill                   0.3.2
fasttext               0.9.2
filelock               3.0.12
future                 0.18.2
idna                   2.10
joblib                 0.16.0
nlp                    0.4.0
nltk                   3.5
numpy                  1.19.1
packaging              20.4
pandas                 1.1.1
pip                    20.0.2
protobuf               3.13.0
pyarrow                1.0.1
pybind11               2.5.0
pyparsing              2.4.7
python-dateutil        2.8.1
pytz                   2020.1
regex                  2020.7.14
requests               2.24.0
sacremoses             0.0.43
scikit-learn           0.23.2
scipy                  1.5.2
sentence-transformers  0.3.5.1
sentencepiece          0.1.91
setuptools             46.1.3
six                    1.15.0
stanza                 1.1.1
threadpoolctl          2.1.0
tokenizers             0.8.1rc1
torch                  1.6.0+cu101
tqdm                   4.48.2
transformers           3.0.2
urllib3                1.25.10
wheel                  0.34.2
xxhash                 2.0.0
```
Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```
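As a stopgap until the text loader stops routing plain-text files through `pyarrow`'s CSV reader, one workaround is to read the file yourself and build the column dict that `datasets.Dataset.from_dict(...)` accepts. The sketch below is pure Python (the `from_dict` call is shown only as a comment, since the loader behavior is exactly what this issue is about); the temp-file demonstration stands in for `data.txt`.

```python
import os
import tempfile

def read_text_file(path):
    """Read a plain-text file into the one-column dict format that
    datasets.Dataset.from_dict(...) accepts, one example per line."""
    with open(path, encoding="utf-8") as f:
        lines = [line.rstrip("\n") for line in f]
    return {"text": lines}

# Tiny demonstration with a temporary file standing in for data.txt.
tmp = tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False, encoding="utf-8")
tmp.write("first line\nsecond line\n")
tmp.close()
columns = read_text_file(tmp.name)
os.unlink(tmp.name)

print(columns)  # {'text': ['first line', 'second line']}
# In practice: dataset = datasets.Dataset.from_dict(columns)
```

Because semicolons, commas, and quotes are never interpreted, this sidesteps the `CSV parse error: Expected 1 columns, got 2` crash entirely.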
194
load_dataset for text files not working

Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```

@thomwolf Sure. I'll try downgrading to 3.7 now even though Arrow says they support >=3.5.

Linux (Ubuntu 18.04) - Python 3.8
=================================

```
Package                Version
---------------------- ---------
certifi                2020.6.20
chardet                3.0.4
click                  7.1.2
datasets               1.0.1
dill                   0.3.2
fasttext               0.9.2
filelock               3.0.12
future                 0.18.2
idna                   2.10
joblib                 0.16.0
nltk                   3.5
numpy                  1.19.1
packaging              20.4
pandas                 1.1.2
pip                    20.0.2
protobuf               3.13.0
pyarrow                1.0.1
pybind11               2.5.0
pyparsing              2.4.7
python-dateutil        2.8.1
pytz                   2020.1
regex                  2020.7.14
requests               2.24.0
sacremoses             0.0.43
scikit-learn           0.23.2
scipy                  1.5.2
sentence-transformers  0.3.6
sentencepiece          0.1.91
setuptools             46.1.3
six                    1.15.0
stanza                 1.1.1
threadpoolctl          2.1.0
tokenizers             0.8.1rc2
torch                  1.6.0+cu101
tqdm                   4.48.2
transformers           3.1.0
urllib3                1.25.10
wheel                  0.34.2
xxhash                 2.0.0
```

Windows 10 - Python 3.8
=======================

```
Package                Version
---------------------- ---------
certifi                2020.6.20
chardet                3.0.4
click                  7.1.2
datasets               1.0.1
dill                   0.3.2
fasttext               0.9.2
filelock               3.0.12
future                 0.18.2
idna                   2.10
joblib                 0.16.0
nlp                    0.4.0
nltk                   3.5
numpy                  1.19.1
packaging              20.4
pandas                 1.1.1
pip                    20.0.2
protobuf               3.13.0
pyarrow                1.0.1
pybind11               2.5.0
pyparsing              2.4.7
python-dateutil        2.8.1
pytz                   2020.1
regex                  2020.7.14
requests               2.24.0
sacremoses             0.0.43
scikit-learn           0.23.2
scipy                  1.5.2
sentence-transformers  0.3.5.1
sentencepiece          0.1.91
setuptools             46.1.3
six                    1.15.0
stanza                 1.1.1
threadpoolctl          2.1.0
tokenizers             0.8.1rc1
torch                  1.6.0+cu101
tqdm                   4.48.2
transformers           3.0.2
urllib3                1.25.10
wheel                  0.34.2
xxhash                 2.0.0
```
https://github.com/huggingface/datasets/issues/622
Downgrading to 3.7 does not help. Here is a dummy text file:

```text
Verzekering weigert vaker te betalen
Bedrijven van verzekeringen erkennen steeds minder arbeidsongevallen .
In 2012 weigerden de bedrijven te betalen voor 21.055 ongevallen op het werk .
Dat is 11,8 % van alle ongevallen op het werk .
Nog nooit weigerden verzekeraars zoveel zaken .
In 2012 hadden 135.118 mensen een ongeval op het werk .
Dat zijn elke werkdag 530 mensen .
Bij die ongevallen stierven 67 mensen .
Bijna 12.000 hebben een handicap na het ongeval .
Geen echt arbeidsongeval Bedrijven moeten een verzekering hebben voor hun werknemers .
```

A temporary workaround for the "text" type is:

```python
dataset = Dataset.from_dict({"text": Path(dataset_f).read_text().splitlines()})
```
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
@banunitte Please do not post screenshots in the future but copy-paste your code and the errors. That allows others to copy-and-paste your code and test it. You may also want to provide the Python version that you are using.
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
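Since the "text" loader delegates to pyarrow's CSV reader in the stack trace above, one way to debug the `Expected 1 columns, got 2` error is to check the input file for bytes that a CSV parser could misinterpret (carriage returns, stray delimiters). This is only a diagnostic sketch, not part of the library; the file name follows the snippet above.

```python
# Diagnostic sketch: count characters in a text file that a CSV reader
# (used internally by the "text" loader) might misinterpret.
from collections import Counter

def inspect_text_file(path: str) -> dict:
    data = open(path, "rb").read()
    counts = Counter(data)  # keys are byte values (ints)
    return {
        "crlf_endings": data.count(b"\r\n"),
        # carriage returns not followed by a newline
        "bare_cr": counts[ord("\r")] - data.count(b"\r\n"),
        "commas": counts[ord(",")],
        "tabs": counts[ord("\t")],
    }
```

A nonzero `crlf_endings` or `commas` count would point at line-ending or delimiter confusion as the likely cause of the parse error.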
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
I have the same problem on Linux: the script crashes with a CSV error. This may be caused by CRLF line endings; after converting CRLF to LF, the problem was solved.
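Building on the CRLF observation, a minimal workaround sketch is to normalize Windows line endings to LF before calling `load_dataset` (assuming, as the comment suggests, that carriage returns are what trips up the parser):

```python
from pathlib import Path

def normalize_line_endings(path: str) -> None:
    """Rewrite a text file in place, converting CRLF line endings to LF."""
    p = Path(path)
    p.write_bytes(p.read_bytes().replace(b"\r\n", b"\n"))

# Example usage (assumes "data.txt" exists, as in the snippet above):
# normalize_line_endings("data.txt")
# dataset = load_dataset("text", data_files="data.txt")
```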
Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps: [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for `data_files`, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```
I have the same problem on Linux: the script crashes with a CSV error. This may be caused by CRLF line endings; after converting CRLF to LF, the problem was solved.
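A minimal stdlib sketch of the CRLF-to-LF conversion described above, assuming the crash really does come from Windows line endings. The helper name `normalize_line_endings` is illustrative, not a `datasets` API; `data.txt` is the example file from the issue.

```python
# Rewrite a text file with Unix (LF) line endings before passing it to
# load_dataset. Works on bytes so the file's encoding is left untouched.
from pathlib import Path

def normalize_line_endings(path: str) -> None:
    raw = Path(path).read_bytes()
    Path(path).write_bytes(raw.replace(b"\r\n", b"\n"))

# normalize_line_endings("data.txt")
# dataset = load_dataset("text", data_files="data.txt")
```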
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
I pushed a fix for `pyarrow.lib.ArrowInvalid: CSV parse error`. Let me know if you still have this issue. Not sure about the Windows one yet.
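The parse error being fixed here can be reproduced in miniature with the stdlib `csv` module (used as a stand-in for pyarrow's reader): any "plain text" line that happens to contain the delimiter gets split into extra columns, which is exactly the "Expected 1 columns, got 2" failure in the trace above.

```python
# Illustration of why parsing free-form text with a CSV reader breaks:
# a comma inside a sentence is treated as a column separator.
import csv
import io

lines = ["first sentence", "hello, world"]  # second line contains a comma
rows = list(csv.reader(io.StringIO("\n".join(lines))))

print(len(rows[0]))  # 1 column
print(len(rows[1]))  # 2 columns: the comma split the line
```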
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
25
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` I pushed a fix for 
`pyarrow.lib.ArrowInvalid: CSV parse error`. Let me know if you still have this issue. Not sure about the windows one yet
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
To complete what @lhoestq is saying, I think that to use the new version of the `text` processing script (which is on master right now) you need to either specify the version of the script to be the `master` one or to install the lib from source (in which case it uses the `master` version of the script by default): ```python dataset = load_dataset('text', script_version='master', data_files=XXX) ``` We do versioning by default, i.e. your version of the dataset lib will use the script with the same version by default (i.e. only the `1.0.1` version of the script if you have the PyPI version `1.0.1` of the lib).
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
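A plausible reading of that traceback: the 1.0.1 `text` script parses the file with pyarrow's CSV reader (`pac.read_csv` in `text.py`), so any line that happens to contain the delimiter gets split into extra columns, producing `Expected 1 columns, got 2`. A minimal sketch of the same effect with Python's stdlib `csv` module (assumption: a default `','` delimiter, as a CSV reader would use):

```python
import csv
import io

# A plain-text line containing a comma is split into two fields by a
# CSV reader that expects a single column -- the same column-count
# mismatch that "Expected 1 columns, got 2" reports.
sample = "hello, world\n"
rows = list(csv.reader(io.StringIO(sample)))
assert rows == [["hello", " world"]]  # one text line became two columns
```

This would explain why the error only shows up for lines containing the delimiter character, and presumably why the reworked `text` script on master handles plain text differently.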
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
![image](https://user-images.githubusercontent.com/36957508/93300760-fa9a8680-f829-11ea-9105-7a6f67ad8373.png) win10, py3.6 ``` from datasets import Features, Value, ClassLabel, load_dataset features = Features({'text': Value('string'), 'ctext': Value('string')}) file_dict = {'train': PATH/'summary.csv'} dataset = load_dataset('csv', data_files=file_dict, script_version='master', delimiter='\t', column_names=['text', 'ctext'], features=features) ```
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
31
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` 
![image](https://user-images.githubusercontent.com/36957508/93300760-fa9a8680-f829-11ea-9105-7a6f67ad8373.png) win10, py3.6 ``` from datasets import Features, Value, ClassLabel, load_dataset features = Features({'text': Value('string'), 'ctext': Value('string')}) file_dict = {'train': PATH/'summary.csv'} dataset = load_dataset('csv', data_files=file_dict, script_version='master', delimiter='\t', column_names=['text', 'ctext'], features=features) ```
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
0.0960121155, -0.199883312, 0.008325614, -0.2600297034, -0.0678928196, -0.101543121, 0.0715264753, -0.4127509296, 0.2223608941, -0.3261347413, -0.1620814502, -0.3533615768, 0.1199703068, -0.2934476733, -0.035252884, -0.2882405818, 0.2256353348, 0.1631825864, 0.0368442088, -0.025598444, 0.0039332658, 0.1004286557, -0.5494058132, -0.1985673308, 0.0081633367, -0.2308937907, -0.0934240445, 0.38237679, 0.0441116206, 0.23747769, -0.3961959183, -0.1094296128, -0.0985935256, -0.1599083692, 0.0557672083, -0.0614239685, 0.1960320026, 0.053164836, 0.228019923, -0.1237517744, -0.1310474277, 0.4299978018, -0.0381336957, -0.18856287, 0.2453442812, 0.4184463024, -0.3097181916, -0.2610919476, -0.3623776436, -0.1279266924, -0.284201175, 0.3581484556, 0.1559178829, -0.0295689367, 0.5024069548, 0.4287554324, 0.1569625586, -0.1803516448, 0.2165790796, 0.0652968585, -0.1683053374, 0.47165066, -0.1643847227, -0.6319024563, 0.0775604397, 0.4496119022, -0.2843748033, 0.217541486, -0.4422394633, -0.0620806888, -0.1192547604, 0.0920901299, 0.0410849266, 0.3046549261, 0.1864622831, 0.0804126561, 0.0956785232, -0.0544944741, -0.321420908, 0.115875572, -0.2110432386, 0.03335388, 0.2350959778, 0.3713712692, -0.2012349069, 0.4552717507, 0.3391943276, -0.066408962, 0.2473823726, -0.5043381453, 0.5094946623, -0.1640395224, -0.5347825289, 0.056173455, -0.2177342772, 0.1750390977, 0.2766557932, 0.1536438018, 0.3985486627, -0.3371795714, 0.1042309403, -0.0745856762, -0.199773401, 0.2598343492, -0.1172346026, 0.0794806927, -0.1508024633, 0.1711595953, 0.1823589653, -0.2328865081, 0.0390384793, 0.606995523, -0.1873314828, 0.0515494347, -0.3987099826, 0.0692128688, -0.3697609305, 0.3847952783, -0.0440683998, 0.3335776627, -0.2729939818, -0.1051140726, 0.0177629218, -0.037273027, 0.554443419, 0.3075833023, -0.19524014, 0.0176450834, -0.2582425773, -0.3879302442, 0.097062692, -0.2831111848, 0.3912248611, 0.1797812581, 0.6442825794, -0.3002550602, -0.2145058662, -0.0911265686, 0.4764971733, -0.0228323266, 
-0.3114878237, -0.2785248458, -0.163437292, -0.3262645006, -0.2423543632, 0.0241881162, 0.2216681987, -0.1815150678, 0.2211253643, -0.0578096956, -0.1648265421, 0.2988139391, -0.0309883505, 0.2692630291, -0.3283639848, 0.2320740819, 0.1062323302, 0.4693133831, 0.2969626188, 0.55152601, -0.0897570252, -0.4601888955, 0.0208050273, -0.1581924856, 0.3489819169, 0.1660989821, -0.271594137, 0.1426502466, 0.2977322638, 0.1101863533, -0.4385164678, -0.0669692159, 0.5130982399, 0.211311698, -0.2971881032, -0.4682970941, 0.3646911383, 0.0281750113, 0.1499666274, 0.2716008425, 0.1743868887, -0.2929803133, 0.0181154422, -0.3032900691, 0.8145563602, -0.1351298988, 0.4673897028, 0.2345220447, -0.0871992409, 0.3978255093, -0.0533246994, 0.0656357408, -0.2903504074, -0.0147632752, -0.0129347928, -0.1748354733, 0.443472445, 0.1701349616, -0.4242364466, 0.2192279249, -0.01338467, 0.3101128638, -0.2710384429, 0.1851523519, -0.4233769774, -0.3067673445, -0.3461345732, 0.0314103439, 0.1470376998, 0.3159359396, -0.019304961, 0.0686779022, -0.1788803488, -0.2796398401, -0.3666662574, 0.0574039519, -0.3217287362, 0.0584379695, 0.2108013183, -0.3453938067, 0.3396155238, 0.4470130503, 0.2227250189, 0.1258746088, -0.1699059904, -0.0880351067, -0.2285287231, -0.1075693071, 0.0017924495, -0.2334612906, 0.0256202724, 0.0866710544, -0.2706850469, -0.03422831, -0.0761411265, -0.1780478954, -0.0112771243, 0.1302926242, -0.3514167964, -0.4001739323, -0.4443775117, -0.1643959582, 0.1622947305, -0.062214911, 0.0125064962, 0.1251551211, 0.1535156816, 0.0424737744, 0.0863073766, -0.1513923109, -0.1063952073, 0.3808279634, -0.432557404, -0.2435593307, 0.6115915775, 0.4478025734, -0.1363860667, -0.2047024667, 0.3153441548, -0.0241038725, -0.4706379771, 0.0600290485, 0.3476566076, 0.0469819009, 0.0210847408, 0.275324285, 0.0409547463, -0.1721464247, 0.1079839766, -0.5703998804, -0.343737036, 0.1958447993, 0.2991571128, 0.1324545741, 0.2247096002, -0.1516831815, -0.0144614764, -0.2124667764, -0.1799510717, 
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
```python Traceback (most recent call last): File "main.py", line 281, in <module> main() File "main.py", line 190, in main train_data, test_data = data_factory( File "main.py", line 129, in data_factory train_data = load_dataset('text', File "/home/me/Downloads/datasets/src/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/me/Downloads/datasets/src/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/me/Downloads/datasets/src/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/me/Downloads/datasets/src/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/me/.local/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/me/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014/text.py", line 103, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 617, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 123, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 85, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Unfortunately I am still getting this issue on Linux. I installed datasets from source and specified script_version to master.
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
135
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` ```python Traceback` (most 
recent call last): File "main.py", line 281, in <module> main() File "main.py", line 190, in main train_data, test_data = data_factory( File "main.py", line 129, in data_factory train_data = load_dataset('text', File "/home/me/Downloads/datasets/src/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/me/Downloads/datasets/src/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/me/Downloads/datasets/src/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/me/Downloads/datasets/src/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/me/.local/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/me/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014/text.py", line 103, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 617, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 123, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 85, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Unfortunately I am still getting this issue on Linux. I installed `datasets` from source and specified `script_version='master'`.
[embedding vector omitted]
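The Linux tracebacks above end in `pac.read_csv`, i.e. the `text` loader at that version parsed plain-text files with pyarrow's CSV reader, so any line that happens to contain the delimiter is split into extra columns. A minimal stdlib sketch of the same failure mode (using Python's `csv` module purely for illustration, not the library's actual code):

```python
import csv
import io

# A "plain text" file whose second line happens to contain a comma.
data = "a plain line\na line, with a comma\n"

# Parse it as CSV, the way the text loader's pac.read_csv call did.
rows = list(csv.reader(io.StringIO(data)))
column_counts = [len(row) for row in rows]

# The first line parses as 1 column, the second as 2 -- the same
# mismatch pyarrow reports as "CSV parse error: Expected 1 columns, got 2".
print(column_counts)  # → [1, 2]
```

This is presumably why the quoted workaround for the `csv` loader passes `delimiter='\t'`: a delimiter that never occurs in the data keeps every line in a single column.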
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
> ![image](https://user-images.githubusercontent.com/36957508/93300760-fa9a8680-f829-11ea-9105-7a6f67ad8373.png) > win10, py3.6 > > ``` > from datasets import Features, Value, ClassLabel, load_dataset > > > features = Features({'text': Value('string'), 'ctext': Value('string')}) > file_dict = {'train': PATH/'summary.csv'} > > dataset = load_dataset('csv', data_files=file_dict, script_version='master', delimiter='\t', column_names=['text', 'ctext'], features=features) > ``` Since #644 it should now work on windows @ScottishFold007 > Trying the following snippet, I get different problems on Linux and Windows. > > ```python > dataset = load_dataset("text", data_files="data.txt") > # or > dataset = load_dataset("text", data_files=["data.txt"]) > ``` > > Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: > > ``` > Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. 
> Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text > Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 > Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py > Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json > Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json > Using custom data configuration default > ``` Same for you @BramVanroy . Not sure about the one on linux though
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
184
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` > 
![image](https://user-images.githubusercontent.com/36957508/93300760-fa9a8680-f829-11ea-9105-7a6f67ad8373.png) > win10, py3.6 > > ``` > from datasets import Features, Value, ClassLabel, load_dataset > > > features = Features({'text': Value('string'), 'ctext': Value('string')}) > file_dict = {'train': PATH/'summary.csv'} > > dataset = load_dataset('csv', data_files=file_dict, script_version='master', delimiter='\t', column_names=['text', 'ctext'], features=features) > ``` Since #644 it should now work on windows @ScottishFold007 > Trying the following snippet, I get different problems on Linux and Windows. > > ```python > dataset = load_dataset("text", data_files="data.txt") > # or > dataset = load_dataset("text", data_files=["data.txt"]) > ``` > > Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: > > ``` > Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. 
> Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text > Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 > Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py > Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json > Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json > Using custom data configuration default > ``` Same for you @BramVanroy . Not sure about the one on linux though
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
> To complete what @lhoestq is saying, I think that to use the new version of the `text` processing script (which is on master right now) you need to either specify the version of the script to be the `master` one or to install the lib from source (in which case it uses the `master` version of the script by default):
>
> ```python
> dataset = load_dataset('text', script_version='master', data_files=XXX)
> ```
>
> We do versioning by default, i.e. your version of the dataset lib will use the script with the same version by default (i.e. only the `1.0.1` version of the script if you have the PyPI version `1.0.1` of the lib).

Linux here: I was using the 0.4.0 `nlp` library's `load_dataset` to load a 9-10 GB text dataset without exhausting RAM. However, today I got the CSV error message mentioned in this issue. After installing the new (`datasets`) library from source and specifying `script_version='master'`, I am still getting the same error message. Furthermore, I cannot use the dictionary "trick" to load the dataset, since the system kills the process after running out of RAM. Is there any other solution to this error? Thank you in advance.
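One possible workaround for the out-of-memory problem described above (a sketch, not something proposed in the thread): since `data_files` accepts a list, a very large text file can be split into smaller shards up front using only the standard library, and the shard paths then passed to `load_dataset('text', data_files=shards, script_version='master')`. The `shard_text_file` helper, its file-naming scheme, and the default shard size are all assumptions for illustration:

```python
from pathlib import Path

def shard_text_file(src, out_dir, lines_per_shard=100_000):
    """Split a large text file into smaller shard files and return
    their paths. Hypothetical helper; all names are illustrative."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    shards, buf, idx = [], [], 0

    def flush():
        nonlocal idx
        shard = out_dir / f"shard_{idx:05d}.txt"
        shard.write_text("".join(buf), encoding="utf-8")
        shards.append(str(shard))
        buf.clear()
        idx += 1

    # Stream the source file line by line so memory use stays bounded
    # by lines_per_shard rather than by the file size.
    with open(src, encoding="utf-8") as f:
        for line in f:
            buf.append(line)
            if len(buf) >= lines_per_shard:
                flush()
    if buf:
        flush()  # write the trailing partial shard
    return shards
```

Whether this actually avoids the OOM kill depends on how the loader batches the files internally, so treat it as a sketch to experiment with rather than a confirmed fix.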
Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Hi @raruidol To fix the RAM issue you'll need to shard your text files into smaller files (see https://github.com/huggingface/datasets/issues/610#issuecomment-691672919 for example) I'm not sure why you're having the csv error on linux. Do you think you could try to reproduce it on google colab for example ? Or send me a dummy .txt file that reproduces the issue ?
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
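As an aside on the `ArrowInvalid: CSV parse error: Expected 1 columns, got 2` above: the `text` loader at this version parsed lines with pyarrow's CSV reader (`pac.read_csv` in the traceback), so any line containing the delimiter splits into extra columns. The stdlib `csv` sketch below is an analogous illustration of that failure mode, not the pyarrow code path itself.

```python
# Hedged illustration of why a CSV parser trips on plain text: a line that
# happens to contain the delimiter is split into multiple columns, which is
# what produces errors like "Expected 1 columns, got 2".
import csv
import io


def column_counts(text, delimiter=","):
    """Return the number of columns the CSV parser sees on each line."""
    return [len(row) for row in csv.reader(io.StringIO(text), delimiter=delimiter)]


plain = "this line has no delimiter\n"
tricky = "this line, however, has commas\n"

assert column_counts(plain) == [1]   # one column, as a text loader would hope
assert column_counts(tricky) == [3]  # silently split into three columns
```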
59
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
@lhoestq The crash message shows up when loading the dataset: ``` print('Loading corpus...') files = glob.glob('corpora/shards/*') -> dataset = load_dataset('text', script_version='master', data_files=files) print('Corpus loaded.') ``` And this is the exact message: ``` Traceback (most recent call last): File "run_language_modeling.py", line 27, in <module> dataset = load_dataset('text', script_version='master', data_files=files) File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset ignore_verifications=ignore_verifications, File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 471, in download_and_prepare dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 548, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 892, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/jupyter-raruidol/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014/text.py", line 107, in _generate_tables convert_options=self.config.convert_options, File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` And these are the pip packages I have atm and their versions: ``` Package Version Location --------------- --------- 
------------------------------------------------------------- certifi 2020.6.20 chardet 3.0.4 click 7.1.2 datasets 1.0.2 dill 0.3.2 filelock 3.0.12 future 0.18.2 idna 2.10 joblib 0.16.0 numpy 1.19.1 packaging 20.4 pandas 1.1.1 pip 19.0.3 pyarrow 1.0.1 pyparsing 2.4.7 python-dateutil 2.8.1 pytz 2020.1 regex 2020.7.14 requests 2.24.0 sacremoses 0.0.43 sentencepiece 0.1.91 setuptools 40.8.0 six 1.15.0 tokenizers 0.8.1rc2 torch 1.6.0 tqdm 4.48.2 transformers 3.0.2 /home/jupyter-raruidol/DebatAnalyser/env/src/transformers/src ```
Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```
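The `ArrowInvalid` message suggests the `text` loader is delegating to pyarrow's CSV reader, so any plain-text line containing the delimiter character gets split into extra columns. A minimal stdlib-only sketch of that failure mode (the file contents are hypothetical; this only mimics the column-count check, it does not call pyarrow):

```python
import csv
import io

# Hypothetical plain-text "dataset": the second line contains a comma,
# which a CSV parser treats as a field delimiter.
raw = "the first line has no comma\nthe second line, with a comma\n"

# Mimic what a strict CSV reader does: infer the column count from the
# first row, then reject any row whose field count differs.
rows = list(csv.reader(io.StringIO(raw)))
expected_cols = len(rows[0])  # 1 column, inferred from the first line
bad_rows = [r for r in rows[1:] if len(r) != expected_cols]

print(expected_cols)      # 1
print(len(bad_rows[0]))   # 2 -- matches "Expected 1 columns, got 2"
```

If your data files contain commas (or whatever delimiter the loader uses), this mismatch would explain the crash.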
load_dataset for text files not working

@lhoestq The crash message shows up when loading the dataset:

```python
print('Loading corpus...')
files = glob.glob('corpora/shards/*')
dataset = load_dataset('text', script_version='master', data_files=files)  # <- crashes here
print('Corpus loaded.')
```

And this is the exact message:

```
Traceback (most recent call last):
  File "run_language_modeling.py", line 27, in <module>
    dataset = load_dataset('text', script_version='master', data_files=files)
  File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/load.py", line 611, in load_dataset
    ignore_verifications=ignore_verifications,
  File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 471, in download_and_prepare
    dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
  File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 548, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/datasets/builder.py", line 892, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/jupyter-raruidol/DebatAnalyser/env/lib/python3.7/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/jupyter-raruidol/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014/text.py", line 107, in _generate_tables
    convert_options=self.config.convert_options,
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

And these are the pip packages I have atm and their versions:

```
Package         Version   Location
--------------- --------- -------------------------------------------------------------
certifi         2020.6.20
chardet         3.0.4
click           7.1.2
datasets        1.0.2
dill            0.3.2
filelock        3.0.12
future          0.18.2
idna            2.10
joblib          0.16.0
numpy           1.19.1
packaging       20.4
pandas          1.1.1
pip             19.0.3
pyarrow         1.0.1
pyparsing       2.4.7
python-dateutil 2.8.1
pytz            2020.1
regex           2020.7.14
requests        2.24.0
sacremoses      0.0.43
sentencepiece   0.1.91
setuptools      40.8.0
six             1.15.0
tokenizers      0.8.1rc2
torch           1.6.0
tqdm            4.48.2
transformers    3.0.2     /home/jupyter-raruidol/DebatAnalyser/env/src/transformers/src
```
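Since the traceback above ends in pyarrow's CSV reader, one quick diagnostic is to scan the shards for lines containing the delimiter character. This is an illustrative helper, not part of `datasets`; the `corpora/shards/*` glob is the layout from the snippet above:

```python
import glob

def lines_with_delimiter(paths, delimiter=","):
    """Yield (path, line_number, line) for lines a CSV parser would split
    into extra fields."""
    for path in paths:
        with open(path, encoding="utf-8") as fh:
            for lineno, line in enumerate(fh, start=1):
                if delimiter in line:
                    yield path, lineno, line.rstrip("\n")

# Hypothetical usage, with the reporter's shard layout:
# for path, lineno, line in lines_with_delimiter(glob.glob("corpora/shards/*")):
#     print(f"{path}:{lineno}: {line}")
```

If this reports hits, the crash is consistent with the CSV reader choking on those lines.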
https://github.com/huggingface/datasets/issues/622
I tested on google colab which is also linux using this code:

- first download an arbitrary text file

```bash
wget https://raw.githubusercontent.com/abisee/cnn-dailymail/master/url_lists/all_train.txt
```

- then run

```python
from datasets import load_dataset
d = load_dataset("text", data_files="all_train.txt", script_version='master')
```

And I don't get this issue.

> Could you test on your side if these lines work @raruidol ?

also cc @Skyy93 as it seems you have the same issue

If it works: It could mean that the issue could come from unexpected patterns in the files you want to use. In that case we should find a way to handle them.

And if it doesn't work: It could mean that it comes from the way pyarrow reads text files on linux. In that case we should report it to pyarrow and find a workaround in the meantime.

Either way it should help to find where this bug comes from and fix it :)

Thank you in advance !
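In the meantime, one way to sidestep the CSV-based text loader entirely is to read the files with plain Python and build the dataset in memory. A sketch (`read_text_examples` is a hypothetical helper; it assumes one example per non-empty line, matching what the `text` loader is meant to produce):

```python
from pathlib import Path

def read_text_examples(paths):
    """Read one example per non-empty line from each file, preserving
    commas and any other characters a CSV parser would mangle."""
    examples = []
    for path in paths:
        for line in Path(path).read_text(encoding="utf-8").splitlines():
            if line:
                examples.append(line)
    return {"text": examples}

# Hypothetical usage, bypassing load_dataset("text", ...):
#   import glob
#   from datasets import Dataset
#   dataset = Dataset.from_dict(read_text_examples(glob.glob("corpora/shards/*")))
```

This keeps the same `{"text": [...]}` column layout, so downstream code should not need to change.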
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
156
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` I tested on google colab which 
is also Linux, using this code: - first download an arbitrary text file ```bash wget https://raw.githubusercontent.com/abisee/cnn-dailymail/master/url_lists/all_train.txt ``` - then run ```python from datasets import load_dataset d = load_dataset("text", data_files="all_train.txt", script_version='master') ``` And I don't get this issue. > Could you test on your side if these lines work @raruidol ? also cc @Skyy93 as it seems you have the same issue If it works: it could mean that the issue comes from unexpected patterns in the files you want to use. In that case we should find a way to handle them. And if it doesn't work: it could mean that it comes from the way pyarrow reads text files on Linux. In that case we should report it to pyarrow and find a workaround in the meantime. Either way it should help to find where this bug comes from and fix it :) Thank you in advance !
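While the `text` loader is being investigated, one hedged stopgap (assuming plain newline-delimited UTF-8 files) is to read the file with ordinary Python and hand the lines to `datasets.Dataset.from_dict`, bypassing the pyarrow CSV path entirely. The helper name `read_text_examples` below is illustrative, not part of the library:

```python
# Hypothetical workaround sketch: skip the "text" dataset script and build
# the examples dict yourself (assumes newline-delimited UTF-8 text).
from pathlib import Path

def read_text_examples(path):
    # One example per non-empty line, mirroring what the "text" script yields.
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    return {"text": [line for line in lines if line]}

# Usage (hedged): examples = read_text_examples("all_train.txt")
#                 dataset = datasets.Dataset.from_dict(examples)
```

This trades the loader's streaming behavior for a full in-memory read, so it only suits files that fit in RAM.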
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
0.0960121155, -0.199883312, 0.008325614, -0.2600297034, -0.0678928196, -0.101543121, 0.0715264753, -0.4127509296, 0.2223608941, -0.3261347413, -0.1620814502, -0.3533615768, 0.1199703068, -0.2934476733, -0.035252884, -0.2882405818, 0.2256353348, 0.1631825864, 0.0368442088, -0.025598444, 0.0039332658, 0.1004286557, -0.5494058132, -0.1985673308, 0.0081633367, -0.2308937907, -0.0934240445, 0.38237679, 0.0441116206, 0.23747769, -0.3961959183, -0.1094296128, -0.0985935256, -0.1599083692, 0.0557672083, -0.0614239685, 0.1960320026, 0.053164836, 0.228019923, -0.1237517744, -0.1310474277, 0.4299978018, -0.0381336957, -0.18856287, 0.2453442812, 0.4184463024, -0.3097181916, -0.2610919476, -0.3623776436, -0.1279266924, -0.284201175, 0.3581484556, 0.1559178829, -0.0295689367, 0.5024069548, 0.4287554324, 0.1569625586, -0.1803516448, 0.2165790796, 0.0652968585, -0.1683053374, 0.47165066, -0.1643847227, -0.6319024563, 0.0775604397, 0.4496119022, -0.2843748033, 0.217541486, -0.4422394633, -0.0620806888, -0.1192547604, 0.0920901299, 0.0410849266, 0.3046549261, 0.1864622831, 0.0804126561, 0.0956785232, -0.0544944741, -0.321420908, 0.115875572, -0.2110432386, 0.03335388, 0.2350959778, 0.3713712692, -0.2012349069, 0.4552717507, 0.3391943276, -0.066408962, 0.2473823726, -0.5043381453, 0.5094946623, -0.1640395224, -0.5347825289, 0.056173455, -0.2177342772, 0.1750390977, 0.2766557932, 0.1536438018, 0.3985486627, -0.3371795714, 0.1042309403, -0.0745856762, -0.199773401, 0.2598343492, -0.1172346026, 0.0794806927, -0.1508024633, 0.1711595953, 0.1823589653, -0.2328865081, 0.0390384793, 0.606995523, -0.1873314828, 0.0515494347, -0.3987099826, 0.0692128688, -0.3697609305, 0.3847952783, -0.0440683998, 0.3335776627, -0.2729939818, -0.1051140726, 0.0177629218, -0.037273027, 0.554443419, 0.3075833023, -0.19524014, 0.0176450834, -0.2582425773, -0.3879302442, 0.097062692, -0.2831111848, 0.3912248611, 0.1797812581, 0.6442825794, -0.3002550602, -0.2145058662, -0.0911265686, 0.4764971733, -0.0228323266, 
-0.3114878237, -0.2785248458, -0.163437292, -0.3262645006, -0.2423543632, 0.0241881162, 0.2216681987, -0.1815150678, 0.2211253643, -0.0578096956, -0.1648265421, 0.2988139391, -0.0309883505, 0.2692630291, -0.3283639848, 0.2320740819, 0.1062323302, 0.4693133831, 0.2969626188, 0.55152601, -0.0897570252, -0.4601888955, 0.0208050273, -0.1581924856, 0.3489819169, 0.1660989821, -0.271594137, 0.1426502466, 0.2977322638, 0.1101863533, -0.4385164678, -0.0669692159, 0.5130982399, 0.211311698, -0.2971881032, -0.4682970941, 0.3646911383, 0.0281750113, 0.1499666274, 0.2716008425, 0.1743868887, -0.2929803133, 0.0181154422, -0.3032900691, 0.8145563602, -0.1351298988, 0.4673897028, 0.2345220447, -0.0871992409, 0.3978255093, -0.0533246994, 0.0656357408, -0.2903504074, -0.0147632752, -0.0129347928, -0.1748354733, 0.443472445, 0.1701349616, -0.4242364466, 0.2192279249, -0.01338467, 0.3101128638, -0.2710384429, 0.1851523519, -0.4233769774, -0.3067673445, -0.3461345732, 0.0314103439, 0.1470376998, 0.3159359396, -0.019304961, 0.0686779022, -0.1788803488, -0.2796398401, -0.3666662574, 0.0574039519, -0.3217287362, 0.0584379695, 0.2108013183, -0.3453938067, 0.3396155238, 0.4470130503, 0.2227250189, 0.1258746088, -0.1699059904, -0.0880351067, -0.2285287231, -0.1075693071, 0.0017924495, -0.2334612906, 0.0256202724, 0.0866710544, -0.2706850469, -0.03422831, -0.0761411265, -0.1780478954, -0.0112771243, 0.1302926242, -0.3514167964, -0.4001739323, -0.4443775117, -0.1643959582, 0.1622947305, -0.062214911, 0.0125064962, 0.1251551211, 0.1535156816, 0.0424737744, 0.0863073766, -0.1513923109, -0.1063952073, 0.3808279634, -0.432557404, -0.2435593307, 0.6115915775, 0.4478025734, -0.1363860667, -0.2047024667, 0.3153441548, -0.0241038725, -0.4706379771, 0.0600290485, 0.3476566076, 0.0469819009, 0.0210847408, 0.275324285, 0.0409547463, -0.1721464247, 0.1079839766, -0.5703998804, -0.343737036, 0.1958447993, 0.2991571128, 0.1324545741, 0.2247096002, -0.1516831815, -0.0144614764, -0.2124667764, -0.1799510717, 
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Update: also tested the above code in a docker container from [jupyter/minimal-notebook](https://hub.docker.com/r/jupyter/minimal-notebook/) (based on ubuntu) and still not able to reproduce
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
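The `ArrowInvalid: Expected 1 columns, got 2` error above is what happens when free text is routed through a CSV reader: any unquoted delimiter inside a line splits it into extra columns. A minimal sketch of the same effect with the stdlib `csv` module (pyarrow's reader additionally enforces a consistent column count, which is what raises the error):

```python
# Why parsing arbitrary text as CSV breaks: a comma inside a line is
# treated as a column separator, so line 2 parses into two columns.
import csv
import io

rows = list(csv.reader(io.StringIO("no delimiter here\nbut, this line has one\n")))
assert len(rows[0]) == 1  # first line parses as a single column
assert len(rows[1]) == 2  # the comma splits the second line into two
```

This is consistent with the earlier observation that files without unexpected patterns (the colab test) load fine, while real-world text containing commas or quotes does not.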
21
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
It looks like your text input file works without any problem. I have been doing some experiments this morning with my input files, and I'm almost certain that the crash is caused by some unexpected pattern in the files. However, I've not been able to spot the main cause of it. What I find strange is that this same corpus was being loaded by the nlp 0.4.0 library without any problem... Where can I find the code where you structure the input text data in order to use it with pyarrow?
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
92
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` It looks like with your text 
input file works without any problem. I have been doing some experiments this morning with my input files and I'm almost certain that the crash is caused by some unexpected pattern in the files. However, I've not been able to spot the main cause of it. What I find strange is that this same corpus was being loaded by the nlp 0.4.0 library without any problem... Where can I find the code where you structure the input text data in order to use it with pyarrow?
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
0.0960121155, -0.199883312, 0.008325614, -0.2600297034, -0.0678928196, -0.101543121, 0.0715264753, -0.4127509296, 0.2223608941, -0.3261347413, -0.1620814502, -0.3533615768, 0.1199703068, -0.2934476733, -0.035252884, -0.2882405818, 0.2256353348, 0.1631825864, 0.0368442088, -0.025598444, 0.0039332658, 0.1004286557, -0.5494058132, -0.1985673308, 0.0081633367, -0.2308937907, -0.0934240445, 0.38237679, 0.0441116206, 0.23747769, -0.3961959183, -0.1094296128, -0.0985935256, -0.1599083692, 0.0557672083, -0.0614239685, 0.1960320026, 0.053164836, 0.228019923, -0.1237517744, -0.1310474277, 0.4299978018, -0.0381336957, -0.18856287, 0.2453442812, 0.4184463024, -0.3097181916, -0.2610919476, -0.3623776436, -0.1279266924, -0.284201175, 0.3581484556, 0.1559178829, -0.0295689367, 0.5024069548, 0.4287554324, 0.1569625586, -0.1803516448, 0.2165790796, 0.0652968585, -0.1683053374, 0.47165066, -0.1643847227, -0.6319024563, 0.0775604397, 0.4496119022, -0.2843748033, 0.217541486, -0.4422394633, -0.0620806888, -0.1192547604, 0.0920901299, 0.0410849266, 0.3046549261, 0.1864622831, 0.0804126561, 0.0956785232, -0.0544944741, -0.321420908, 0.115875572, -0.2110432386, 0.03335388, 0.2350959778, 0.3713712692, -0.2012349069, 0.4552717507, 0.3391943276, -0.066408962, 0.2473823726, -0.5043381453, 0.5094946623, -0.1640395224, -0.5347825289, 0.056173455, -0.2177342772, 0.1750390977, 0.2766557932, 0.1536438018, 0.3985486627, -0.3371795714, 0.1042309403, -0.0745856762, -0.199773401, 0.2598343492, -0.1172346026, 0.0794806927, -0.1508024633, 0.1711595953, 0.1823589653, -0.2328865081, 0.0390384793, 0.606995523, -0.1873314828, 0.0515494347, -0.3987099826, 0.0692128688, -0.3697609305, 0.3847952783, -0.0440683998, 0.3335776627, -0.2729939818, -0.1051140726, 0.0177629218, -0.037273027, 0.554443419, 0.3075833023, -0.19524014, 0.0176450834, -0.2582425773, -0.3879302442, 0.097062692, -0.2831111848, 0.3912248611, 0.1797812581, 0.6442825794, -0.3002550602, -0.2145058662, -0.0911265686, 0.4764971733, -0.0228323266, 
-0.3114878237, -0.2785248458, -0.163437292, -0.3262645006, -0.2423543632, 0.0241881162, 0.2216681987, -0.1815150678, 0.2211253643, -0.0578096956, -0.1648265421, 0.2988139391, -0.0309883505, 0.2692630291, -0.3283639848, 0.2320740819, 0.1062323302, 0.4693133831, 0.2969626188, 0.55152601, -0.0897570252, -0.4601888955, 0.0208050273, -0.1581924856, 0.3489819169, 0.1660989821, -0.271594137, 0.1426502466, 0.2977322638, 0.1101863533, -0.4385164678, -0.0669692159, 0.5130982399, 0.211311698, -0.2971881032, -0.4682970941, 0.3646911383, 0.0281750113, 0.1499666274, 0.2716008425, 0.1743868887, -0.2929803133, 0.0181154422, -0.3032900691, 0.8145563602, -0.1351298988, 0.4673897028, 0.2345220447, -0.0871992409, 0.3978255093, -0.0533246994, 0.0656357408, -0.2903504074, -0.0147632752, -0.0129347928, -0.1748354733, 0.443472445, 0.1701349616, -0.4242364466, 0.2192279249, -0.01338467, 0.3101128638, -0.2710384429, 0.1851523519, -0.4233769774, -0.3067673445, -0.3461345732, 0.0314103439, 0.1470376998, 0.3159359396, -0.019304961, 0.0686779022, -0.1788803488, -0.2796398401, -0.3666662574, 0.0574039519, -0.3217287362, 0.0584379695, 0.2108013183, -0.3453938067, 0.3396155238, 0.4470130503, 0.2227250189, 0.1258746088, -0.1699059904, -0.0880351067, -0.2285287231, -0.1075693071, 0.0017924495, -0.2334612906, 0.0256202724, 0.0866710544, -0.2706850469, -0.03422831, -0.0761411265, -0.1780478954, -0.0112771243, 0.1302926242, -0.3514167964, -0.4001739323, -0.4443775117, -0.1643959582, 0.1622947305, -0.062214911, 0.0125064962, 0.1251551211, 0.1535156816, 0.0424737744, 0.0863073766, -0.1513923109, -0.1063952073, 0.3808279634, -0.432557404, -0.2435593307, 0.6115915775, 0.4478025734, -0.1363860667, -0.2047024667, 0.3153441548, -0.0241038725, -0.4706379771, 0.0600290485, 0.3476566076, 0.0469819009, 0.0210847408, 0.275324285, 0.0409547463, -0.1721464247, 0.1079839766, -0.5703998804, -0.343737036, 0.1958447993, 0.2991571128, 0.1324545741, 0.2247096002, -0.1516831815, -0.0144614764, -0.2124667764, -0.1799510717, 
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Under the hood it does ```python import pyarrow as pa import pyarrow.csv # Use csv reader from Pyarrow with one column for text files # To force the one-column setting, we set an arbitrary character # that is not in text files as delimiter, such as \b or \v. # The backspace character, \b, never appears in plain text files parse_options = pa.csv.ParseOptions( delimiter="\b", quote_char=False, double_quote=False, escape_char=False, newlines_in_values=False, ignore_empty_lines=False, ) read_options = pa.csv.ReadOptions(use_threads=True, column_names=["text"]) pa_table = pa.csv.read_csv("all_train.txt", read_options=read_options, parse_options=parse_options) ``` Note that we changed the parse options with datasets 1.0. In particular, the delimiter used to be `\r`, but that delimiter doesn't work on Windows.
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
107
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` Under the hood it does 
```python import pyarrow as pa import pyarrow.csv # Use csv reader from Pyarrow with one column for text files # To force the one-column setting, we set an arbitrary character # that is not in text files as delimiter, such as \b or \v. # The bell character, \b, was used to make beeps back in the days parse_options = pa.csv.ParseOptions( delimiter="\b", quote_char=False, double_quote=False, escape_char=False, newlines_in_values=False, ignore_empty_lines=False, ) read_options= pa.csv.ReadOptions(use_threads=True, column_names=["text"]) pa_table = pa.csv.read_csv("all_train.txt", read_options=read_options, parse_options=parse_options) ``` Note that we changed the parse options with datasets 1.0 In particular the delimiter used to be `\r` but this delimiter doesn't work on windows.
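As a sanity check of the one-column trick described above, here is a stdlib-only sketch (using Python's `csv` module in place of pyarrow, purely for illustration): as long as the delimiter character never occurs in the file, every line comes back as a single field, even when it contains commas or quotes.

```python
import csv
import io

# Text containing commas and quotes, which would confuse a normal CSV parse.
raw = 'hello, world\n"quoted" line\n'

# Using a control character that never appears in text files as the delimiter
# forces the parser to treat each full line as one column.
reader = csv.reader(io.StringIO(raw), delimiter="\a", quoting=csv.QUOTE_NONE)
rows = [row[0] for row in reader]
assert rows == ['hello, world', '"quoted" line']
```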
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
0.0960121155, -0.199883312, 0.008325614, -0.2600297034, -0.0678928196, -0.101543121, 0.0715264753, -0.4127509296, 0.2223608941, -0.3261347413, -0.1620814502, -0.3533615768, 0.1199703068, -0.2934476733, -0.035252884, -0.2882405818, 0.2256353348, 0.1631825864, 0.0368442088, -0.025598444, 0.0039332658, 0.1004286557, -0.5494058132, -0.1985673308, 0.0081633367, -0.2308937907, -0.0934240445, 0.38237679, 0.0441116206, 0.23747769, -0.3961959183, -0.1094296128, -0.0985935256, -0.1599083692, 0.0557672083, -0.0614239685, 0.1960320026, 0.053164836, 0.228019923, -0.1237517744, -0.1310474277, 0.4299978018, -0.0381336957, -0.18856287, 0.2453442812, 0.4184463024, -0.3097181916, -0.2610919476, -0.3623776436, -0.1279266924, -0.284201175, 0.3581484556, 0.1559178829, -0.0295689367, 0.5024069548, 0.4287554324, 0.1569625586, -0.1803516448, 0.2165790796, 0.0652968585, -0.1683053374, 0.47165066, -0.1643847227, -0.6319024563, 0.0775604397, 0.4496119022, -0.2843748033, 0.217541486, -0.4422394633, -0.0620806888, -0.1192547604, 0.0920901299, 0.0410849266, 0.3046549261, 0.1864622831, 0.0804126561, 0.0956785232, -0.0544944741, -0.321420908, 0.115875572, -0.2110432386, 0.03335388, 0.2350959778, 0.3713712692, -0.2012349069, 0.4552717507, 0.3391943276, -0.066408962, 0.2473823726, -0.5043381453, 0.5094946623, -0.1640395224, -0.5347825289, 0.056173455, -0.2177342772, 0.1750390977, 0.2766557932, 0.1536438018, 0.3985486627, -0.3371795714, 0.1042309403, -0.0745856762, -0.199773401, 0.2598343492, -0.1172346026, 0.0794806927, -0.1508024633, 0.1711595953, 0.1823589653, -0.2328865081, 0.0390384793, 0.606995523, -0.1873314828, 0.0515494347, -0.3987099826, 0.0692128688, -0.3697609305, 0.3847952783, -0.0440683998, 0.3335776627, -0.2729939818, -0.1051140726, 0.0177629218, -0.037273027, 0.554443419, 0.3075833023, -0.19524014, 0.0176450834, -0.2582425773, -0.3879302442, 0.097062692, -0.2831111848, 0.3912248611, 0.1797812581, 0.6442825794, -0.3002550602, -0.2145058662, -0.0911265686, 0.4764971733, -0.0228323266, 
-0.3114878237, -0.2785248458, -0.163437292, -0.3262645006, -0.2423543632, 0.0241881162, 0.2216681987, -0.1815150678, 0.2211253643, -0.0578096956, -0.1648265421, 0.2988139391, -0.0309883505, 0.2692630291, -0.3283639848, 0.2320740819, 0.1062323302, 0.4693133831, 0.2969626188, 0.55152601, -0.0897570252, -0.4601888955, 0.0208050273, -0.1581924856, 0.3489819169, 0.1660989821, -0.271594137, 0.1426502466, 0.2977322638, 0.1101863533, -0.4385164678, -0.0669692159, 0.5130982399, 0.211311698, -0.2971881032, -0.4682970941, 0.3646911383, 0.0281750113, 0.1499666274, 0.2716008425, 0.1743868887, -0.2929803133, 0.0181154422, -0.3032900691, 0.8145563602, -0.1351298988, 0.4673897028, 0.2345220447, -0.0871992409, 0.3978255093, -0.0533246994, 0.0656357408, -0.2903504074, -0.0147632752, -0.0129347928, -0.1748354733, 0.443472445, 0.1701349616, -0.4242364466, 0.2192279249, -0.01338467, 0.3101128638, -0.2710384429, 0.1851523519, -0.4233769774, -0.3067673445, -0.3461345732, 0.0314103439, 0.1470376998, 0.3159359396, -0.019304961, 0.0686779022, -0.1788803488, -0.2796398401, -0.3666662574, 0.0574039519, -0.3217287362, 0.0584379695, 0.2108013183, -0.3453938067, 0.3396155238, 0.4470130503, 0.2227250189, 0.1258746088, -0.1699059904, -0.0880351067, -0.2285287231, -0.1075693071, 0.0017924495, -0.2334612906, 0.0256202724, 0.0866710544, -0.2706850469, -0.03422831, -0.0761411265, -0.1780478954, -0.0112771243, 0.1302926242, -0.3514167964, -0.4001739323, -0.4443775117, -0.1643959582, 0.1622947305, -0.062214911, 0.0125064962, 0.1251551211, 0.1535156816, 0.0424737744, 0.0863073766, -0.1513923109, -0.1063952073, 0.3808279634, -0.432557404, -0.2435593307, 0.6115915775, 0.4478025734, -0.1363860667, -0.2047024667, 0.3153441548, -0.0241038725, -0.4706379771, 0.0600290485, 0.3476566076, 0.0469819009, 0.0210847408, 0.275324285, 0.0409547463, -0.1721464247, 0.1079839766, -0.5703998804, -0.343737036, 0.1958447993, 0.2991571128, 0.1324545741, 0.2247096002, -0.1516831815, -0.0144614764, -0.2124667764, -0.1799510717, 
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Could you try with `\a` instead of `\b` ? It looks like the bell character is \a in python and not \b
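The character codes back this comment up: in Python (as in C), `\a` is the bell/alert character (ASCII 7), while `\b` is backspace (ASCII 8). A quick check:

```python
# "\a" is BEL (the bell character used for beeps); "\b" is BS (backspace).
assert "\a" == chr(7)                      # BEL
assert "\b" == chr(8)                      # BS
assert "\a".encode("ascii") == b"\x07"
```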
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
22
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` Could you try with `\a` 
instead of `\b` ? It looks like the bell character is \a in python and not \b
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
0.0960121155, -0.199883312, 0.008325614, -0.2600297034, -0.0678928196, -0.101543121, 0.0715264753, -0.4127509296, 0.2223608941, -0.3261347413, -0.1620814502, -0.3533615768, 0.1199703068, -0.2934476733, -0.035252884, -0.2882405818, 0.2256353348, 0.1631825864, 0.0368442088, -0.025598444, 0.0039332658, 0.1004286557, -0.5494058132, -0.1985673308, 0.0081633367, -0.2308937907, -0.0934240445, 0.38237679, 0.0441116206, 0.23747769, -0.3961959183, -0.1094296128, -0.0985935256, -0.1599083692, 0.0557672083, -0.0614239685, 0.1960320026, 0.053164836, 0.228019923, -0.1237517744, -0.1310474277, 0.4299978018, -0.0381336957, -0.18856287, 0.2453442812, 0.4184463024, -0.3097181916, -0.2610919476, -0.3623776436, -0.1279266924, -0.284201175, 0.3581484556, 0.1559178829, -0.0295689367, 0.5024069548, 0.4287554324, 0.1569625586, -0.1803516448, 0.2165790796, 0.0652968585, -0.1683053374, 0.47165066, -0.1643847227, -0.6319024563, 0.0775604397, 0.4496119022, -0.2843748033, 0.217541486, -0.4422394633, -0.0620806888, -0.1192547604, 0.0920901299, 0.0410849266, 0.3046549261, 0.1864622831, 0.0804126561, 0.0956785232, -0.0544944741, -0.321420908, 0.115875572, -0.2110432386, 0.03335388, 0.2350959778, 0.3713712692, -0.2012349069, 0.4552717507, 0.3391943276, -0.066408962, 0.2473823726, -0.5043381453, 0.5094946623, -0.1640395224, -0.5347825289, 0.056173455, -0.2177342772, 0.1750390977, 0.2766557932, 0.1536438018, 0.3985486627, -0.3371795714, 0.1042309403, -0.0745856762, -0.199773401, 0.2598343492, -0.1172346026, 0.0794806927, -0.1508024633, 0.1711595953, 0.1823589653, -0.2328865081, 0.0390384793, 0.606995523, -0.1873314828, 0.0515494347, -0.3987099826, 0.0692128688, -0.3697609305, 0.3847952783, -0.0440683998, 0.3335776627, -0.2729939818, -0.1051140726, 0.0177629218, -0.037273027, 0.554443419, 0.3075833023, -0.19524014, 0.0176450834, -0.2582425773, -0.3879302442, 0.097062692, -0.2831111848, 0.3912248611, 0.1797812581, 0.6442825794, -0.3002550602, -0.2145058662, -0.0911265686, 0.4764971733, -0.0228323266, 
-0.3114878237, -0.2785248458, -0.163437292, -0.3262645006, -0.2423543632, 0.0241881162, 0.2216681987, -0.1815150678, 0.2211253643, -0.0578096956, -0.1648265421, 0.2988139391, -0.0309883505, 0.2692630291, -0.3283639848, 0.2320740819, 0.1062323302, 0.4693133831, 0.2969626188, 0.55152601, -0.0897570252, -0.4601888955, 0.0208050273, -0.1581924856, 0.3489819169, 0.1660989821, -0.271594137, 0.1426502466, 0.2977322638, 0.1101863533, -0.4385164678, -0.0669692159, 0.5130982399, 0.211311698, -0.2971881032, -0.4682970941, 0.3646911383, 0.0281750113, 0.1499666274, 0.2716008425, 0.1743868887, -0.2929803133, 0.0181154422, -0.3032900691, 0.8145563602, -0.1351298988, 0.4673897028, 0.2345220447, -0.0871992409, 0.3978255093, -0.0533246994, 0.0656357408, -0.2903504074, -0.0147632752, -0.0129347928, -0.1748354733, 0.443472445, 0.1701349616, -0.4242364466, 0.2192279249, -0.01338467, 0.3101128638, -0.2710384429, 0.1851523519, -0.4233769774, -0.3067673445, -0.3461345732, 0.0314103439, 0.1470376998, 0.3159359396, -0.019304961, 0.0686779022, -0.1788803488, -0.2796398401, -0.3666662574, 0.0574039519, -0.3217287362, 0.0584379695, 0.2108013183, -0.3453938067, 0.3396155238, 0.4470130503, 0.2227250189, 0.1258746088, -0.1699059904, -0.0880351067, -0.2285287231, -0.1075693071, 0.0017924495, -0.2334612906, 0.0256202724, 0.0866710544, -0.2706850469, -0.03422831, -0.0761411265, -0.1780478954, -0.0112771243, 0.1302926242, -0.3514167964, -0.4001739323, -0.4443775117, -0.1643959582, 0.1622947305, -0.062214911, 0.0125064962, 0.1251551211, 0.1535156816, 0.0424737744, 0.0863073766, -0.1513923109, -0.1063952073, 0.3808279634, -0.432557404, -0.2435593307, 0.6115915775, 0.4478025734, -0.1363860667, -0.2047024667, 0.3153441548, -0.0241038725, -0.4706379771, 0.0600290485, 0.3476566076, 0.0469819009, 0.0210847408, 0.275324285, 0.0409547463, -0.1721464247, 0.1079839766, -0.5703998804, -0.343737036, 0.1958447993, 0.2991571128, 0.1324545741, 0.2247096002, -0.1516831815, -0.0144614764, -0.2124667764, -0.1799510717, 
0.0283626541, 0.0711825565, -0.2090125978, 0.4280437231, 0.0813553035, 0.344661504, -0.242889896, 0.0180170387, -0.0082558766, 0.0614189804, -0.0559107512, -0.1323170215, 0.1357379705, -0.0764976293, -0.1801906377, -0.169544518, -0.3009044528, -0.0054441616, -0.3061546683, 0.1830039918, 0.4905686378, -0.1322171688, 0.1660417467, -0.2308709174, 0.2402744293, -0.3294675052, 0.2159641236, -0.3301633, 0.3099445105, 0.1199944094, 0.0645245388, -0.0035627894, 0.076214999, -0.1796475947, 0.280446589, 0.19380638, -0.0876487345, 0.2333925813, -0.4173030555, 0.1791360378, -0.1308649778, 0.4586464167, 0.6418237686, -0.2389523089, 0.1770923138, 0.3237513602, 0.0375071242, -0.1066856831, 0.0253852904, 0.1342609525, -0.0756320953, 0.0557791218, 0.2839002013, -0.1099475101, -0.0767708346, 0.0140855778, 0.0420108289, 0.3017364442, 0.135049358, 0.0422457159, 0.3179254234, 0.2821860909, 0.2172951251, -0.0789585412, 0.3243650496, 0.1688097119, 0.3248775005, -0.1270243824, 0.1264022589, -0.1879324615, 0.3269270062, -0.0585653819, -0.3316078484, 0.1236125976, 0.0540437959, -0.2286889702, 0.0388125703, -0.2160565555, 0.549175024, -0.5157108307, -0.0090487339, -0.1283324659, 0.0684653074, -0.050738968, -0.3157412112, 0.0320399031, -0.2380465865, -0.0311428756, 0.0246614367, -0.0586086772, -0.1354768127, 0.0637516081, 0.1260184795, -0.0337767452, -0.3310150802, 0.1216677353, 0.0174294896, 0.0148991924, 0.0049085617, 0.2710103393, 0.151932165, 0.2196569145, 0.3230520189, 0.3475780487, 0.5777656436, 0.4073304236, 0.1060493663, 0.2475762218, -0.2693929672, -0.012606632, -0.0394144803, 0.3580521643, 0.1128152907, 0.0378046222, 0.216445297, 0.0713133216, -0.0373924226, -0.0545141995, 0.0653119981, -0.1178253293, -0.2989547253, 0.2328229398, -0.253652364, -0.1291727871, -0.1700772941, -0.0467462242, -0.4412434101, -0.0388390347, 0.5237542391, 0.2658582032, 0.1927792579, -0.0427042581, 0.0265392922, 0.340023756, 0.4671336412, 0.3830901384, 0.2377201617, -0.0570629649, -0.0111458749, 
-0.6012915373, -0.0115616061, 0.0097206626, -0.1373762637, -0.238687858, -0.1536644995, 0.1978025734, 0.1551499069, 0.052321814, -0.1281722635, 0.2108153701, -0.012030106, -0.0534493998, -0.1669252068, -0.0151777416, 0.1381872594, 0.0049078017, -0.3023659587, 0.0613832474, -0.1089830846, -0.0564865209, -0.3146945834, 0.4238255918, -0.1058528945, -0.3355432749, 0.1364643574, -0.0730644763, 0.4701802731, 0.0425092652, -0.1654962897, -0.1798643172, -0.2382646948, 0.0213488825, 0.4302535355, 0.154534936, 0.00509312, -0.4908644557, -0.4926252365, -0.2662901878, 0.2439434528, -0.059263695, -0.373033762, 0.0246140137, -0.0186504982, -0.0354576111, 0.0011530854, 0.2333131135, 0.4785374999, -0.2721091807, 0.0106193945, -0.1478195637, -0.1278118193, 0.3023914695, -0.4510453343, -0.2669306993, 0.0307971574, 0.0089578144, 0.0157024339, 0.0496491939, -0.5663858056, 0.125086993, 0.1359772086, -0.1033067256, -0.1399748325, 0.0530340262, -0.1825751215, -0.0363835245, -0.1389524192, 0.4549027979, 0.0883839726, -0.2588398159, 0.0973683894, -0.205696255 ]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
I was just exploring if the crash was happening in every shard or not, and which shards were generating the error message. With \b I got the following list of shards crashing: ``` Errors on files: ['corpora/shards/shard_0069', 'corpora/shards/shard_0043', 'corpora/shards/shard_0014', 'corpora/shards/shard_0032', 'corpora/shards/shard_0088', 'corpora/shards/shard_0018', 'corpora/shards/shard_0073', 'corpora/shards/shard_0079', 'corpora/shards/shard_0038', 'corpora/shards/shard_0041', 'corpora/shards/shard_0007', 'corpora/shards/shard_0004', 'corpora/shards/shard_0102', 'corpora/shards/shard_0096', 'corpora/shards/shard_0030', 'corpora/shards/shard_0076', 'corpora/shards/shard_0067', 'corpora/shards/shard_0052', 'corpora/shards/shard_0026', 'corpora/shards/shard_0024', 'corpora/shards/shard_0064', 'corpora/shards/shard_0044', 'corpora/shards/shard_0013', 'corpora/shards/shard_0062', 'corpora/shards/shard_0057', 'corpora/shards/shard_0097', 'corpora/shards/shard_0094', 'corpora/shards/shard_0078', 'corpora/shards/shard_0075', 'corpora/shards/shard_0039', 'corpora/shards/shard_0077', 'corpora/shards/shard_0021', 'corpora/shards/shard_0040', 'corpora/shards/shard_0009', 'corpora/shards/shard_0023', 'corpora/shards/shard_0095', 'corpora/shards/shard_0107', 'corpora/shards/shard_0063', 'corpora/shards/shard_0086', 'corpora/shards/shard_0047', 'corpora/shards/shard_0089', 'corpora/shards/shard_0037', 'corpora/shards/shard_0101', 'corpora/shards/shard_0093', 'corpora/shards/shard_0082', 'corpora/shards/shard_0091', 'corpora/shards/shard_0065', 'corpora/shards/shard_0020', 'corpora/shards/shard_0070', 'corpora/shards/shard_0008', 'corpora/shards/shard_0058', 'corpora/shards/shard_0060', 'corpora/shards/shard_0022', 'corpora/shards/shard_0059', 'corpora/shards/shard_0100', 'corpora/shards/shard_0027', 'corpora/shards/shard_0072', 'corpora/shards/shard_0098', 'corpora/shards/shard_0019', 'corpora/shards/shard_0066', 'corpora/shards/shard_0042', 'corpora/shards/shard_0053'] ``` 
I also tried with \a and the list decreased but there were still several crashes: ``` Errors on files: ['corpora/shards/shard_0069', 'corpora/shards/shard_0055', 'corpora/shards/shard_0043', 'corpora/shards/shard_0014', 'corpora/shards/shard_0073', 'corpora/shards/shard_0025', 'corpora/shards/shard_0068', 'corpora/shards/shard_0102', 'corpora/shards/shard_0096', 'corpora/shards/shard_0076', 'corpora/shards/shard_0067', 'corpora/shards/shard_0026', 'corpora/shards/shard_0024', 'corpora/shards/shard_0044', 'corpora/shards/shard_0087', 'corpora/shards/shard_0092', 'corpora/shards/shard_0074', 'corpora/shards/shard_0094', 'corpora/shards/shard_0078', 'corpora/shards/shard_0039', 'corpora/shards/shard_0077', 'corpora/shards/shard_0040', 'corpora/shards/shard_0009', 'corpora/shards/shard_0107', 'corpora/shards/shard_0063', 'corpora/shards/shard_0103', 'corpora/shards/shard_0047', 'corpora/shards/shard_0033', 'corpora/shards/shard_0089', 'corpora/shards/shard_0037', 'corpora/shards/shard_0082', 'corpora/shards/shard_0071', 'corpora/shards/shard_0091', 'corpora/shards/shard_0065', 'corpora/shards/shard_0070', 'corpora/shards/shard_0058', 'corpora/shards/shard_0081', 'corpora/shards/shard_0060', 'corpora/shards/shard_0002', 'corpora/shards/shard_0059', 'corpora/shards/shard_0027', 'corpora/shards/shard_0072', 'corpora/shards/shard_0098', 'corpora/shards/shard_0019', 'corpora/shards/shard_0045', 'corpora/shards/shard_0036', 'corpora/shards/shard_0066', 'corpora/shards/shard_0053'] ``` This suggests that the assumption that some unexpected pattern in the files is causing the crashes is probably correct. If I am able to reach any conclusion I will post it here asap.
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
205
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` I was just exploring if the 
crash was happening in every shard or not, and which shards were generating the error message. With \b I got the following list of shards crashing: ``` Errors on files: ['corpora/shards/shard_0069', 'corpora/shards/shard_0043', 'corpora/shards/shard_0014', 'corpora/shards/shard_0032', 'corpora/shards/shard_0088', 'corpora/shards/shard_0018', 'corpora/shards/shard_0073', 'corpora/shards/shard_0079', 'corpora/shards/shard_0038', 'corpora/shards/shard_0041', 'corpora/shards/shard_0007', 'corpora/shards/shard_0004', 'corpora/shards/shard_0102', 'corpora/shards/shard_0096', 'corpora/shards/shard_0030', 'corpora/shards/shard_0076', 'corpora/shards/shard_0067', 'corpora/shards/shard_0052', 'corpora/shards/shard_0026', 'corpora/shards/shard_0024', 'corpora/shards/shard_0064', 'corpora/shards/shard_0044', 'corpora/shards/shard_0013', 'corpora/shards/shard_0062', 'corpora/shards/shard_0057', 'corpora/shards/shard_0097', 'corpora/shards/shard_0094', 'corpora/shards/shard_0078', 'corpora/shards/shard_0075', 'corpora/shards/shard_0039', 'corpora/shards/shard_0077', 'corpora/shards/shard_0021', 'corpora/shards/shard_0040', 'corpora/shards/shard_0009', 'corpora/shards/shard_0023', 'corpora/shards/shard_0095', 'corpora/shards/shard_0107', 'corpora/shards/shard_0063', 'corpora/shards/shard_0086', 'corpora/shards/shard_0047', 'corpora/shards/shard_0089', 'corpora/shards/shard_0037', 'corpora/shards/shard_0101', 'corpora/shards/shard_0093', 'corpora/shards/shard_0082', 'corpora/shards/shard_0091', 'corpora/shards/shard_0065', 'corpora/shards/shard_0020', 'corpora/shards/shard_0070', 'corpora/shards/shard_0008', 'corpora/shards/shard_0058', 'corpora/shards/shard_0060', 'corpora/shards/shard_0022', 'corpora/shards/shard_0059', 'corpora/shards/shard_0100', 'corpora/shards/shard_0027', 'corpora/shards/shard_0072', 'corpora/shards/shard_0098', 'corpora/shards/shard_0019', 'corpora/shards/shard_0066', 'corpora/shards/shard_0042', 'corpora/shards/shard_0053'] ``` I also tried with \a and 
the list decreased but there were still several crashes: ``` Errors on files: ['corpora/shards/shard_0069', 'corpora/shards/shard_0055', 'corpora/shards/shard_0043', 'corpora/shards/shard_0014', 'corpora/shards/shard_0073', 'corpora/shards/shard_0025', 'corpora/shards/shard_0068', 'corpora/shards/shard_0102', 'corpora/shards/shard_0096', 'corpora/shards/shard_0076', 'corpora/shards/shard_0067', 'corpora/shards/shard_0026', 'corpora/shards/shard_0024', 'corpora/shards/shard_0044', 'corpora/shards/shard_0087', 'corpora/shards/shard_0092', 'corpora/shards/shard_0074', 'corpora/shards/shard_0094', 'corpora/shards/shard_0078', 'corpora/shards/shard_0039', 'corpora/shards/shard_0077', 'corpora/shards/shard_0040', 'corpora/shards/shard_0009', 'corpora/shards/shard_0107', 'corpora/shards/shard_0063', 'corpora/shards/shard_0103', 'corpora/shards/shard_0047', 'corpora/shards/shard_0033', 'corpora/shards/shard_0089', 'corpora/shards/shard_0037', 'corpora/shards/shard_0082', 'corpora/shards/shard_0071', 'corpora/shards/shard_0091', 'corpora/shards/shard_0065', 'corpora/shards/shard_0070', 'corpora/shards/shard_0058', 'corpora/shards/shard_0081', 'corpora/shards/shard_0060', 'corpora/shards/shard_0002', 'corpora/shards/shard_0059', 'corpora/shards/shard_0027', 'corpora/shards/shard_0072', 'corpora/shards/shard_0098', 'corpora/shards/shard_0019', 'corpora/shards/shard_0045', 'corpora/shards/shard_0036', 'corpora/shards/shard_0066', 'corpora/shards/shard_0053'] ``` This suggests that the assumption that some unexpected pattern in the files is causing the crashes is probably correct. If I am able to reach any conclusion I will post it here asap.
[ -0.2746650875, -0.4020574093, 0.0175604746, 0.3872523904, 0.2696424723, -0.0386613235, 0.318888396, -0.054356873, 0.4263595045, -0.0580488294, 0.0659725443, 0.1455247551, -0.1557630301, 0.2742006779, 0.0635565817, -0.0350759588, 0.157183066, -0.0138411485, -0.2914433777, 0.0402486697, -0.1144547909, 0.343120873, -0.2153361142, -0.1469300091, -0.4160704017, 0.274004221, -0.1050167531, 0.4538783133, -0.074261561, -0.2378563732, 0.2710005641, 0.1282080114, 0.1639643312, 0.6136494279, -0.0001269617, 0.1100324541, 0.2756476998, -0.161011219, -0.2998278439, -0.5695471764, 0.158515662, -0.2294367552, 0.2088484764, -0.0287809223, 0.130192399, -0.0045288876, 0.1527261734, -0.3340620995, 0.3551646769, 0.3254780471, 0.1009779423, 0.3193371892, 0.0620949492, -0.0146580972, 0.060962107, 0.4765507281, -0.0917445719, 0.3906493783, 0.3137861192, -0.1942872405, 0.1169527173, -0.0824010223, -0.1730933785, -0.1696533263, 0.3285271823, 0.2871708572, -0.6134480238, -0.2057264149, 0.0768415481, 0.1859920323, 0.4352323413, -0.4339180589, -0.2217481732, -0.1591289937, -0.0756906047, -0.1254789531, 0.4388751984, 0.178399086, -0.1985728741, 0.0581206605, -0.2503566444, -0.0692395493, -0.1769828647, 0.3230665922, -0.0084306747, 0.0508894362, -0.2191820592, 0.1040093303, 0.3328264058, -0.0709384903, -0.2063762248, -0.2793429196, -0.0293536745, 0.0611279123, -0.2661669254, 0.155729115, -0.265876323, -0.0245269239, 0.111754097, 0.1747618765, -0.0047684107, 0.1417199671, 0.0951443166, 0.2306068838, 0.2007871717, 0.0978597999, 0.408153981, 0.0843546391, 0.2615750432, 0.0263591781, -0.0160578191, -0.0952473581, -0.1963427067, -0.544190526, -0.0697910711, -0.2300524414, 0.4797003865, -0.2050438672, -0.179035306, -0.0498552397, -0.0808123797, -0.0455940217, 0.2161370218, 0.6197091341, -0.0843389705, 0.0338772945, 0.1204904765, 0.3182839751, -0.1168015301, 0.1911229938, 0.0475793257, -0.0604269058, -0.0889261588, 0.1795252562, 0.5387114286, -0.3721973598, 0.3424326777, 0.1996888071, 0.4671031833, 
-0.1905284226, -0.151450485, -0.1865483224, -0.0174012259, 0.1886943281, 0.0076280385, 0.0419399217, 0.2405287474, -0.1590058953, -0.1147047877, 0.094007507, -0.2652539313, -0.0938474312, 0.0322780088, 0.0448435955, -0.0952585563, -0.2330177724, -0.3073214889, 0.0929255933, 0.0373645574, -0.1054283977, 0.0938849896, -0.1803103387, -0.2380834073, -0.1588562131, 0.2716286778, 0.6603158116, -0.2988539636, -0.1454448402, 0.325542599, -0.1225559488, -0.1192729026, 0.2550255358, -0.0943288952, 0.0099467598, -0.2610240281, 0.1993147135, 0.187488541, -0.4092431068, -0.1465578675, 0.3963532448, 0.0584072769, 0.1357299089, 0.2080580145, 0.0618105568, 0.0415841043, 0.0460110866, 0.2806778252, 0.0083059669, 0.1047207639, -0.0915357769, -0.0034150407, -0.2037541419, 0.0910754874, 0.3670079708, -0.2113380581, 0.0550339222, 0.1075052619, -0.11178343, 0.1765403897, -0.1039658338, -0.0344482921, 0.5095412731, 0.1362370998, 0.3201458454, 0.0793993995, -0.2116696239, -0.5979290009, 0.1366194487, 0.2275741994, -0.0535318851, -0.2638645172, -0.0957268104, -0.1770205796, 0.0601342618, -0.2095396519, -0.0141129373, -0.0720960647, 0.1533232778, 0.2246877551, 0.0292827561, -0.2047467679, 0.4062134922, -0.1693809628, 0.2454877198, -0.2644482255, 0.1788316667, -0.035665676, -0.2023018897, -0.0517952628, 0.1018910557, -0.0046760775, -0.2937568128, 0.0954742432, 0.4632968307, 0.0546922348, 0.0856851041, -0.1847889423, -0.095308736, 0.2269648314, -0.0808334574, -0.0273767319, 0.2919704914, 0.2026349604, -0.191780746, -0.2744134963, 0.191270113, -0.3355852067, 0.1634297818, 0.0298718065, -0.1083327979, -0.0484335348, 0.1079697087, -0.3023560941, -0.0635970831, 0.4508516788, -0.2454562485, 0.2605041862, 0.0018220283, -0.4244134426, -0.2174916416, 0.4196654558, -0.0460518077, 0.1226552725, 0.1877404451, -0.2147011012, 0.2120123804, -0.0877685249, -0.0419770852, 0.5764296055, 0.1378011703, -0.2983498275, 0.2298149467, -0.1166208833, -0.2212795913, 0.2890957594, -0.0472781733, -0.0982489288, 
]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Hmmm I was expecting it to work with \a, not sure why they appear in your text files though
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
19
load_dataset for text files not working Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom 
data configuration default Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File 
"/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ``` Hmmm I was expecting it to 
work with \a, not sure why they appear in your text files though
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Hi @lhoestq, is there any input length restriction which was not before the update of the nlp library?
Trying the following snippet, I get different problems on Linux and Windows.

```python
dataset = load_dataset("text", data_files="data.txt")
# or
dataset = load_dataset("text", data_files=["data.txt"])
```

(ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for `data_files`, but the signature is `Union[Dict, List]`.)

The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file.

Linux stack trace:

```
PyTorch version 1.6.0+cu101 available.
Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json
Using custom data configuration default
Generating dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7)
Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7...
Dataset not on Hf google storage. Downloading and preparing it from source
Downloading took 0.0 min
Checksum Computation took 0.0 min
Unable to verify checksums.
Generating split train
Traceback (most recent call last):
  File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data
    dataset = load_dataset("text", data_files=dataset_f)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare
    self._download_and_prepare(
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split
    for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose):
  File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__
    for obj in iterable:
  File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables
    pa_table = pac.read_csv(
  File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2
```

Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message:

```
Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7
Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py
Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json
Using custom data configuration default
```
load_dataset for text files not working

Hi @lhoestq, is there any input length restriction that did not exist before the update of the nlp library?
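Until the loader itself is fixed, one possible workaround (a sketch, not the library's official fix) is to bypass the `text` loader entirely: since the crash comes from parsing the file through pyarrow's CSV reader, reading the file with plain `open()` avoids any delimiter or quoting expectations, so lines containing commas or quotes cannot trigger a CSV parse error. The `read_text_lines` helper below is hypothetical; the resulting list could then be fed to `datasets.Dataset.from_dict`.

```python
# Hypothetical workaround: read a plain-text file manually, one example per
# line, instead of going through the "text" loader's CSV-based parsing.
def read_text_lines(path):
    """Return the file's lines (without trailing newlines) as a list."""
    with open(path, encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f]

# Then build the dataset from the list (assumes `datasets` is installed):
# import datasets
# dataset = datasets.Dataset.from_dict({"text": read_text_lines("data.txt")})
```

This sidesteps the `ArrowInvalid: CSV parse error: Expected 1 columns, got 2` failure because no column structure is ever inferred from the file contents.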
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
No, we never set any input length restriction on our side (maybe Arrow does, but I don't think so).
[embedding vector omitted]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
@lhoestq Can you ever be certain that a delimiter character is not present in a plain text file? Other formats (e.g. CSV) have rules specifying what is and isn't allowed, so that a file actually constitutes valid CSV. In a text file you basically have "anything goes", so I don't think you can ever be entirely sure that the chosen delimiter does not occur in the text file, or am I wrong? If I understand correctly, you choose a delimiter that we hope does not exist in the file, so that when the CSV parser starts splitting into columns, it only ever creates one column? Why can't we use a newline character though?
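The one-column trick discussed above, and its failure mode, can be sketched with Python's built-in `csv` module (a hypothetical illustration, not the actual `datasets` implementation, which uses `pyarrow.csv`):

```python
import csv
import io

# Sample "plain text" containing a comma, which is a valid CSV delimiter.
text = "hello world\nfoo, bar baz\n"

# Parse it through a CSV reader using a delimiter assumed NOT to appear in
# the text ("\x01"): every line then yields exactly one column.
rows = list(csv.reader(io.StringIO(text), delimiter="\x01"))
assert all(len(row) == 1 for row in rows)

# With a comma delimiter instead, the second line splits into two fields --
# the same failure mode behind "Expected 1 columns, got 2".
lengths = [len(row) for row in csv.reader(io.StringIO(text), delimiter=",")]
assert lengths == [1, 2]
```

If the chosen delimiter ever occurs in the text, the parser sees extra columns and raises, which is exactly the concern the comment raises.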
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
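As a workaround sketch for the crash above (hypothetical, not an official fix): instead of `load_dataset("text", ...)`, read the file line by line and build the column dict that `datasets.Dataset.from_dict` accepts, sidestepping the CSV parser entirely so embedded commas or quotes cannot break loading.

```python
import os
import tempfile

# Create a small text file whose content would confuse a CSV parser.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write('first line\nsecond, "quoted" line\n')
    path = f.name

# Read each line as one record in a "text" column.
with open(path, encoding="utf-8") as f:
    records = {"text": [line.rstrip("\n") for line in f]}

os.remove(path)
assert records["text"] == ['first line', 'second, "quoted" line']
```

The resulting dict could then be passed to `datasets.Dataset.from_dict(records)` to get the same kind of single-column dataset the `text` loader aims to produce.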
118
[embedding vector omitted]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
Okay, I have split the crashing shards into individual sentences, and here are some examples of the inputs that are causing the crashes: _4. DE L’ORGANITZACIÓ ESTAMENTAL A L’ORGANITZACIÓ EN CLASSES A mesura que es desenvolupava un sistema econòmic capitalista i naixia una classe burgesa cada vegada més preparada per a substituir els dirigents de les velles monarquies absolutistes, es qüestionava l’abundància de béns amortitzats, que com s’ha dit estaven fora del mercat i no pagaven tributs, pels perjudicis que ocasionaven a les finances públiques i a l’economia en general. Aquest estat d’opinió revolucionari va desembocar en un conjunt de mesures pràctiques de caràcter liberal. D’una banda, les que intentaven desposseir les mans mortes del domini de béns acumulats, procés que acostumem a denominar desamortització, i que no és més que la nacionalització i venda d’aquests béns eclesiàstics o civils en subhasta pública al millor postor. D’altra banda, les que redimien o reduïen els censos i delmes o aixecaven les prohibicions de venda, és a dir, les vinculacions. La desamortització, que va afectar béns dels ordes religiosos, dels pobles i d’algunes corporacions civils, no va ser un camí fàcil, perquè costava i costa trobar algú que sigui indiferent a la pèrdua de béns, drets i privilegis. I té una gran transcendència, va privar els antics estaments de les Espanyes, clero i pobles —la noblesa en queda al marge—, de la força econòmica que els donaven bona part de les seves terres i, en última instància, va preparar el terreny per a la substitució de la vella societat estamental per la nova societat classista. En aquesta societat, en teoria, les agrupacions socials són obertes, no tenen cap estatut jurídic privilegiat i estan definides per la possessió o no d’uns béns econòmics que són lliurement alienables. A les Espanyes la transformació va afectar poc l’aristocràcia latifundista, allà on n’hi havia. 
Aquesta situació va afavorir, en part, la persistència de la vella cultura de la societat estamental en determinats ambients, i això ha influït decisivament en la manca de democràcia que caracteritza la majoria de règims polítics que s’han anat succeint. Una manera de pensar que sempre sura en un moment o altre, i que de fet no acaba de desaparèixer del tot. 5. INICI DE LA DESAMORTITZACIÓ A LES ESPANYES Durant el segle xviii, dins d’aquesta visió lliberal, va agafar força en alguns cercles de les Espanyes el corrent d’opinió contrari a les mans mortes. Durant el regnat de Carles III, s’arbitraren les primeres mesures desamortitzadores proposades per alguns ministres il·lustrats. Aquestes disposicions foren modestes i poc eficaces, no van aturar l’acumulació de terres per part dels estaments que constituïen les mans mortes i varen afectar principalment béns dels pobles. L’Església no va ser tocada, excepte en el cas de 110_ _la revolució liberal, perquè, encara que havia perdut els seus drets jurisdiccionals, havia conservat la majoria de terres i fins i tot les havia incrementat amb d’altres que procedien de la desamortització. En la nova situació, les mans mortes del bosc públic eren l’Estat, que no cerca mai l’autofinançament de les despeses de gestió; els diners que manquin ja els posarà l’Estat. 9. DEFENSA I INTENTS DE RECUPERACIÓ DELS BÉNS COMUNALS DESAMORTITZATS El procés de centralització no era senzill, perquè, d’una banda, la nova organització apartava de la gestió moltes corporacions locals i molts veïns que l’havien portada des de l’edat mitjana, i, de l’altra, era difícil de coordinar la nova silvicultura amb moltes pràctiques forestals i drets tradicionals, com la pastura, fer llenya o tallar un arbre aquí i un altre allà quan tenia el gruix suficient, les pràctiques que s’havien fet sempre. 
Les primeres passes de la nova organització centralitzada varen tenir moltes dificultats en aquells indrets en què els terrenys municipals i comunals tenien un paper important en l’economia local. La desobediència a determinades normes imposades varen prendre formes diferents. Algunes institucions, com, per exemple, la Diputació de Lleida, varen retardar la tramitació d’alguns expedients i varen evitar la venda de béns municipals. Molts pobles permeteren deixar que els veïns continuessin amb les seves pràctiques tradicionals, d’altres varen boicotejar les subhastes d’aprofitaments. L’Estat va reaccionar encomanant a la Guàrdia Civil el compliment de les noves directrius. Imposar el nou règim va costar a l’Administració un grapat d’anys, però de mica en mica, amb molta, molta guarderia i gens de negociació, ho va aconseguir. La nova gestió estatal dels béns municipals va deixar, com hem comentat, molta gent sense uns recursos necessaris per a la supervivència, sobre tot en àrees on predominaven les grans propietats, i on els pagesos sense terra treballaven de jornalers temporers. Això va afavorir que, a bona part de les Espanyes, les primeres lluites camperoles de la segona meitat del segle xix defensessin la recuperació dels comunals desamortitzats; per a molts aquella expropiació i venda dirigida pels governs monàrquics era la causa de molta misèria. D’altres, més radicalitzats, varen entendre que l’eliminació de la propietat col·lectiva i la gestió estatal dels boscos no desamortitzats suposava una usurpació pura i dura. En les zones més afectades per la desamortització això va donar lloc a un imaginari centrat en la defensa del comunal. La Segona República va arribar en una conjuntura econòmica de crisi, generada pel crac del 1929. Al camp, aquesta situació va produir una forta caiguda dels preus dels productes agraris i un increment important de l’atur. QUADERNS AGRARIS 42 (juny 2017), p. 
105-126_ I think that the main difference between the crashing samples and the rest is their length. Could the length therefore be what is causing the errors? I hope these samples help you identify what is causing the crashes, considering that the 0.4.0 nlp library was loading them properly.
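If line length is indeed the trigger, a quick way to check is to scan the corpus for its longest lines before handing it to the loader. A small diagnostic sketch (the file path is an assumption for illustration):

```python
# Hypothetical helper to spot unusually long lines in a corpus file.
# Returns the top_n (length, line_number) pairs, longest first.
def longest_lines(path, top_n=3):
    with open(path, encoding="utf-8") as f:
        lines = [(len(line), i) for i, line in enumerate(f, start=1)]
    return sorted(lines, reverse=True)[:top_n]

# e.g. longest_lines("shard_0.txt") to see whether the crashing shards
# contain lines that are an order of magnitude longer than the rest.
```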
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
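While the `text` loader is crashing, a minimal fallback is to read the file yourself and build the column dict that `datasets.Dataset.from_dict` expects. This is only a sketch that sidesteps the loader entirely; the final `from_dict` call is the assumed next step and is left commented out so the snippet stays dependency-free:

```python
# Fallback sketch while load_dataset("text", ...) fails: read the lines
# ourselves and build a single-column dict of stripped lines.
def text_file_to_dict(path):
    with open(path, encoding="utf-8") as f:
        lines = [line.rstrip("\n") for line in f]
    return {"text": lines}

# Assumed usage:
# from datasets import Dataset
# dataset = Dataset.from_dict(text_file_to_dict("data.txt"))
```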
949
[… embedding vector (numeric floats) omitted …]
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
So we're using the csv reader to read text files because arrow doesn't have a text reader. To work around the fact that text files are just csv with one column, we want to set a delimiter that doesn't appear in text files. Until now I thought that would do the job, but unfortunately it looks like even characters like \a appear in text files. So we have two options:

- find another delimiter that does the job (maybe `\x1b` esc or `\x18` cancel)
- don't use the csv reader from arrow but the text reader from pandas instead (or any other reader). The only important thing is that it must be fast (arrow has a nice and fast multithreaded csv reader that we're using now, but hopefully we can find an alternative)

> @lhoestq Can you ever be certain that a delimiter character is not present in a plain text file? In other formats (e.g. CSV), rules are set for what is allowed and what isn't so that it actually constitutes a CSV file. In a text file you basically have "anything goes", so I don't think you can ever be entirely sure that the chosen delimiter does not exist in the text file, or am I wrong?

As long as the text file follows some encoding it wouldn't make sense to have characters such as the bell character. However I agree it can happen.

> If I understand correctly you choose a delimiter that we hope does not exist in the file, so that when the CSV parser starts splitting into columns, it will only ever create one column? Why can't we use a newline character though?

Exactly. Arrow doesn't allow the newline character unfortunately.
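As a quick sanity check for the first option, one could scan a file for candidate delimiter characters before handing it to the CSV reader. This is a minimal stdlib-only sketch, not part of the library: the helper name and the candidate set (just the control characters mentioned above) are chosen for illustration.

```python
def find_safe_delimiter(path, candidates=("\x1b", "\x18", "\a", "\b")):
    """Return the first candidate delimiter that does not occur anywhere in
    the file at `path`, or None if every candidate appears somewhere."""
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()
    for delim in candidates:
        if delim not in text:
            return delim
    return None
```

Scanning the whole file defeats some of the speed advantage of arrow's multithreaded reader, so this would only make sense as a fallback when parsing fails, not on every load.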
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
> Okay, I have split the crashing shards into individual sentences and some examples of the inputs that are causing the crashes are the following ones

Thanks for digging into it ! Characters like \a or \b are not shown when printing the text, so as it is I can't tell if it contains unexpected characters. Maybe you could open the file in python and check if `"\b" in open("path/to/file", "r").read()` ?

> I think that the main difference between the crashing samples and the rest is their length. Therefore, couldn't the length be causing the message errors? I hope with these samples you can identify what is causing the crashes considering that the 0.4.0 nlp library was loading them properly.

To check that you could try to run

```python
import pyarrow as pa
import pyarrow.csv

open("dummy.txt", "w").write((("a" * 10_000) + "\n") * 4)  # 4 lines of 10 000 'a'

parse_options = pa.csv.ParseOptions(
    delimiter="\b",
    quote_char=False,
    double_quote=False,
    escape_char=False,
    newlines_in_values=False,
    ignore_empty_lines=False,
)
read_options = pa.csv.ReadOptions(use_threads=True, column_names=["text"])
pa_table = pa.csv.read_csv("dummy.txt", read_options=read_options, parse_options=parse_options)
```

on my side it runs without error though
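Since characters like \a or \b are invisible when printed, a small helper that reports every control character present in a string can make them visible. This is a stdlib-only sketch for debugging, not part of the library; it treats ordinary newlines, carriage returns, and tabs as expected and flags everything else in Unicode category "Cc".

```python
import unicodedata

def control_chars(text):
    """Return the set of control characters (Unicode category 'Cc') in
    `text`, excluding ordinary whitespace controls (\\n, \\r, \\t)."""
    ordinary = {"\n", "\r", "\t"}
    return {c for c in text if unicodedata.category(c) == "Cc" and c not in ordinary}

sample = "looks normal\a but hides a bell"
print(sample)                 # the \a is not visible when printed
print(control_chars(sample))  # the bell character shows up as '\x07'
```

Running this over the crashing shards would immediately tell whether they contain the \b delimiter (or any other control byte) that confuses the CSV parser.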
load_dataset for text files not working

> Okay, I have splitted the crashing shards into individual sentences and some examples of the inputs that are causing the crashes are the following ones

Thanks for digging into it! Characters like `\a` or `\b` are not shown when printing the text, so as it is I can't tell if it contains unexpected characters. Maybe you could open the file in Python and check if `"\b" in open("path/to/file", "r").read()`?

> I think that the main difference between the crashing samples and the rest is their length. Therefore, couldn't the length be causing the message errors? I hope with these samples you can identify what is causing the crashes considering that the 0.4.0 nlp library was loading them properly.

To check that you could try to run

```python
import pyarrow as pa
import pyarrow.csv

open("dummy.txt", "w").write((("a" * 10_000) + "\n") * 4)  # 4 lines of 10 000 'a'

parse_options = pa.csv.ParseOptions(
    delimiter="\b",
    quote_char=False,
    double_quote=False,
    escape_char=False,
    newlines_in_values=False,
    ignore_empty_lines=False,
)
read_options = pa.csv.ReadOptions(use_threads=True, column_names=["text"])
pa_table = pa.csv.read_csv("dummy.txt", read_options=read_options, parse_options=parse_options)
```

on my side it runs without error though
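Since characters like `\a` or `\b` are invisible when printed, one way to apply the `"\b" in open(...).read()` check across a whole file is to scan for every ASCII control character at once. A small sketch (the helper name `control_chars_in` is ours):

```python
def control_chars_in(path: str) -> set:
    """Return the set of ASCII control characters found in `path`,
    ignoring the ordinary whitespace controls (\\n, \\t, \\r)."""
    text = open(path, "r", encoding="utf-8", errors="replace").read()
    return {ch for ch in text if ord(ch) < 32 and ch not in "\n\t\r"}


# Example: a file containing a hidden backspace character.
with open("sample.txt", "w") as f:
    f.write("hello\bworld\nclean line\n")

print(control_chars_in("sample.txt"))  # {'\x08'}
```

An empty result means the crashing shard contains no hidden control characters, which would point the investigation elsewhere (e.g. line length).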
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
That's true, it was my error printing the text that way. Maybe, as a workaround, I can force all my input samples to have "\b" at the end?
https://github.com/huggingface/datasets/issues/622
load_dataset for text files not working
> That's true, it was my error printing the text that way. Maybe as a workaround, I can force all my input samples to have "\b" at the end? I don't think it would work, since we only want one column and "\b" is set to be the delimiter between two columns, so it will raise the same issue again. Pyarrow would think that there is more than one column if the delimiter is found anywhere in the text. Anyway, I'll work on a new text reader if we don't find the right workaround for this delimiter issue.
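To make the failure mode concrete, here is a minimal pure-Python sketch (not the pyarrow or datasets code) of why any in-band delimiter choice breaks a one-column reader: the moment the chosen separator can occur inside a line of text, the parser sees extra columns. `parse_single_column` is a hypothetical illustration, not a real API:

```python
def parse_single_column(lines, delimiter="\b"):
    """Parse lines as a one-column table, failing like a CSV reader
    does when the delimiter shows up inside the data."""
    rows = []
    for i, line in enumerate(lines):
        fields = line.split(delimiter)
        if len(fields) != 1:
            # Mirrors pyarrow's "Expected 1 columns, got N" error
            raise ValueError(f"line {i}: expected 1 column, got {len(fields)}")
        rows.append(fields[0])
    return rows

parse_single_column(["hello", "world"])   # fine: one column per line
try:
    parse_single_column(["hello\bworld"])  # delimiter appears in the data
except ValueError as e:
    print(e)  # → line 0: expected 1 column, got 2
```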
Trying the following snippet, I get different problems on Linux and Windows. ```python dataset = load_dataset("text", data_files="data.txt") # or dataset = load_dataset("text", data_files=["data.txt"]) ``` (ps [This example](https://huggingface.co/docs/datasets/loading_datasets.html#json-files) shows that you can use a string as input for data_files, but the signature is `Union[Dict, List]`.) The problem on Linux is that the script crashes with a CSV error (even though it isn't a CSV file). On Windows the script just seems to freeze or get stuck after loading the config file. Linux stack trace: ``` PyTorch version 1.6.0+cu101 available. Checking /home/bram/.cache/huggingface/datasets/b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at /home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.json Using custom data configuration default Generating 
dataset text (/home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7) Downloading and preparing dataset text/default-0907112cc6cd2a38 (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /home/bram/.cache/huggingface/datasets/text/default-0907112cc6cd2a38/0.0.0/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7... Dataset not on Hf google storage. Downloading and preparing it from source Downloading took 0.0 min Checksum Computation took 0.0 min Unable to verify checksums. Generating split train Traceback (most recent call last): File "/home/bram/Python/projects/dutch-simplification/utils.py", line 45, in prepare_data dataset = load_dataset("text", data_files=dataset_f) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/load.py", line 608, in load_dataset builder_instance.download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 468, in download_and_prepare self._download_and_prepare( File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 546, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/datasets/builder.py", line 888, in _prepare_split for key, table in utils.tqdm(generator, unit=" tables", leave=False, disable=not_verbose): File "/home/bram/.local/share/virtualenvs/dutch-simplification-NcpPZtDF/lib/python3.8/site-packages/tqdm/std.py", line 1130, in __iter__ for obj in iterable: File "/home/bram/.cache/huggingface/modules/datasets_modules/datasets/text/7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7/text.py", line 100, in _generate_tables 
pa_table = pac.read_csv( File "pyarrow/_csv.pyx", line 714, in pyarrow._csv.read_csv File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 2 ``` Windows just seems to get stuck. Even with a tiny dataset of 10 lines, it has been stuck for 15 minutes already at this message: ``` Checking C:\Users\bramv\.cache\huggingface\datasets\b1d50a0e74da9a7b9822cea8ff4e4f217dd892e09eb14f6274a2169e5436e2ea.30c25842cda32b0540d88b7195147decf9671ee442f4bc2fb6ad74016852978e.py for additional imports. Found main folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7 Found script file from https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py to C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.py Couldn't find dataset infos file at https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text\dataset_infos.json Found metadata file for dataset https://raw.githubusercontent.com/huggingface/datasets/1.0.1/datasets/text/text.py at C:\Users\bramv\.cache\huggingface\modules\datasets_modules\datasets\text\7e13bc0fa76783d4ef197f079dc8acfe54c3efda980f2c9adfab046ede2f0ff7\text.json Using custom data configuration default ```
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
It seems that I ran into the same problem ``` def tokenize(cols, example): for in_col, out_col in cols.items(): example[out_col] = hf_tokenizer.convert_tokens_to_ids(hf_tokenizer.tokenize(example[in_col])) return example cola = datasets.load_dataset('glue', 'cola') tokenized_cola = cola.map(partial(tokenize, {'sentence': 'text_idxs'}), num_proc=2,) ``` and it outputs (excerpts) ``` Concatenating 2 shards from multiprocessing Set __getitem__(key) output type to python objects for ['idx', 'label', 'sentence', 'text_idxs'] columns (when key is int or slice) and don't output other (un-formatted) columns. Testing the mapped function outputs Testing finished, running the mapping function on the dataset Done writing 532 indices in 4256 bytes . Done writing 531 indices in 4248 bytes . Process #0 will write at /home/yisiang/.cache/huggingface/datasets/glue/cola/1.0.0/930e9d141872db65102cabb9fa8ac01c11ffc8a1b72c2e364d8cdda4610df542/tokenized_test_00000_of_00002.arrow Process #1 will write at /home/yisiang/.cache/huggingface/datasets/glue/cola/1.0.0/930e9d141872db65102cabb9fa8ac01c11ffc8a1b72c2e364d8cdda4610df542/tokenized_test_00001_of_00002.arrow Spawning 2 processes ``` and then the program never stops.
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but, when selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
121
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but, when selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` It seems that I ran into the same problem ``` def tokenize(cols, example): for in_col, out_col in cols.items(): example[out_col] = hf_tokenizer.convert_tokens_to_ids(hf_tokenizer.tokenize(example[in_col])) return example cola = datasets.load_dataset('glue', 'cola') tokenized_cola = cola.map(partial(tokenize, {'sentence': 'text_idxs'}), num_proc=2,) ``` and it outputs (excerpts) ``` Concatenating 2 shards from multiprocessing Set __getitem__(key) output type to python objects for ['idx', 'label', 'sentence', 'text_idxs'] columns (when key is int or slice) and don't output other (un-formatted) columns. Testing the mapped function outputs Testing finished, running the mapping function on the dataset Done writing 532 indices in 4256 bytes . Done writing 531 indices in 4248 bytes . 
Process #0 will write at /home/yisiang/.cache/huggingface/datasets/glue/cola/1.0.0/930e9d141872db65102cabb9fa8ac01c11ffc8a1b72c2e364d8cdda4610df542/tokenized_test_00000_of_00002.arrow Process #1 will write at /home/yisiang/.cache/huggingface/datasets/glue/cola/1.0.0/930e9d141872db65102cabb9fa8ac01c11ffc8a1b72c2e364d8cdda4610df542/tokenized_test_00001_of_00002.arrow Spawning 2 processes ``` and then the program never stops.
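The hang pattern reported in this thread (per-process shard files announced, "Spawning N processes", then nothing) corresponds to a shard/merge strategy: split the rows, map each shard in a worker, concatenate the results. Below is a minimal stdlib sketch of that general pattern, not the actual `datasets` internals; `_tokenize_shard` and `map_with_shards` are illustrative names, and the point is that if any worker deadlocks, the merge step never runs.

```python
# Sketch of a shard/merge multiprocess map: split rows into num_proc
# contiguous shards, process each shard in a worker, then concatenate
# the per-process results back into one ordered list.
from math import ceil
from multiprocessing import Pool


def _tokenize_shard(shard):
    # Stand-in for a per-shard tokenize function; counts whitespace tokens.
    return [{"text": row["text"], "n_tokens": len(row["text"].split())} for row in shard]


def map_with_shards(rows, num_proc=2):
    chunk = ceil(len(rows) / num_proc)
    shards = [rows[i * chunk:(i + 1) * chunk] for i in range(num_proc)]
    with Pool(num_proc) as pool:
        processed = pool.map(_tokenize_shard, shards)
    # Flatten the shards in order so the output rows match the input order.
    return [row for shard in processed for row in shard]


if __name__ == "__main__":
    rows = [{"text": "a b c"}, {"text": "d e"}, {"text": "f"}]
    print(map_with_shards(rows, num_proc=2))
```

If one of the two workers in this sketch blocked forever, `pool.map` would never return, which is the symptom the reporters describe.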
[ -0.4042775631, -0.076164186, 0.0068099946, 0.146670714, 0.147502467, -0.189604491, 0.3224095106, 0.3429278135, 0.1341748089, 0.1369511932, 0.0367483571, 0.3943181634, -0.4054726064, 0.3103387654, -0.3056830168, 0.0618966296, 0.1283410192, -0.0522316173, -0.2669126689, 0.1578020155, -0.2797974646, 0.2109819055, -0.3103197217, 0.2153626382, -0.5755158067, 0.0590775236, -0.0540032871, 0.2517107725, 0.0042291768, -0.5668842793, 0.309784472, 0.3128777146, 0.0841088444, 0.4231202006, -0.000125346, 0.1279418468, 0.3116935492, -0.1059477106, -0.036428906, -0.3833858967, -0.0744345561, -0.2384931743, 0.2321605086, 0.1469332129, 0.2514846921, -0.0637061149, -0.1770323813, -0.1964328438, 0.1855878979, 0.3598253131, 0.0798643231, 0.2393264174, 0.2558634579, 0.1061846688, 0.0308737941, 0.1623826623, -0.0530021861, -0.0189512037, 0.2539157569, -0.4286130667, -0.083004564, 0.2592656016, -0.2532222271, -0.2118037194, -0.1340042502, -0.220843643, 0.5308259726, -0.5402561426, 0.2115586698, 0.0486417897, -0.2676086426, -0.0402109884, -0.4429704547, -0.1768541038, -0.2546719015, -0.5399870872, 0.170960933, 0.0307247937, -0.1698824763, 0.1184879467, -0.439244777, -0.2035619915, 0.0078962073, -0.029882282, -0.1458258182, 0.6477391124, 0.2138986588, 0.3055249751, 0.1323187351, 0.1305703968, 0.1838416755, -0.0163391344, -0.0709622875, 0.2070081532, -0.4304938614, 0.0555330217, 0.0561708994, -0.2127606869, -0.1796365678, -0.0278098769, -0.2530760765, 0.1307653785, -0.1172351316, 0.0927314013, 0.4056193531, -0.0962311178, 0.1677094549, 0.3280447125, 0.2209370583, -0.1989022344, -0.0220489204, 0.0117405914, 0.2665808201, -0.0622577108, -0.0979441181, 0.2686053514, 0.0859606266, -0.1967088729, -0.2017097771, 0.1901547909, -0.1704505533, -0.1557845175, 0.0763079822, 0.2368405908, 0.0790322423, 0.7875008583, 0.1054011062, 0.2169734985, -0.3290950656, -0.1062659174, -0.0690491498, -0.0170262177, -0.3850461245, 0.1155861989, 0.2116107494, 0.1460444033, -0.0303650405, 0.0347132348, 
-0.3158592582, -0.2540468574, 0.2134022713, -0.3203783631, 0.1630383432, 0.6764132977, -0.0766723827, 0.0704787076, 0.0608325973, -0.2051454633, 0.0493656993, 0.1968803406, -0.528068006, -0.1739724278, -0.2313922048, 0.0630154312, 0.0921194851, 0.2983641922, -0.1968150437, 0.2527911067, 0.3752554059, -0.2726264596, -0.293933183, -0.4000751376, -0.3631501496, -0.2814877331, -0.0012881532, 0.2978387773, -0.4101809561, -0.0337190405, 0.1389254779, -0.1741316915, 0.3642456532, 0.258012265, -0.0363180526, 0.08022964, -0.1671099216, 0.1751137972, 0.1112992465, -0.2420785427, -0.0535120592, 0.2751083076, 0.0924492255, 0.3352849483, -0.0884901285, -0.3326950073, 0.3726422787, -0.1397299916, 0.2458259761, -0.0516838171, -0.1721584201, 0.0256981999, -0.4304660559, -0.0167046748, -0.073843956, -0.1338192374, 0.4579971731, 0.2050950825, -0.1281721443, -0.4877860248, 0.3257188201, -0.1268089712, 0.3116861582, 0.0617726929, 0.1279980391, 0.1767355949, 0.1328582168, -0.2497060448, -0.3686508238, 0.1384370029, -0.2625173628, 0.0193940159, -0.0266865306, -0.0783317313, 0.0837067962, 0.2017332762, -0.2120802552, -0.2150040567, 0.0355653763, 0.0074173044, -0.3615641594, -0.0625980794, -0.1797473431, 0.6259472966, 0.0919978544, 0.1794136167, 0.013110403, 0.2692207396, -0.1198471114, -0.435510844, -0.2855913341, 0.3075082004, 0.0316628516, -0.0748287588, -0.2006152719, 0.4490846395, 0.5101369619, -0.1541980952, -0.026039172, 0.1566488892, 0.2412419766, -0.1208865643, -0.1833622307, 0.0375626497, 0.0654286891, -0.1723883003, 0.3006967902, 0.4833922386, 0.2013999522, 0.3852224648, 0.2566984892, 0.2287191749, 0.3226881623, 0.0051960796, 0.0316116661, -0.1580839753, 0.12465702, -0.1312189549, 0.1465569437, 0.0016203485, -0.1420027912, -0.1955578476, 0.1355320364, 0.059509851, -0.1583059728, 0.0607258417, 0.095202744, -0.1042192578, 0.1780013442, -0.013248831, 0.3717619181, -0.0238192528, -0.2368369699, 0.1435662508, -0.2938763499, 0.0356035903, 0.0739234686, -0.0384560674, 0.3780092001, 
0.3359931707, 0.0807106644, 0.0478109568, -0.2073846757, -0.3678787947, 0.071659252, 0.3440573514, -0.5090511441, 0.2327925116, -0.2995890975, 0.3964599669, 0.1238908023, -0.1198128164, -0.2803929746, -0.5962171555, -0.131013155, 0.534403801, -0.0878548101, 0.140951395, -0.1044902503, 0.1036587358, -0.2177424431, 0.4064004719, -0.1127117723, -0.2866864204, -0.2238862216, -0.1438709497, 0.3367702961, -0.2551405728, 0.237195313, 0.1972362995, -0.2832108736, 0.1011272222, -0.3656712174, 0.1703361273, -0.0298880637, 0.0133357849, 0.1109328121, -0.0670835972, -0.0422331095, -0.2127739936, 0.15843229, -0.1304796338, -0.2130654454, 0.3102202415, -0.1451124698, 0.0382953808, -0.1842953265, -0.2188407332, -0.4120934308, -0.1025575995, -0.1735724807, -0.2267585099, 0.3224337995, 0.0706637204, -0.0400240943, -0.0154564828, -0.094897069, -0.0137674939, -0.1022755504, -0.0214021578, -0.1632138789, 0.0092591718, -0.1443978697, -0.0533852912, 0.0301885456, 0.1384505183, 0.4326984584, -0.3306240439, 0.0789943784, 0.1064508706, -0.1260216236, 0.1771830618, -0.190312624, 0.3407358229, 0.3817727864, 0.0495489351, -0.0437270701, -0.1391005814, 0.1047362536, -0.0750268847, 0.05078578, 0.0907465667, 0.3363544345, 0.1733751446, 0.6802123189, 0.3181330562, 0.0758749694, 0.4411139488, -0.0873886272, 0.1266504228, -0.201642096, -0.4437971711, -0.1668882817, -0.3328515291, 0.0288299918, -0.1490574181, -0.1336887926, -0.4281877875, -0.2398666888, 0.5580081344, -0.3731170893, -0.2661550343, -0.0114972936, -0.4910700321, 0.2942598164, -0.0721300542, 0.0983667448, -0.0680478886, -0.0286518354, -0.1811806709, 0.2417280227, 0.1317722797, -0.2868564427, -0.3133726716, -0.2984550297, -0.4311942756, 0.3018600643, 0.200512737, 0.7738987207, 0.0186731815, -0.0527883396, 0.011161603, -0.186825946, 0.9173188806, -0.569034934, -0.3967111409, 0.3345103264, -0.4721474349, -0.2147322595, -0.2099568248, -0.2013519853, 0.5055504441, 0.3819587231, 0.5023779273, -0.1614257097, -0.3780630231, 0.153083384, 
-0.2570290267, -0.1191148907, 0.0627681166, -0.3022754788, -0.0154424012, -0.2401996553, 0.0927342251, -0.2525689602, 0.1114538461, -0.2179809511, -0.1276561916, -0.1641672105, -0.1095554829, 0.2863309681, 0.0590847731, -0.0361823067, -0.169426918, 0.0408692136, -0.0105613545, 0.1652774215, 0.3650978804, 0.1900334954, -0.3747362196, -0.1727474928, 0.1968026459, 0.0158034824, 0.3392804861, 0.31747967, 0.2047661841, 0.0000434816, -0.046710223, 0.254527837, -0.3137809038, 0.0875372589, 0.3993989527, -0.0182166584, -0.5664705634, -0.0736857951, -0.0969472229, 0.3861201406, -0.1427973509, 0.5104855895, -0.3036780953, -0.2402492166, 0.4625009298, 0.0840424746, 0.8406618834, -0.3637130558, 0.0132837342, 0.0892838687, -0.0050920844, 0.0694170445, -0.0628985837, -0.0200168751, -0.2953960001, 0.0041265748, -0.0972204208, 0.0550887808, 0.3370895982, 0.2183580548, -0.1210359037, 0.4495357573, -0.0287256911, -0.0096250474, -0.0391501412, 0.1835769117, 0.4047142267, -0.1066560894, 0.1088414639, -0.0186672229, -0.0423480794, -0.1334840059, -0.0020875223, 0.0470466763, 0.290999949, -0.1496485174, -0.388209939, -0.0935218409, -0.1085726023, 0.1244037598, 0.222088784, 0.1331909001, 0.0084558353, -0.0236120056, -0.0216733292, 0.1052146405, -0.0715592653, -0.1477669775, 0.1101584285, 0.3326568305, 0.1448520124, 0.0032652803, 0.3560245633, 0.1236727089, -0.2299614549, 0.0653378218, -0.2153463066, -0.2262952179, 0.0706189871, 0.0178393982, 0.1855374277, 0.0778600127, 0.1521413773, -0.0188030638, -0.111550726, -0.2047353536, 0.0256377012, -0.1125883535, -0.1400842667, 0.4181121886, -0.028293021, -0.3238860965, -0.0096673928, 0.4117645323, 0.1827358454, -0.0839909911, 0.1819816679, 0.0767891183, -0.0807370469, -0.2318515331, -0.1778570712, 0.3159959316, -0.1251134872, 0.2839005291, 0.0257559232, -0.2696926594, 0.1455887407, 0.0623941012, 0.0602632016, 0.2800892293, -0.1935099661, -0.4024764895, -0.434191227, 0.3008685112, -0.0027207739, 0.0832220167, -0.0944677144, 0.2488300651, 
-0.172197938, 0.3336556256, -0.2000491619, 0.091954574, -0.1073138416, 0.2210526913, -0.2320585102, 0.0096008368, -0.1644632518, -0.0914378017, 0.0653491169, 0.0121605974, -0.2207488716, -0.0092056021, -0.0578297302, 0.1484887302, 0.1470882744, -0.1971916258, 0.017915789, -0.0090584084, -0.1415073425, -0.2273658216, 0.1838725209, 0.3537550867, -0.2231852561, 0.1348133236, 0.3172185123, 0.1472459137, 0.0894244462, 0.2858356833, -0.1238539815, 0.0725883543, 0.0483570434, 0.4121779799, 0.1053223014, -0.1657753885, 0.0245059095, 0.1498435438, -0.0505778566, -0.0450950898, 0.3554060459, -0.2137866616, 0.0687398463, -0.0204051957, 0.373939395, 0.385463357, -0.1282384992, 0.013078019, 0.2367487699, 0.0610653237, 0.0084674358, -0.1731076241, 0.4683960378, 0.0407815613, 0.0509880632, 0.3392883539, 0.3380612135, -0.1434531063, 0.165693447, 0.1148950756, -0.0518439189, -0.0311293304, 0.1816533059, 0.1359545738, 0.0247088671, 0.3240543306, 0.1971983016, -0.2997606397, -0.1530558467, 0.2907683551, 0.0458550379, 0.5280524492, 0.0002223169, 0.1636800468, -0.1134029478, -0.6981276274, 0.02081386, 0.1347814798, -0.4347140789, 0.2099058032, -0.0852791518, 0.2224347889, -0.2267049551, -0.4809595942, -0.1522948444, 0.4796299636, -0.2569243908, -0.312117368, -0.2767160535, -0.1225165427, -0.1777711809, 0.0527485684, -0.0898242742, 0.1594699025, 0.7063724995, 0.0123733534, -0.0512496829, -0.2058636248, -0.2766355872, -0.0424799174, 0.3352662027, -0.0077606514, 0.4334902465, -0.0677994713, 0.1059245169, -0.0294921733, 0.1589468718, 0.4662957191, 0.7281749845, -0.4269512594, -0.1200887859, 0.1187562048, -0.064589262, -0.070649758, 0.384275943, 0.0765926987, 0.2253123671, 0.3098931611, -0.0882093906, 0.0227854066, -0.2762960792, 0.3280374408, 0.3067686856, 0.0891913399, 0.4277428091, 0.0766935945, -0.0542829037, 0.2583427131, 0.0579860695, -0.3257125616, -0.2193247676, 0.5333288908, -0.4387635887, 0.2860152125, 0.2186449617, 0.0412961468, 0.0746774077, 0.3803934753, 0.2904959917, 
0.0493486263, -0.2989240289, -0.2100402713, -0.4146660566, 0.0077229813, -0.2228163183, 0.2766835988, 0.0260398462, 0.0152188018, -0.0235642828, 0.017725246, -0.0496932566, -0.2223454863, 0.3785585761, 0.3334754407, -0.5266216993, 0.1982719153, -0.005443424, -0.0264132023, -0.0235637873, -0.338791132, 0.3580263257, 0.083352983, -0.1214981675, 0.143286556, 0.1241791397, 0.0873632059, 0.0999922901, 0.0031535216, 0.2782767117, 0.5841789842, -0.0505017191, 0.0479514599, -0.2000543624, 0.0347691439, -0.1239148751, -0.0102263354, -0.0064630834, 0.3861199915, -0.1547225118, -0.0461017527, 0.0063525923, 0.1711960137, 0.0356913358, 0.1453520805, -0.4596276879, 0.3033058345, 0.1619607359, 0.0160801299, -0.1337858588, 0.4253679812, 0.1762628257, 0.3895447552, -0.3170859814, -0.2272405028, 0.3292976022, -0.0643050447, -0.1734614074, -0.6152847409, 0.2426213026, -0.1712377369, 0.0917394832, -0.504958868, -0.3762702048, 0.1952891201, -0.1857611686, -0.5628415346, 0.2360655218, -0.0989014432, -0.1997236013, -0.0967539623, -0.1907402873, 0.0916334689, -0.102065362, 0.2155117542, -0.0274183825 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
same problem. `encoded_dataset = core_data.map(lambda examples: tokenizer(examples["query"], examples["document"], padding=True, truncation='longest_first', return_tensors="pt", max_length=384), num_proc=16, keep_in_memory=True)` it outputs: ``` Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. 
Done writing 1787499 indices in 25568385696 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Spawning 16 processes ```
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but, when selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
301
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but, when selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` same problem. `encoded_dataset = core_data.map(lambda examples: tokenizer(examples["query"], examples["document"], padding=True, truncation='longest_first', return_tensors="pt", max_length=384), num_proc=16, keep_in_memory=True)` it outputs: ``` Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . 
Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787500 indices in 25568400000 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Done writing 1787499 indices in 25568385696 bytes . Set __getitem__(key) output type to python objects for ['document', 'is_random', 'query'] columns (when key is int or slice) and don't output other (un-formatted) columns. Spawning 16 processes ```
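When a `num_proc > 1` map hangs like the runs quoted above, a common first fallback is to drop multiprocessing and process the rows in batches within a single process; for fast tokenizers most of the throughput comes from batching, not from extra worker processes. A minimal, hedged sketch of that fallback in plain Python (`batched_map` and `upper_batch` are illustrative helpers, not part of the `datasets` API):

```python
# Single-process, batched fallback: apply `fn` to slices of `batch_size`
# rows at a time and flatten the results into one list. This mirrors the
# effect of a batched map without spawning any worker processes.
def batched_map(rows, fn, batch_size=1000):
    out = []
    for start in range(0, len(rows), batch_size):
        out.extend(fn(rows[start:start + batch_size]))
    return out


def upper_batch(batch):
    # Stand-in for a tokenizer call that accepts a list of strings.
    return [s.upper() for s in batch]


print(batched_map(["query one", "query two", "query three"], upper_batch, batch_size=2))
```

The trade-off is a slower wall-clock time than a working multiprocess map, but deterministic behavior and no shard files to merge.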
[ -0.3812629282, -0.1599824876, -0.0257185381, 0.2450914234, 0.1385165453, -0.124413535, 0.2277839482, 0.355183512, 0.0971070528, 0.1650544107, 0.0904944912, 0.4780940115, -0.4379878044, 0.3225881457, -0.3071635962, 0.0743547603, 0.0885861069, -0.0340096429, -0.2489497364, 0.2080956697, -0.2408304513, 0.2539782822, -0.3136201501, 0.1324146092, -0.5448880196, 0.0372942314, -0.0100328028, 0.2946363688, 0.0041616373, -0.5604772568, 0.3350958824, 0.1862284392, 0.0239844844, 0.4058553576, -0.0001238069, 0.1744508445, 0.3322082758, -0.1320037097, -0.0576931499, -0.3747684956, -0.1845240742, -0.1774747074, 0.2513702512, 0.0584546477, 0.1948400736, 0.0846877247, -0.0990758985, -0.2675629854, 0.2351568639, 0.4671221077, 0.0967173278, 0.2684923708, 0.2412028611, 0.1626385748, -0.0700324401, 0.1293650866, -0.0586700849, -0.0073626377, 0.2534246445, -0.2842436135, -0.0843153298, 0.2644785643, -0.2236346155, -0.2193568349, -0.1762109399, -0.1897660345, 0.5956932902, -0.6405863762, 0.2028380185, 0.031608317, -0.2833523452, -0.0602126867, -0.3970499635, -0.1658372581, -0.2814686298, -0.6259972453, 0.2206950784, 0.0547294691, -0.1740382016, 0.1226081401, -0.4526483119, -0.2315922976, -0.0487773269, -0.0581441112, -0.1401830763, 0.5158823729, 0.2073032558, 0.3076471686, 0.1674177051, 0.1691038758, 0.140595153, -0.0439474136, -0.0630423054, 0.1547047943, -0.395968318, 0.0444798023, 0.057254903, -0.1537994295, -0.2437874824, -0.0653125197, -0.0806686431, 0.1007319093, -0.1338204741, 0.0974462852, 0.381777823, -0.0635501295, 0.2099065483, 0.3211017549, 0.2443679273, -0.198280856, -0.0408754982, 0.0844120234, 0.3090510964, -0.0766795352, -0.0829107165, 0.322401613, 0.0780440867, -0.150715664, -0.2008779347, 0.1684620678, -0.1229617819, -0.1415814161, 0.0955838859, 0.1618544459, 0.0173612945, 0.7640120387, 0.1510403752, 0.1550602913, -0.3262266219, -0.1650592685, -0.0813882202, -0.0162957422, -0.3839084506, 0.1201189607, 0.2784776092, -0.0458315834, -0.0025162511, -0.0125778951, 
-0.3755195141, -0.2901799083, 0.1588219553, -0.269151628, 0.1290227771, 0.6434385777, -0.0847710595, 0.0378687233, -0.0311035141, -0.2311685234, 0.0893547237, 0.270310849, -0.5454252362, -0.1998849511, -0.1913017482, 0.0830652937, -0.0090252012, 0.2907004654, -0.1352848113, 0.2626661658, 0.4345033169, -0.3390247226, -0.3026489317, -0.3347219229, -0.4246217012, -0.3029886484, -0.0268524475, 0.2384120226, -0.388689518, -0.0208522528, 0.0716140717, -0.1603696197, 0.3210648596, 0.2287081331, -0.0067514591, 0.0228167772, -0.1003551483, 0.1779854894, 0.1289136559, -0.2334319651, -0.1171933338, 0.2298336923, 0.0215957016, 0.3336876631, -0.0718952119, -0.2325260192, 0.3409837484, -0.1377682537, 0.2706865966, -0.0474335924, -0.1684701294, 0.0419763774, -0.4458485544, -0.047539901, -0.0104413591, -0.0907257646, 0.3886667192, 0.1698449552, -0.1258530766, -0.4694308937, 0.3113692999, -0.1126031429, 0.2983156145, 0.1176052913, 0.0700896978, -0.0038852151, 0.1334570795, -0.2639537454, -0.4916706681, 0.197585538, -0.211062938, 0.059897054, -0.0548029989, 0.0016003773, 0.1356360316, 0.1703980863, -0.2089830786, -0.2314668, 0.0527523719, -0.0362777077, -0.2970741093, -0.0225937106, -0.1978597939, 0.606513083, 0.0651131943, 0.1337963939, -0.1734605283, 0.2274014354, -0.1264915317, -0.4327240288, -0.3220405579, 0.2733948231, -0.0145869087, -0.0576650351, -0.2054568827, 0.412061125, 0.492293179, -0.0928070471, -0.1072132438, 0.2081221342, 0.2767287195, -0.2052213848, -0.198612228, 0.0897645354, 0.0358791761, -0.175560981, 0.243249163, 0.4486860335, 0.2031271607, 0.3599848449, 0.2320652008, 0.2206025124, 0.2997630835, 0.0055637509, -0.0274569504, -0.1561451554, 0.1594946533, -0.143987298, 0.138449505, 0.0725642517, -0.1243332699, -0.2617251873, 0.1637732834, 0.0659065545, -0.1640601158, 0.0554417633, 0.0439122617, -0.1209330112, 0.1442406625, 0.0353032202, 0.3846460283, -0.0169441197, -0.1841905713, 0.1875163615, -0.2923648655, -0.0041846819, 0.1214141622, -0.060346324, 0.4126349688, 
0.3325658739, 0.0933479145, 0.012138512, -0.2061804533, -0.3470262289, 0.1262338459, 0.3581795394, -0.4770468473, 0.2011185586, -0.2688544989, 0.3921818435, 0.1077219173, -0.0824889392, -0.2518575788, -0.5163869858, -0.1252588332, 0.5447285175, -0.089441821, 0.1139661223, -0.0750089139, 0.0922439992, -0.1993820965, 0.3349090219, -0.1236949041, -0.3494331241, -0.185538426, -0.1118903086, 0.3578803539, -0.3373329639, 0.2036116868, 0.1893170476, -0.2535941005, 0.1800450832, -0.3561491966, 0.0978302509, 0.0027096802, 0.0007185396, 0.0373477191, -0.038086623, 0.0021096841, -0.2566198707, 0.1516730189, -0.0926753506, -0.234215945, 0.3143596351, -0.1155798733, 0.0484062657, -0.2818495631, -0.3552569151, -0.4357711375, -0.1446247101, -0.1148806959, -0.2469652593, 0.3538689613, 0.2138128877, 0.027924668, 0.0080097131, -0.0402047336, 0.0558433235, -0.0504350103, -0.0469998494, -0.1569674462, 0.0019503683, -0.1449823081, -0.0489736386, 0.0109179616, 0.2259827107, 0.4416415393, -0.3533616364, 0.1268648803, 0.0998178571, -0.110559091, 0.2188432813, -0.2389687002, 0.3640262187, 0.3983313739, 0.0564016923, -0.0557651334, -0.1833662987, 0.1532928199, 0.0631325841, 0.036705792, 0.1003944427, 0.3216460049, 0.1986772716, 0.6136779785, 0.4047030807, -0.0415372029, 0.4049916267, -0.0742647052, 0.064705044, -0.1796073765, -0.3545347452, -0.1869883537, -0.3202336729, -0.0462963283, -0.1521203518, -0.1201904714, -0.4881998301, -0.2875450552, 0.59731704, -0.398958236, -0.22201702, -0.0257760286, -0.4343621731, 0.2957082689, -0.0880858228, 0.0619181767, -0.0284436904, -0.0533101521, -0.2206205726, 0.1752650887, 0.1265857369, -0.3411028385, -0.2711304724, -0.3298737407, -0.4660466313, 0.2708948255, 0.2392299771, 0.6756355166, -0.0031588823, -0.0472989045, 0.0279798824, -0.1610238254, 0.9008379579, -0.5975758433, -0.3845258951, 0.2998616993, -0.4416367114, -0.280539453, -0.1464083791, -0.1916859299, 0.5185415149, 0.417861104, 0.4736287594, -0.1258112639, -0.2736027539, 0.0931888074, 
-0.1973989457, -0.1291915625, 0.0807638019, -0.2861380577, -0.0162170008, -0.1824306399, 0.0195708387, -0.220948711, 0.1472903341, -0.196747601, -0.1496363878, -0.1174031347, -0.0945646018, 0.2924088836, 0.0564406887, 0.0141237937, -0.1687942147, 0.0732966512, 0.0023314422, 0.1798490286, 0.2803764641, 0.2534049749, -0.4210766852, -0.1872627586, 0.1640625596, 0.0486612283, 0.2841586173, 0.2582041025, 0.2260948867, -0.0701888353, 0.0519925468, 0.1801945269, -0.3139066994, 0.0786773562, 0.3798127174, 0.0034896582, -0.5512812734, -0.1097423434, -0.0697431713, 0.4393830597, -0.1443437487, 0.5627088547, -0.4565911293, -0.2755085826, 0.3744552433, 0.102812916, 0.7571626902, -0.3558432758, -0.0179323405, 0.1034671813, 0.0332228988, 0.1023247615, -0.0298274904, 0.0912707299, -0.2797409296, -0.0282336362, -0.0703882203, 0.0556830019, 0.2865791917, 0.1877317429, -0.1459402144, 0.3820241094, 0.0152319223, -0.0896092951, -0.0244375728, 0.2167614102, 0.4826402664, -0.0553400517, 0.1282262951, -0.0104418993, -0.0659208298, -0.1713444591, 0.0468596481, 0.023915939, 0.2783242464, -0.1352673471, -0.3766930103, -0.0313111171, -0.175974384, 0.1755466312, 0.1576109231, 0.1429028511, 0.00674133, 0.2085311115, -0.0955411047, 0.044745069, -0.1262193173, -0.1333512664, 0.0964359492, 0.3388732076, 0.2169423252, 0.0244920738, 0.3706183732, 0.0890446007, -0.2019162178, 0.0859586746, -0.2295129448, -0.2251862288, 0.117514506, 0.0538832396, 0.0550669543, 0.1243347228, 0.1317675412, -0.0027601421, -0.1668456793, -0.1685528457, 0.0406235345, -0.0621809624, -0.1579102129, 0.4621092379, -0.0469860956, -0.3780898154, 0.0285824351, 0.4684545398, 0.2376080602, -0.0665787682, 0.1323937923, 0.0757082999, -0.0553673655, -0.2345048785, -0.1947103441, 0.2041234225, -0.1385969073, 0.3187772036, 0.0618275665, -0.2533581555, 0.160747543, 0.1019411013, 0.0467942096, 0.3771519959, -0.2085136324, -0.3417022824, -0.4462184906, 0.2933820486, 0.0455736965, 0.1013232321, -0.1182618663, 0.1660128832, -0.1374007463, 
0.311611563, -0.2197200209, 0.1450559497, -0.0788098276, 0.2332206368, -0.1382721961, 0.0145995691, -0.0921098739, -0.1046824902, 0.0761474743, 0.0331906006, -0.2527059317, -0.034475252, 0.0060305223, 0.1244489774, 0.1173036844, -0.2294043452, -0.042126663, -0.0710527822, -0.1012941599, -0.2299922705, 0.183063373, 0.3245706856, -0.21184282, 0.1726920754, 0.215828985, 0.2231037021, 0.0967913195, 0.3091025352, -0.2309209108, 0.0686678514, 0.0627948791, 0.4110857248, 0.1045889407, -0.1674642861, -0.0104248524, 0.0753285885, -0.0851460025, -0.0226067696, 0.3046310842, -0.2346469909, 0.0113802999, 0.0177812092, 0.3636149764, 0.3809759915, -0.1609293222, 0.0975751877, 0.1289528906, 0.0942887962, 0.0734338164, -0.1789153814, 0.5030883551, 0.071181044, 0.0788214505, 0.2968463898, 0.2957261205, -0.1840436012, 0.1694110483, 0.0986318588, -0.0717844442, -0.0473621637, 0.2588549554, 0.1763658524, 0.0519252717, 0.2686648071, 0.1383132637, -0.3351878226, -0.1445139199, 0.2858963609, 0.0185992569, 0.5381389856, 0.0261384621, 0.1412222534, -0.0872339085, -0.5816609263, 0.0283281449, 0.0647228807, -0.4965640306, 0.2716894746, -0.0133593418, 0.2167379558, -0.2738088369, -0.4163638651, -0.1943261027, 0.4032900333, -0.2760808766, -0.2871991992, -0.2953165472, -0.1381361037, -0.0973220319, 0.078092292, -0.091977343, 0.1750066131, 0.7415684462, 0.0666247159, -0.1233506873, -0.0933169648, -0.3392291963, 0.0101926867, 0.3148337007, -0.0212051123, 0.4458224177, 0.0185701214, 0.0203919355, -0.0606146306, 0.0977775306, 0.4438871443, 0.6963078976, -0.4553192258, -0.0343558043, 0.1794540584, -0.0795086101, -0.0911583304, 0.4181698263, 0.1046567485, 0.3097866178, 0.2685208321, -0.0613813475, 0.0192768052, -0.2396441549, 0.3282627761, 0.3821766973, 0.0546373911, 0.3476730287, 0.1113522053, -0.0831608027, 0.229930073, 0.0156460684, -0.326849103, -0.2684850693, 0.5566340089, -0.4823425114, 0.2635056376, 0.2357004285, 0.0446099229, 0.0398490578, 0.4240773916, 0.2635848522, 0.0626650155, 
-0.309312582, -0.1838684529, -0.3888840079, 0.0516262874, -0.1601488441, 0.2233796716, -0.005142618, 0.0271893442, -0.0624478608, 0.0350039341, -0.1013172567, -0.2399440259, 0.3998097777, 0.3118210435, -0.5812191367, 0.1934362501, 0.0182105359, -0.0433615521, 0.0252559334, -0.3325001895, 0.3341384232, 0.0686638653, -0.1056523025, 0.0980463922, 0.1510054171, 0.0682480186, 0.171723038, 0.0574419871, 0.2719835639, 0.6113704443, -0.0759607404, -0.0164804943, -0.2286667377, 0.1080480665, -0.1091561615, -0.0437676273, 0.0593832768, 0.4581900239, -0.1972042322, -0.0211739521, -0.0006724037, 0.1767244041, 0.0470644906, 0.1126625016, -0.4025764167, 0.2158270925, 0.1916213632, 0.0649235845, -0.1029515266, 0.4361128807, 0.1671412885, 0.3154137135, -0.2531250715, -0.2148845196, 0.327501297, -0.083303459, -0.0832166001, -0.5843408108, 0.2102664858, -0.1626891196, 0.1443041563, -0.4967543483, -0.3728494644, 0.1991520524, -0.2268793881, -0.6331869364, 0.2344035655, -0.0566811413, -0.1545067877, -0.0621901527, -0.2077106684, 0.1369816512, -0.1371983439, 0.1999457479, -0.1195718497 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Thanks for reporting. Which tokenizers are you using? What platform are you on? Can you tell me which version of datasets and pyarrow you're using? @timothyjlaurent @richarddwang @HuangLianzhe Also, if you're able to reproduce the issue on Google Colab that would be very helpful. I tried to run your code @richarddwang with the bert tokenizer and I wasn't able to reproduce the issue.
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
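The empty-row symptom reported above (`rel_ds_dict['train'][0] == {}`) can be illustrated with a pure-Python sketch. This is a toy model of shard merging, not the `datasets` internals: it only shows how merging per-worker shards whose mapper outputs disagree on columns can surface as rows with missing keys.

```python
# Toy sketch (assumption: illustrative only, NOT how `datasets` merges shards).
# Two "workers" each map one shard; their output schemas disagree, and the
# merge step keeps only the columns common to every shard.

def map_shard(rows, fn):
    # Apply fn to each row of one shard.
    return [fn(r) for r in rows]

def merge_shards(shards):
    # Keep only the columns present in *every* row of *every* shard --
    # a schema mismatch between workers then silently drops columns.
    common = set(shards[0][0])
    for shard in shards:
        for row in shard:
            common &= set(row)
    return [{k: row[k] for k in common} for shard in shards for row in shard]

rows = [{"field": "a"}, {"field": "b"}]
shards = [rows[:1], rows[1:]]  # pretend num_proc=2

out = [
    # Worker 1 adds an "extra" column...
    map_shard(shards[0], lambda r: {**r, "extra": r["field"].upper()}),
    # ...worker 2 (buggy) does not.
    map_shard(shards[1], lambda r: dict(r)),
]
merged = merge_shards(out)
print(merged[0])  # only the columns shared by all shards survive
```

Under this toy model, the column added by only one worker disappears from the merged result, which is the same shape of failure as columns vanishing after a `num_proc` map.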
64
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Thanks for reporting. Which tokenizers are you using? What platform are you on? Can you tell me which version of datasets and pyarrow you're using? @timothyjlaurent @richarddwang @HuangLianzhe Also, if you're able to reproduce the issue on Google Colab that would be very helpful. I tried to run your code @richarddwang with the bert tokenizer and I wasn't able to reproduce the issue.
[ -0.4514070451, -0.0183558762, -0.0187174454, 0.2539694309, 0.1591785997, -0.1641506106, 0.2910158038, 0.3014787734, -0.0354784317, 0.1461101472, 0.0482643768, 0.4889176488, -0.4316677153, 0.3358331323, -0.3009807169, 0.0277107414, 0.1179320887, -0.0190120228, -0.2634868622, 0.2409512997, -0.2404620647, 0.2257554531, -0.4016406536, 0.1766658425, -0.6489456892, 0.0602546185, -0.0938618183, 0.2311898768, -0.0793511868, -0.5861321092, 0.3002741635, 0.2776139081, 0.0705376565, 0.4927060008, -0.0001242772, 0.0863855779, 0.3700332046, -0.0688939914, -0.07559634, -0.4253844023, -0.1398857087, -0.1962036043, 0.3268500566, 0.1186021417, 0.2297609001, -0.0144218039, -0.110530287, -0.1358259618, 0.2951320708, 0.4044713378, 0.0998646617, 0.210828498, 0.2679282725, 0.1761284471, 0.0266665686, 0.1134237647, -0.0295609199, 0.0634083897, 0.3047252595, -0.3775463104, -0.0981571674, 0.1819480062, -0.25123927, -0.2436911911, -0.1529684663, -0.1884294599, 0.5462461114, -0.6417780519, 0.1981925964, 0.0507448204, -0.126994282, -0.0952431709, -0.4095526934, -0.1634817868, -0.2056479901, -0.5241122842, 0.2612170577, 0.0887136906, -0.1917433441, 0.1744312644, -0.5121266246, -0.1702479869, -0.0599606633, -0.0430706814, -0.1608613133, 0.6147717834, 0.197079137, 0.2824649215, 0.1318351328, 0.1094723195, 0.3357939124, -0.0813706666, -0.1276592016, 0.196050033, -0.4639242589, 0.0747698843, 0.0073758662, -0.0823294222, -0.2194858789, 0.020000957, -0.241867587, 0.1100883782, -0.0998963639, 0.0787979364, 0.3843362927, -0.007607054, 0.2199826241, 0.340882659, 0.2288217843, -0.2596558332, 0.0027707666, 0.0371754728, 0.2253894806, -0.1269135475, -0.0501924381, 0.3050414026, 0.1194526926, -0.2124558389, -0.2372737527, 0.1194865182, -0.2387343943, -0.1723510623, 0.094883725, 0.1854164302, 0.0307488814, 0.7054214478, 0.083328329, 0.1961910576, -0.2631703913, -0.1573425978, -0.0654681921, -0.0452280752, -0.4083971679, 0.1598671079, 0.2452349365, 0.0504905283, -0.0003736168, 0.0509882383, -0.3588193953, 
-0.2416549176, 0.1078232378, -0.2778945863, 0.0396640971, 0.653049469, -0.0253034793, 0.0413961336, -0.0589494519, -0.2250759304, 0.0949929655, 0.2815290093, -0.511983335, -0.1841510236, -0.261228025, 0.0831530616, -0.0231680907, 0.2442412674, -0.294486016, 0.2875610888, 0.3829674423, -0.2585481405, -0.2988173366, -0.3133198023, -0.3318474889, -0.3129890263, -0.0024769604, 0.3077674508, -0.4657443762, -0.0098975673, 0.0874544084, -0.1941268295, 0.3770691156, 0.1856147349, -0.0613608882, 0.0669876188, -0.126724571, 0.2158232778, 0.1401776075, -0.127975449, -0.1124336794, 0.2217225432, 0.0198812932, 0.2737292349, -0.0765959769, -0.2837845683, 0.3305279016, -0.1273437589, 0.2405419052, -0.0251978002, -0.1801247448, 0.0559502393, -0.4307547212, -0.0424652211, -0.0346670747, -0.0584328659, 0.3701486886, 0.1658082604, -0.1280090362, -0.5494299531, 0.3390696943, -0.0956565216, 0.3101181388, 0.0468286015, 0.2010082155, 0.0631356239, 0.1250196397, -0.3416934609, -0.4653606713, 0.2084015608, -0.22590895, 0.0564941689, -0.1141107082, -0.018633604, 0.0731934011, 0.2188658416, -0.2546341419, -0.1963988543, 0.0548204482, -0.0569620617, -0.2888917029, -0.0090967342, -0.1978034079, 0.6334369183, 0.1330013871, 0.1863329262, 0.0088011436, 0.3338904679, -0.1678681821, -0.4845402837, -0.2472650409, 0.1851918846, 0.0388388038, -0.0657136217, -0.2361522019, 0.4898025095, 0.4748795033, -0.118436344, -0.0471903309, 0.1707308739, 0.2384603173, -0.2020017505, -0.1777738184, 0.0177080147, 0.0314918831, -0.147821188, 0.3008817434, 0.4878903329, 0.1369514316, 0.3793053031, 0.1605119109, 0.2333791256, 0.2646210492, -0.0008947477, 0.0150104612, -0.1043363363, 0.1605807692, -0.1033007652, 0.1602765322, 0.07341896, -0.0789121091, -0.2140072584, 0.1174611449, -0.007737495, -0.2083767653, 0.0450218543, -0.0087572411, -0.116595, 0.1818486452, 0.0216806084, 0.4007754326, 0.0097522363, -0.2446657568, 0.1888061166, -0.2269088626, -0.0186139271, 0.1428357065, -0.0703981891, 0.450250268, 0.2981048822, 
0.1378419399, 0.0256141499, -0.2035499215, -0.3304831684, 0.1311553717, 0.4114908874, -0.5120652318, 0.2068114877, -0.303552717, 0.2663401365, 0.0753613561, -0.1748272777, -0.3257457614, -0.5263910294, -0.1301474571, 0.5010661483, -0.0153543577, 0.1526308954, -0.0430889875, 0.0754176229, -0.190926522, 0.3626385331, -0.1729196757, -0.3490326107, -0.2568057179, -0.0965510607, 0.4135926068, -0.2263154238, 0.2242784947, 0.138564229, -0.2426475435, 0.1376250684, -0.326459825, 0.1218762696, -0.045686923, 0.099334456, 0.0476027876, -0.0871100947, -0.0611737818, -0.2198954225, 0.1496851742, -0.1308505088, -0.2919812799, 0.2979001701, -0.1592381746, 0.0333516225, -0.2079058588, -0.292170465, -0.4472480118, -0.0652053803, -0.1241436899, -0.2491880208, 0.3339014947, 0.1426375061, 0.0502238013, -0.0268351138, -0.1156214327, -0.0214422159, -0.0339341313, -0.0353107229, -0.1688209474, 0.0389617719, -0.1542486548, -0.0213813111, 0.0638004243, 0.1187989712, 0.4105750322, -0.3101979494, 0.0174236335, 0.0785767063, -0.0216662735, 0.2032389343, -0.210243389, 0.3423300982, 0.4269163609, 0.0653369427, -0.0130828694, -0.1810947955, 0.0445571616, -0.0369704291, 0.0857582539, 0.0567322597, 0.3177715242, 0.1290830523, 0.6365389228, 0.3438289165, 0.024584271, 0.398835361, -0.1147427261, 0.0726947337, -0.1767133623, -0.4179878235, -0.1399570405, -0.3385908008, -0.003416121, -0.110215053, -0.124660261, -0.4634775817, -0.2985057235, 0.5199306607, -0.4142763615, -0.2368217111, -0.0435288399, -0.4586132765, 0.2934297323, -0.073625721, 0.0203421712, -0.0263481103, -0.0830056965, -0.2813160717, 0.2080481797, 0.0749505907, -0.3284226656, -0.2657977641, -0.2576183975, -0.4092136621, 0.3226160109, 0.2142197937, 0.6462939978, 0.0636077225, 0.0171448365, 0.0897397846, -0.1505381167, 0.9170176983, -0.5295390487, -0.4050594866, 0.2950898409, -0.3897665441, -0.2402523458, -0.2102427334, -0.2094305456, 0.5488239527, 0.4073244929, 0.42700091, -0.1365407407, -0.2981922626, 0.1650773287, -0.1730675697, 
-0.0973170027, -0.0574684031, -0.3223575354, -0.0002321377, -0.1945852637, 0.1016060114, -0.2478363812, 0.1796010733, -0.2818060517, -0.1345824897, -0.097798124, -0.1065051705, 0.2231224924, 0.120508045, -0.029196823, -0.2103305012, 0.0621238798, 0.0158562977, 0.1463371515, 0.4021160305, 0.2343918383, -0.3336503804, -0.2025295645, 0.1285567284, 0.0272816531, 0.3452243507, 0.2670353651, 0.1780946106, -0.0470146388, -0.0255394895, 0.1622676551, -0.1858314574, 0.0603774823, 0.3804246187, -0.0274199937, -0.5772963762, -0.0675487593, -0.0502465367, 0.4389663637, -0.1459270865, 0.4760753214, -0.3540567756, -0.2528214455, 0.3886634111, 0.0871431008, 0.873688817, -0.3867263794, -0.0512379892, 0.1110443473, 0.0254614353, 0.134284988, 0.0106234998, 0.0341231711, -0.3378762603, -0.00249907, -0.0574132018, 0.0252358019, 0.2827624977, 0.2086384743, -0.1994743645, 0.3970702589, -0.0344182849, -0.0258599203, 0.0347809717, 0.1602527946, 0.4071724713, -0.0319601037, 0.1021644101, -0.0038297661, -0.0508484729, -0.1375652403, -0.0000860319, 0.0362856761, 0.2331689894, -0.1255358905, -0.4613439143, -0.07501968, -0.2013408542, 0.1040432453, 0.204727903, 0.0609577447, 0.029219836, 0.0799928755, -0.0750484243, 0.0657526106, -0.1150756627, -0.1086518541, 0.1019139588, 0.2996260226, 0.156552285, -0.0162425786, 0.3197112679, 0.089908801, -0.2710873783, 0.0669754446, -0.1930178702, -0.1884996444, 0.0744526535, 0.0479698852, 0.0993243232, 0.0851055533, 0.1921527386, -0.0828154385, -0.0831782073, -0.2331731319, 0.0493477993, -0.0488529876, -0.1012546271, 0.4635451734, -0.0041763596, -0.3239700496, 0.0002366016, 0.4024422169, 0.2230578065, -0.077550061, 0.1868162304, 0.1216437668, -0.0351368114, -0.2248840779, -0.1783464849, 0.2387912273, -0.1943720877, 0.2779047191, 0.0762500763, -0.1605110466, 0.2321417481, 0.1646019965, 0.0615421422, 0.3667696714, -0.172534138, -0.3573762178, -0.4265033603, 0.2562675178, 0.0572031438, 0.1014110297, -0.1279230565, 0.2864010036, -0.0851161852, 0.3098719716, 
-0.2082179785, 0.1187696829, -0.1360605955, 0.2225153893, -0.1965073645, -0.0179605, -0.1610796005, -0.1194787174, 0.0611644983, -0.0054543, -0.1573922485, -0.0320141539, 0.0094413534, 0.1344663203, 0.1584110111, -0.2003342211, -0.0502521135, -0.0367939211, -0.1069702059, -0.2200432718, 0.1807989329, 0.3339798748, -0.209378168, 0.1154191941, 0.3636813462, 0.1378784329, 0.0411138162, 0.3112680316, -0.2223468125, 0.0580957867, 0.079748705, 0.4565372467, 0.1410528868, -0.153162837, 0.0351948664, 0.069718048, -0.0300752148, -0.0467674844, 0.2754617631, -0.2803600729, 0.0170984045, -0.0801738054, 0.3852861524, 0.4729865491, -0.1101136804, 0.0536670759, 0.2419434488, 0.1005080491, 0.0127546191, -0.1003076583, 0.5062841177, 0.0754640251, 0.0284594893, 0.3011084795, 0.2736849189, -0.13196522, 0.1596792042, 0.1436665654, -0.0089023113, 0.0741019696, 0.2282645106, 0.123529315, 0.0906232595, 0.2836115956, 0.2560000718, -0.3117285371, -0.1668028384, 0.2614684403, -0.0699934661, 0.4988989532, -0.0726136938, 0.1265602261, -0.066648826, -0.6106047034, 0.0236302391, 0.1232683063, -0.4722033441, 0.1165863648, -0.0490898453, 0.2964209914, -0.2452361435, -0.4382403493, -0.1968892217, 0.4906771779, -0.2664857507, -0.2746958137, -0.2911098599, -0.1555421948, -0.1222532764, 0.0831958055, -0.1335174143, 0.1632056832, 0.7009334564, 0.0375612415, -0.1453173906, -0.2588684857, -0.3074166179, 0.0625748485, 0.2946560085, -0.021973297, 0.4159761071, -0.0591313876, 0.0211353153, -0.082620129, 0.1767219901, 0.5027847886, 0.7412921786, -0.4434116185, -0.0956698805, 0.1072114557, -0.0295978189, -0.0504567884, 0.3902238607, 0.1533710361, 0.2753219306, 0.2701472938, -0.0577950068, 0.023225572, -0.2116096318, 0.3104964197, 0.3349042535, 0.0382671468, 0.2798104882, 0.0830267817, -0.0984346941, 0.1646281183, 0.0704896748, -0.3213475049, -0.1564122587, 0.5048125386, -0.4208493829, 0.2050681263, 0.1837754101, 0.0468104519, 0.0648643076, 0.4589599967, 0.3885908425, 0.209923327, -0.3096044362, 
-0.2707787752, -0.4379997253, 0.0589853898, -0.2144681513, 0.2642842531, -0.0245213881, -0.0231493786, -0.0907868594, 0.1096472144, -0.1216092408, -0.091539979, 0.3047742844, 0.3852219582, -0.6141394377, 0.2424551547, 0.0491848513, -0.0252741091, -0.0407546535, -0.3058139086, 0.3760470748, -0.0075481012, -0.1159947291, 0.0928705782, 0.1337739378, 0.1216403544, 0.1457852721, 0.0757282376, 0.2860177457, 0.6471263766, 0.0260007158, 0.1075965688, -0.2877636552, 0.0207424723, -0.158635214, -0.026822459, 0.0917569399, 0.3597252965, -0.1527472734, -0.0074945902, 0.0157905407, 0.2090012878, 0.0230569094, 0.1894231439, -0.4248899519, 0.1550400555, 0.2044279575, 0.0485259779, -0.178172186, 0.3578694761, 0.1304600686, 0.3544120491, -0.3445575237, -0.2215114236, 0.3429217935, -0.1219318956, -0.1181417257, -0.6038334966, 0.1825371981, -0.1230143905, 0.0716554299, -0.4986055493, -0.3239955902, 0.2401542515, -0.2024821341, -0.5594590902, 0.249853909, -0.0530509874, -0.1302679479, -0.1022435054, -0.2015974522, 0.1135976017, -0.1165666953, 0.1701205522, -0.0705462098 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Hi, sorry, I forgot to check which version I had. But after updating datasets to master (editable install) and the latest pyarrow, it works now ~
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
26
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Hi, sorry, I forgot to check which version I had. But after updating datasets to master (editable install) and the latest pyarrow, it works now ~
[ -0.4262238145, -0.0280841291, -0.0307076629, 0.1988401562, 0.1331438422, -0.1375354975, 0.3073059022, 0.3333303034, -0.0064561814, 0.1290372163, 0.0286431983, 0.4368832111, -0.4107376933, 0.2967741489, -0.264008522, 0.02800069, 0.1197287887, -0.0393837132, -0.369564712, 0.2227409184, -0.2434658259, 0.2219715863, -0.3648296595, 0.1880379617, -0.5168521404, 0.0857325867, -0.078606613, 0.2741607726, -0.0307961497, -0.6130791306, 0.3389738202, 0.2480801046, 0.0578998402, 0.4376125038, -0.0001256642, 0.1473098695, 0.391125232, -0.0815783888, -0.0722280815, -0.3685733676, -0.1584379971, -0.1926399469, 0.3192972839, 0.0929169059, 0.2308609635, -0.0479285643, -0.0620318428, -0.1443226188, 0.217092514, 0.4277086556, 0.0917038471, 0.2471805066, 0.3233214021, 0.1382609308, 0.114336893, 0.1332388818, -0.0305652283, 0.0948625952, 0.3123919964, -0.3872241378, -0.1458236575, 0.209410578, -0.2584967613, -0.2159693837, -0.1401002407, -0.1977103949, 0.7033735514, -0.6043232083, 0.151326105, 0.0687064677, -0.1421925128, -0.1105686128, -0.4584228992, -0.1897013634, -0.2329023629, -0.4672611058, 0.2183403969, 0.0801674351, -0.184158802, 0.1617284864, -0.4882290363, -0.184417367, -0.0824081004, -0.0564406142, -0.142925024, 0.5605847836, 0.2558193207, 0.3145768344, 0.1811260283, 0.1358933598, 0.271571368, -0.0153847495, -0.1004990935, 0.191243723, -0.4333982468, 0.0872442424, 0.0809165686, -0.0300572067, -0.2048362195, -0.0127456672, -0.1876096576, 0.0651660413, -0.0511016361, 0.0766423494, 0.445684731, -0.0959976763, 0.1797673404, 0.2698796391, 0.2719273567, -0.2075210661, 0.0286726505, 0.0807523131, 0.3131719828, -0.1496838629, -0.0081412867, 0.3094293773, 0.151496008, -0.1890539229, -0.1789470911, 0.1035738364, -0.156428799, -0.2262354642, 0.0571261756, 0.2013353556, -0.0015101694, 0.7359144092, 0.0622509569, 0.168650806, -0.2956739664, -0.0981431082, -0.0687070638, -0.0478757545, -0.4130522609, 0.0763137564, 0.2600795031, -0.0466357395, -0.012045417, 0.0436596386, -0.3395043015, 
-0.2060764432, 0.0464952514, -0.2673593163, 0.0518139303, 0.6857974529, -0.0926615521, 0.0158425048, -0.1071033999, -0.1668682396, 0.0837770849, 0.3180407882, -0.4997595847, -0.2156491876, -0.2343014926, 0.0743628144, -0.0113551728, 0.2890094519, -0.1569271088, 0.2223332673, 0.4243984818, -0.3984229267, -0.2808879912, -0.3528043628, -0.2661391795, -0.33011958, -0.0200215988, 0.2311417609, -0.4519302249, -0.0070783272, 0.0694645196, -0.2467419803, 0.3404718041, 0.2008073628, -0.05478305, 0.0209854022, -0.1204582378, 0.1243193746, 0.1210084334, -0.1624819487, -0.1700265706, 0.2329110801, 0.0909360573, 0.3055433035, -0.0249601118, -0.2829442024, 0.293525666, -0.1453177184, 0.1870567799, -0.0470502451, -0.1874556094, 0.0637620538, -0.414736867, -0.0254793614, -0.0138664655, -0.0490528122, 0.385358423, 0.1553204209, -0.0500653461, -0.533054769, 0.3955679536, -0.1283029318, 0.2941477299, 0.0562962554, 0.1294926554, 0.0282951426, 0.1651159972, -0.335367471, -0.5447028875, 0.2167761922, -0.3194434047, 0.0381354205, -0.118916139, -0.0113430694, 0.1209756881, 0.222920835, -0.2015700042, -0.1836011857, 0.0278963, -0.0479689576, -0.287488997, -0.0581979156, -0.2361510247, 0.6122261286, 0.1214626133, 0.1500259042, -0.0901402757, 0.2515524924, -0.1920495629, -0.4561063945, -0.228939563, 0.2139546275, -0.0094456151, -0.0951694846, -0.1919599026, 0.4771193266, 0.4192662239, -0.0924052447, -0.125446707, 0.097467497, 0.2632505894, -0.2146299481, -0.1903713644, 0.0501567945, 0.0394389331, -0.1361795515, 0.2064526975, 0.4271456599, 0.1136484668, 0.3887141943, 0.2086596638, 0.1980686933, 0.2781986892, 0.0070676431, -0.0115256831, -0.1324375123, 0.1837472022, -0.1079406664, 0.2337143421, 0.0872981399, -0.1605638266, -0.2234202325, 0.1846197248, -0.0384716429, -0.1715283543, 0.0879374593, 0.0761116594, -0.131168142, 0.168131873, 0.0018954054, 0.3951359391, 0.0263034441, -0.1902968585, 0.1835366338, -0.3049235344, -0.0213665031, 0.156835556, -0.082507506, 0.4652839005, 0.3312571049, 
0.1206730157, -0.0191457588, -0.2211037725, -0.2978807688, 0.1129527763, 0.38663432, -0.5138965249, 0.183179006, -0.2872391343, 0.3498785198, 0.092032969, -0.1949670613, -0.335986495, -0.5282184482, -0.1070326865, 0.5051033497, -0.0670931488, 0.1802400351, -0.0582268164, 0.0198348761, -0.1841712892, 0.3257269263, -0.1978475153, -0.3439944983, -0.2296531647, -0.1021255255, 0.3826712072, -0.3052901924, 0.1771932989, 0.1326709688, -0.2272642255, 0.1391626298, -0.3415319324, 0.0712248236, -0.0616251007, 0.1284911931, 0.1239016131, -0.041520521, 0.0358641446, -0.1854020655, 0.1367174387, -0.1725319475, -0.2678059638, 0.3011962175, -0.1624246687, 0.0868373737, -0.2424319685, -0.3241862953, -0.4599990845, -0.0813023821, -0.078573212, -0.2051279992, 0.3805842698, 0.1081921905, 0.0498001724, -0.0413516983, -0.0785899162, 0.0359863564, 0.0005277544, -0.0447287858, -0.1604642868, 0.05609411, -0.1445964575, -0.103899315, 0.0696533173, 0.0801024139, 0.4333180785, -0.3374866247, 0.0384629108, 0.0815890506, -0.0312824622, 0.2492198497, -0.172550872, 0.3564425707, 0.4469681978, 0.0673355088, -0.0272415653, -0.1913273036, 0.0769572631, 0.0046288781, 0.0519373566, 0.1071615741, 0.3089772463, 0.1419170797, 0.6148814559, 0.301805079, 0.0772750601, 0.4116167724, -0.10487739, 0.1169989854, -0.1796388626, -0.3834996521, -0.181512624, -0.332311064, 0.017943792, -0.1800165027, -0.1385162622, -0.5107504725, -0.2946904898, 0.5083618164, -0.3593626916, -0.2251889408, -0.0294324216, -0.4158391654, 0.2737883329, -0.0482280329, 0.0361039042, -0.0130671188, -0.0562718771, -0.2096551061, 0.1837158352, 0.1155626923, -0.295001924, -0.2535851896, -0.3127791286, -0.4444831908, 0.2949639559, 0.2194480151, 0.6344885826, 0.0734116063, -0.0596754253, 0.0646128729, -0.205896318, 0.9419649839, -0.5384348631, -0.3539144397, 0.276704222, -0.3353812099, -0.2960507572, -0.2124643624, -0.1766387969, 0.5135712028, 0.3529107273, 0.3994940519, -0.1902302802, -0.3092125356, 0.2457103729, -0.194255054, -0.1138670146, 
-0.0568229854, -0.326867938, -0.014941968, -0.1858352572, 0.0560889207, -0.3065468669, 0.1166411564, -0.2546604276, -0.0739463866, -0.1020662189, -0.0674834624, 0.2043438256, 0.0934563652, -0.0091579575, -0.2332932204, 0.0409652367, 0.0166019611, 0.1896849871, 0.3295453787, 0.2695167661, -0.3906277418, -0.152263552, 0.1227662116, 0.0285158567, 0.3404768705, 0.2600502372, 0.1857446283, -0.0897350758, 0.0456199795, 0.1567802578, -0.2219394743, -0.0335158333, 0.3762091994, -0.068357192, -0.5702347159, -0.121167779, -0.0074182302, 0.4382995963, -0.1868736297, 0.4402967095, -0.3790721893, -0.2562080622, 0.3504358828, 0.0944136977, 0.8294615746, -0.3685492873, 0.0075136479, 0.1412826627, -0.029199034, 0.0943378285, 0.063898921, 0.0500575677, -0.3382293582, 0.0104199387, -0.0711941943, 0.0385398418, 0.2984250188, 0.1840956211, -0.1552239954, 0.4556516409, 0.0017168671, -0.0308157895, 0.0527693927, 0.1311540157, 0.4662726223, -0.0224106908, 0.0719880164, -0.0450141095, -0.0487075374, -0.2092830837, 0.015917737, -0.0253666975, 0.2764601111, -0.1492098272, -0.3791148961, -0.0607024767, -0.1620898545, 0.1172117889, 0.2236530185, 0.0434420854, -0.0240611434, 0.0903698578, -0.1618458927, -0.0048664496, -0.0905559957, -0.117132172, 0.0959544107, 0.359336555, 0.1913873702, 0.0502057001, 0.3596628308, 0.0885728896, -0.3440556228, 0.0759110898, -0.2765966356, -0.2300834507, 0.0411391854, 0.075366199, 0.0709391236, 0.1175203174, 0.1291136295, -0.089992784, -0.1023082212, -0.202532202, 0.0411027595, -0.0845110118, -0.0814067423, 0.5126654506, -0.0769249946, -0.3476968408, -0.0207591429, 0.4428583086, 0.2637454867, 0.0066063404, 0.1803857684, 0.1175714433, -0.0454968587, -0.2055633366, -0.2149386108, 0.236913383, -0.2474359572, 0.2362888157, 0.0953160822, -0.2004563659, 0.2147379667, 0.0181344468, 0.0912798345, 0.3555852771, -0.208153218, -0.3337842822, -0.4716791213, 0.2893686593, 0.0941014141, 0.0819012672, -0.0503454618, 0.2005215287, -0.0628680363, 0.2897005379, -0.2136403769, 
0.1265349686, -0.0537288599, 0.2342293561, -0.1462178379, 0.0019318941, -0.1427260339, -0.1322263181, 0.0681192279, -0.0060306005, -0.2096678615, -0.0245787967, 0.0344262645, 0.1419172734, 0.0874609277, -0.2029490471, -0.0352997072, -0.0580191761, -0.0988913476, -0.1757138669, 0.2244659811, 0.3153638542, -0.1890925169, 0.1739247292, 0.2851655781, 0.1327940077, 0.068040207, 0.3327769041, -0.2285958529, 0.0301885344, 0.0431795344, 0.4585479796, 0.0703358576, -0.1614503711, 0.0679127127, 0.0809253603, -0.0829220116, -0.0151948193, 0.3467746079, -0.2715692818, -0.005046688, -0.0690951198, 0.4576991498, 0.440900296, -0.1034998149, 0.0886820108, 0.1568597257, 0.0879036263, -0.0480683818, -0.1489168704, 0.5387005806, 0.1000272706, 0.0745917782, 0.2959094644, 0.305413872, -0.1664313078, 0.200623095, 0.1431750357, -0.0360419378, 0.0572847836, 0.1860652268, 0.2724130154, 0.1090565324, 0.2569831908, 0.2660531104, -0.2749655843, -0.1071280167, 0.3567868769, -0.0522893071, 0.4890811741, -0.066310592, 0.1197937354, -0.0846302509, -0.6104991436, 0.0734563321, 0.1625177562, -0.4453827739, 0.1463363171, 0.0180166103, 0.3133054376, -0.1979465336, -0.4583255947, -0.1947420835, 0.4319695532, -0.2319601625, -0.3544823527, -0.2850641608, -0.1753605604, -0.0590660125, 0.0898283198, -0.1454728693, 0.1400508881, 0.799793601, 0.0538037196, -0.0667099059, -0.246662572, -0.3139769733, -0.0052762628, 0.3108202219, 0.0006737858, 0.4508588612, 0.0546376482, -0.0182312466, -0.0618826784, 0.1523041129, 0.5050354004, 0.6981105804, -0.4474114478, -0.048005335, 0.1089008451, -0.0523094498, -0.0862671733, 0.3647114336, 0.1321836859, 0.3019933105, 0.2963253558, -0.0656324178, 0.0374661796, -0.1313715726, 0.3382486105, 0.3452511728, 0.024048958, 0.3023179471, 0.079864502, -0.1416299194, 0.256580323, 0.0797427297, -0.3255625069, -0.1952848732, 0.5692189336, -0.4321991503, 0.2212005854, 0.1722593606, 0.0486827157, 0.1094024703, 0.5082315207, 0.3306358457, 0.1644895077, -0.2521733344, -0.1733621061, 
-0.4439539015, 0.03429842, -0.1946026534, 0.3066216111, 0.0515658297, 0.0016596094, -0.0660617948, 0.0451498814, 0.0099892393, -0.1932029128, 0.3324363828, 0.3729103208, -0.5852847695, 0.2185806334, 0.0498897545, -0.006330952, -0.0412900746, -0.3142339885, 0.3802559376, -0.0498879515, -0.1108642071, 0.0644588917, 0.1700887233, 0.1745329797, 0.2145899236, 0.0503832847, 0.2386427969, 0.5973002911, -0.0063544884, 0.0381643251, -0.2856546342, 0.0393659398, -0.1230391264, -0.0162282772, 0.0599052273, 0.3647588789, -0.1982252896, -0.0922027826, 0.0259148888, 0.2058888674, 0.0098137781, 0.2247243375, -0.3983494341, 0.1691612154, 0.2343991697, -0.0028142892, -0.200594306, 0.3361581862, 0.1607356519, 0.3032528758, -0.3313167691, -0.2211695313, 0.3441255689, -0.0998624712, -0.1244289204, -0.59441185, 0.1723876148, -0.171697855, 0.0901963264, -0.5051230788, -0.3045919836, 0.2259378135, -0.1941107661, -0.5847125649, 0.2045530975, -0.0807114318, -0.1077305451, -0.0857333317, -0.2042673528, 0.125451833, -0.1983264536, 0.1375614405, -0.1398177594 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Sorry, I just noticed this. I'm running this on macOS; the version of datasets I was using is 1.0.0, but I've also tried it on 1.0.2. `pyarrow==1.0.1`, Python 3.6. Consider this code: ```python loader_path = str(Path(__file__).parent / "prodigy_dataset_builder.py") ds = load_dataset( loader_path, name="prodigy-ds", data_files=list(file_paths), cache_dir=cache_dir )["train"] valid_relations = set(vocabulary.relation_types.keys()) ds = ds.filter(filter_good_rows, fn_kwargs=dict(valid_rel_labels=valid_relations)) ds = ds.map(map_bpe_encodings, batched=True, fn_kwargs=dict(tokenizer=vocabulary.tokenizer), num_proc=10) # add all feature data ner_ds: Dataset = ds.map( add_bio_tags, fn_kwargs=dict(ner_label_map=vocabulary.ner_labels, tokenizer=vocabulary.tokenizer), ) rel_ds: Dataset = ner_ds.map( relation_ds_factory, batched=True, writer_batch_size=100, fn_kwargs=dict(tokenizer=vocabulary.tokenizer, vocabulary=vocabulary), ) ``` The loader is essentially a json loader with some extra error handling. The data is in jsonlines format with a text field and a list of span objects and relation objects. In `ner_ds` a field, `ner_labels`, is added; this is used in the downstream `relation_ds_factory`. It all runs fine in a single process, but I get a KeyError if run with `num_proc` set: ``` File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 348, in relation_ds_factory ner_labels = example["ner_labels"] KeyError: 'ner_labels' ``` This is just one example of what goes wrong. I've started just saving the dataset as arrow at the end because it takes a long time to map/filter/shuffle and the caching isn't working (I tracked it down to byte differences in the pickled functions).
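The cache misses blamed above on byte differences in the pickled functions can be illustrated with a toy fingerprint. This is a hypothetical sketch, not the actual `datasets` fingerprinting code: it hashes a function's bytecode and constants, showing that identical definitions hash the same while any difference in the bytes feeding the hash invalidates the cache.

```python
import hashlib

def fingerprint(fn):
    # Toy fingerprint (assumption: illustrative, not the library's scheme):
    # hash the bytecode plus the constants it references. The real caching
    # layer pickles the whole function, so globals, closures, and defaults
    # also feed the hash -- any byte-level drift there is a cache miss.
    h = hashlib.md5()
    h.update(fn.__code__.co_code)
    h.update(repr(fn.__code__.co_consts).encode())
    return h.hexdigest()

def add_one(x):
    return x + 1

def add_one_again(x):
    return x + 1

def add_two(x):
    return x + 2

# Identical definitions fingerprint identically...
same = fingerprint(add_one) == fingerprint(add_one_again)
# ...but changing a constant changes the fingerprint.
different = fingerprint(add_one) != fingerprint(add_two)
print(same, different)
```

The point is that function-based cache keys are only as stable as the bytes they are derived from, which is why seemingly unchanged mappers can still re-trigger processing.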
^^ Interestingly if I heed the warning from Tokenizers and set the environment variable, `TOKENIZERS_PARALLELISM=true` the map just hangs: ``` [I 200921 21:43:18 filelock:318] Lock 5694118768 released on /Users/timothy.laurent/.cache/huggingface/datasets/_Users_timothy.laurent_.cache_huggingface_datasets_prodigy_dataset_builder_prodigy-ds-5f34378723c4e83f_0.0.0_e67d9b43d5cd82c50b1eae8f2097daf95b601a04dc03ddd504f2b234a5fa247a.lock 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1.34ba/s] #0: 0%| | 0/1 [00:00<?, ?ba/s] #1: 0%| | 0/1 [00:00<?, ?ba/s] #2: 0%| | 0/1 [00:00<?, ?ba/s] #3: 0%| | 0/1 [00:00<?, ?ba/s] #4: 0%| | 0/1 [00:00<?, ?ba/s] #5: 0%| | 0/1 [00:00<?, ?ba/s] #6: 0%| | 0/1 [00:00<?, ?ba/s] #7: 0%| | 0/1 [00:00<?, ?ba/s] #8: 0%| | 0/1 [00:00<?, ?ba/s] ```
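Since setting `TOKENIZERS_PARALLELISM=true` hangs as described above, a commonly suggested workaround (an assumption worth verifying in your setup, not something confirmed in this thread) is to set it to `false` instead, before any tokenizer call and before `map(..., num_proc=N)` forks workers, since forked children inherit the parent's environment:

```python
import os

def disable_tokenizers_parallelism() -> None:
    # Must run before the first tokenizer call and before map(..., num_proc=N):
    # forked worker processes inherit this environment variable, so the
    # tokenizers library skips its internal thread pool in every worker.
    os.environ["TOKENIZERS_PARALLELISM"] = "false"

disable_tokenizers_parallelism()
print(os.environ["TOKENIZERS_PARALLELISM"])
```

Setting the variable in the launching process (or in the shell before starting Python) is what matters; setting it inside a worker after the fork is too late.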
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
289
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
#659 should fix the `KeyError` issue. It was due to the formatting not getting updated the right way
18
Also, maybe @n1t0 knows why setting `TOKENIZERS_PARALLELISM=true` creates deadlock issues when calling `map` with multiprocessing?
16
https://github.com/huggingface/datasets/issues/620
@lhoestq Thanks for taking a look. I pulled the master but I still see the key error.

```
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
#0: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 21.56ba/s]
#1: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 17.71ba/s]
#2: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 20.45ba/s]
#3: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 26.05ba/s]
#4: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 26.83ba/s]
#5: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 27.00ba/s]
#6: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 27.40ba/s]
#7: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 25.91ba/s]
#8: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 22.46ba/s]
#9: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 20.15ba/s]
#10: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 26.81ba/s]
#11: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 27.45ba/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████| 322/322 [00:00<00:00, 1462.85ex/s]
Traceback (most recent call last):
  File "text2struct/run_model.py", line 372, in <module>
    main()
  File "text2struct/run_model.py", line 368, in main
    run_model(auto_envvar_prefix="GFB_CIES")  # pragma: no cover
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 1236, in invoke
    return Command.invoke(self, ctx)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "text2struct/run_model.py", line 136, in run_model
    ctx.invoke(ctx.command.commands[config_dict["mode"]])
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "text2struct/run_model.py", line 187, in train
    run_train_model(_parse_subcommand(ctx))
  File "text2struct/run_model.py", line 241, in run_train_model
    checkpoint_steps=config.train.checkpoint_steps,
  File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/train.py", line 153, in alternate_training
    max_len=config.model.dim.max_len,
  File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 466, in load_prodigy_tf_datasets
    folder, file_patterns, vocabulary, cache_dir=cache_dir, test_pct=test_pct
  File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 447, in load_prodigy_arrow_datasets
    fn_kwargs=dict(tokenizer=vocabulary.tokenizer, vocabulary=vocabulary),
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1224, in map
    update_data = does_function_return_dict(test_inputs, test_indices)
  File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1195, in does_function_return_dict
    function(*fn_args, indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
  File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 348, in relation_ds_factory
    ner_labels = example["ner_labels"]
KeyError: 'ner_labels'
```
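The traceback shows the `KeyError` is raised while `datasets` probes the mapped function on a few test inputs (`does_function_return_dict`) before running the real map. Until the root cause is fixed, one hedged workaround is to make the mapped function tolerant of a missing column; the helper below is hypothetical, not the reporter's actual `relation_ds_factory`:

```python
def relation_example_fn(example):
    # .get avoids raising KeyError when the probed example
    # lacks the "ner_labels" column
    ner_labels = example.get("ner_labels", [])
    return {"n_ner_labels": len(ner_labels)}

print(relation_example_fn({"ner_labels": ["B-PER", "O"]}))  # {'n_ner_labels': 2}
print(relation_example_fn({}))                              # {'n_ner_labels': 0}
```

This only masks the probing failure; the underlying multiprocessing issue still needs a fix in the library.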
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
The parallelism is automatically disabled on `tokenizers` when the process gets forked, while we already used the parallelism capabilities of a tokenizer. We have to do it in order to avoid having the process hang, because we cannot safely fork a multithreaded process (cf https://github.com/huggingface/tokenizers/issues/187). So if possible, the tokenizers shouldn't be used before the fork, so that each process can then make use of the parallelism. Otherwise using `TOKENIZERS_PARALLELISM=false` is the way to go.
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
75
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` The parallelism is automatically disabled on `tokenizers` when the process gets forked, while we already used the parallelism capabilities of a tokenizer. We have to do it in order to avoid having the process hang, because we cannot safely fork a multithreaded process (cf https://github.com/huggingface/tokenizers/issues/187). So if possible, the tokenizers shouldn't be used before the fork, so that each process can then make use of the parallelism. Otherwise using `TOKENIZERS_PARALLELISM=false` is the way to go.
[ -0.469833672, -0.1674958766, -0.0140371434, 0.1800088286, 0.0854438841, -0.1905385554, 0.2459651232, 0.2756290734, 0.1016366705, 0.1726741791, 0.0916073471, 0.4495119452, -0.3661409318, 0.2278043777, -0.354011476, 0.0023673666, 0.1458533406, -0.1064162776, -0.1076291054, 0.2664557099, -0.1931274831, 0.3243031502, -0.2955477238, 0.1895085573, -0.5507255793, 0.0559617803, -0.0354772843, 0.2832088172, -0.059308216, -0.6136328578, 0.2357298434, 0.3920495808, 0.0007348284, 0.4456866682, -0.0001240691, 0.14680022, 0.3075380027, -0.0330979377, 0.0126843899, -0.3793182373, 0.0019112751, -0.3051969409, 0.2444293201, 0.1721025109, 0.20134601, -0.1140397936, -0.1220818236, -0.1859958768, 0.3015087545, 0.2977147102, 0.07807789, 0.3740779459, 0.2976613641, 0.1366761625, -0.1064471006, 0.1475402713, -0.0475676246, 0.0469950065, 0.3224135339, -0.2692289054, -0.1280240715, 0.2051907033, -0.2646254301, -0.2231661677, -0.1501529515, -0.2274801284, 0.533118844, -0.5728852749, 0.2040058225, 0.097276561, -0.27528283, -0.0516940579, -0.4340717494, -0.2814347148, -0.3205260038, -0.5091425776, 0.2326747477, -0.0793348551, -0.1704741716, 0.1779234707, -0.42785725, -0.2340728343, 0.0443508476, -0.1019556597, -0.1272598952, 0.5708987713, 0.1884503067, 0.3039304912, 0.1742693186, 0.1578972191, 0.0501511917, -0.0234679636, -0.1019759178, 0.1722767055, -0.4143753052, 0.0462936088, 0.1032639518, -0.3558403254, -0.1825722456, 0.1217044294, -0.2746937275, 0.1585306823, -0.1526543498, 0.0980683118, 0.3795083463, -0.0054286122, 0.1715632826, 0.2787453234, 0.2465554178, -0.208906278, 0.0966487229, 0.0370955579, 0.2416517437, -0.0345836468, -0.1681512594, 0.251570344, 0.1012417153, -0.1497584134, -0.1228370294, 0.1711269915, -0.0874946415, -0.1317545027, 0.1335783601, 0.22652933, 0.035856992, 0.7409480214, 0.0959843323, 0.1662243307, -0.3377199173, -0.0669956431, -0.0701268911, -0.0903124958, -0.3008593321, 0.1250006407, 0.1836818308, 0.133336395, 0.0110817403, 0.0407855175, -0.3716781139, 
-0.2365078926, 0.1697466075, -0.2469154596, 0.0945084244, 0.6521362662, -0.0997273996, 0.0555374399, -0.0186682716, -0.0468874462, 0.0317682177, 0.196613282, -0.476574719, -0.1805386543, -0.2433283776, 0.0614766888, -0.0035068616, 0.2686828375, -0.2681763768, 0.3582110107, 0.3775328398, -0.3392840326, -0.2765751481, -0.320417881, -0.3733094335, -0.2689910531, 0.0655428544, 0.223552987, -0.3428295255, -0.1210214272, 0.1379307061, -0.2942816913, 0.3426655829, 0.3083378673, -0.0074371658, 0.071666874, -0.1562375128, 0.2892134786, 0.0653890446, -0.2158593833, -0.0655004457, 0.1830652356, -0.0553092659, 0.3571155667, -0.0528864264, -0.2939493954, 0.4720182121, -0.0873196274, 0.2116505057, -0.0297007114, -0.1419224143, 0.0854482129, -0.4349105954, -0.0488806032, -0.0801714212, -0.1022237912, 0.3707358837, 0.1900942326, -0.1735135317, -0.4965016246, 0.2677108943, -0.1114348322, 0.3222943246, 0.0715090856, 0.1434362829, 0.2017227709, 0.1305196285, -0.2422704399, -0.3775041401, 0.2210634798, -0.2406435311, 0.0430390947, -0.0146283619, -0.0765242055, 0.1771067828, 0.1960383058, -0.212210536, -0.1775559783, 0.029469952, -0.0402093977, -0.2606337965, -0.0142164826, -0.2204610705, 0.538718164, 0.0587077886, 0.2526372969, -0.0555031002, 0.1792500019, -0.1390037835, -0.4171820879, -0.3341958523, 0.2444839478, 0.0404214561, -0.1081418991, -0.181263119, 0.42811203, 0.4532392025, -0.025946714, -0.0781762376, 0.1995645165, 0.3030529618, -0.1107941642, -0.1663929224, 0.0268021151, -0.000516206, -0.1456859112, 0.3252195418, 0.4752387702, 0.2583377659, 0.3575919569, 0.2223508656, 0.196734637, 0.3676939011, 0.0150890574, 0.0022731423, -0.0769254789, 0.2094926834, -0.0871472359, 0.1809680015, -0.0041486528, -0.1150162518, -0.133817032, 0.0661965758, 0.0200308934, -0.1528685689, 0.0386655815, 0.077141434, -0.1419482678, 0.1957790554, -0.09082257, 0.3246193826, 0.0216592588, -0.1594496667, 0.1472173184, -0.2333883047, 0.0095725209, 0.0928070322, 0.008376807, 0.2946353853, 0.3339908421, 
0.0658815205, 0.0199001096, -0.292672962, -0.3580294549, 0.133021608, 0.3469078243, -0.4571522474, 0.2395317256, -0.233221367, 0.4305513501, 0.1446417421, -0.1823007166, -0.3057281077, -0.5543478727, -0.0787178203, 0.6065192819, -0.1380047947, 0.1733261645, 0.0008414015, 0.2153343707, -0.2424017042, 0.4286289215, -0.2241809368, -0.3289695382, -0.2124573737, -0.132561788, 0.3449446559, -0.2445285618, 0.2342278212, 0.1797367781, -0.3173705339, 0.0695009008, -0.4021344185, 0.1804140508, -0.0857735947, 0.0844072923, 0.115848884, -0.0306755081, 0.0407306179, -0.2014769614, 0.1514791399, -0.2354545593, -0.2923104167, 0.2084233761, -0.1431323588, -0.0121390298, -0.2302770019, -0.2263070047, -0.4417110384, -0.142899096, 0.0151708098, -0.262347281, 0.2979032993, 0.0964591354, -0.0297174249, 0.0305126272, -0.167861253, 0.0696146861, -0.1205816865, -0.0331987068, -0.2646497488, 0.008934997, -0.1214701533, -0.0549474284, 0.0504693165, 0.1772571653, 0.3300921619, -0.2351268828, 0.0866734609, 0.0608826876, -0.0232303962, 0.1820588112, -0.2267752141, 0.3626005948, 0.3413051963, 0.0460006446, 0.0077076405, -0.1542689502, 0.0969478339, -0.1244618893, 0.0159215517, 0.1115940213, 0.4085758626, 0.2045237869, 0.7586420774, 0.3068210483, 0.0815784037, 0.4106375575, -0.1149263233, 0.1306592375, -0.1736630499, -0.408628583, -0.1645549685, -0.2788652182, -0.0181172788, -0.1671077013, -0.1037887335, -0.3872558177, -0.2170633525, 0.4757823348, -0.3586521745, -0.2796052694, -0.0407269187, -0.3742518723, 0.2615020275, -0.0808700621, 0.085449174, -0.110737361, 0.0205803737, -0.1939821243, 0.3396610022, 0.1362642646, -0.2989440262, -0.4563955665, -0.1417519897, -0.5637004375, 0.2795661986, 0.1750621051, 0.7341635823, -0.023458451, -0.0551585183, 0.058768291, -0.1976895183, 0.9357394576, -0.3594238758, -0.421744734, 0.2894861698, -0.5509144664, -0.2396152914, -0.1999229491, -0.1644485891, 0.5423986316, 0.5066505075, 0.4541930556, -0.2640604973, -0.3756240606, 0.1025580987, -0.2309690267, 
-0.1180359945, -0.0011557713, -0.3292896152, 0.0030473322, -0.3068461418, 0.1469484568, -0.2366103083, 0.1321083903, -0.2422955334, -0.1088759229, -0.1110603064, -0.1524035335, 0.1520631611, 0.0903232247, -0.0314771943, -0.2784014642, 0.0413449779, 0.117349416, 0.0203276798, 0.2515932918, 0.1480955184, -0.1945677698, -0.2651034296, 0.1867992729, 0.1416079402, 0.2887452245, 0.2998697162, 0.1709918678, 0.0748204142, 0.1025452986, 0.2095710486, -0.3290911913, 0.1400388032, 0.3358180523, 0.069967255, -0.5486968756, -0.0179217905, -0.0950678438, 0.4445161819, -0.1411133558, 0.5347308517, -0.3712591529, -0.2485485971, 0.3692537248, 0.0327240676, 0.8252353668, -0.3295946121, -0.0434362441, 0.1655748934, 0.0162165388, 0.1495188922, -0.1232442856, 0.0196429156, -0.3308779299, 0.1058772504, -0.0595264547, 0.0748170763, 0.363846004, 0.1011833102, -0.0730091929, 0.376454711, 0.0869760364, 0.0188118666, -0.0523899496, 0.2291527241, 0.3855036795, -0.1186163723, 0.1237100214, -0.0242682509, -0.0108195711, -0.0757310241, -0.0079829432, 0.0491070002, 0.2691779137, -0.1717043221, -0.4417430162, -0.1571897566, -0.1704279929, 0.1635919064, 0.2262083888, 0.1969400048, -0.0660279319, -0.1102874577, -0.0307536609, 0.1097055078, -0.0953580216, -0.103452526, 0.0131339831, 0.2929880619, 0.1415535212, -0.0207574964, 0.3205948174, 0.0703558475, -0.196579963, 0.041591242, -0.2161635756, -0.2890792787, 0.0082080513, 0.0118188374, 0.1558251232, 0.1179112494, 0.151567623, 0.0131440051, -0.09916123, -0.2529159784, 0.0169842541, 0.0426877365, -0.2005049884, 0.3394846618, -0.086979717, -0.3004576266, -0.0263283681, 0.4269419312, 0.134354353, -0.1657797694, 0.0890283361, 0.14596048, -0.0572543666, -0.1820952594, -0.2436644435, 0.3081693053, -0.1138094664, 0.2650244832, 0.0591092557, -0.361875385, 0.1516513228, 0.1528607309, 0.043123994, 0.2574945688, -0.2684717774, -0.3503496647, -0.4898903668, 0.2911828458, -0.0430531316, 0.1423330605, 0.0262092724, 0.2362662554, -0.2149016857, 0.3182536364, 
-0.2029467523, 0.0799201354, -0.129232347, 0.1701392084, -0.1570796967, -0.1027875319, -0.1325786859, -0.0717894286, 0.0402050912, -0.0084999651, -0.2677111626, -0.005371552, -0.0168905668, 0.1345699728, 0.1642263532, -0.2353511155, 0.0058797039, 0.034765102, -0.1145367473, -0.2087295502, 0.2275357246, 0.2921719551, -0.2415675372, 0.0413199663, 0.2643429637, 0.199307099, 0.0712748989, 0.3201952577, -0.0814486146, -0.0106403902, 0.0728162676, 0.4271274209, 0.0593410321, -0.1600847095, 0.0053651519, 0.1601384729, 0.1081877053, -0.0624856576, 0.2911907136, -0.1417411268, -0.0191014558, -0.0337679163, 0.3789578378, 0.4425646067, -0.0806522518, 0.0001761615, 0.1910729259, 0.0638121963, 0.0837066621, -0.2308317125, 0.6156698465, 0.0289503485, 0.1046122164, 0.2869163156, 0.3824903369, -0.0869988501, 0.1666944027, 0.1613330692, 0.0975130945, -0.0389077663, 0.2546489239, -0.0366276726, -0.0137627125, 0.3162635267, 0.2590506077, -0.3143551946, -0.1352951676, 0.2599978149, -0.0384013839, 0.5948086381, -0.09696199, 0.2248377502, -0.0666980445, -0.6823629141, 0.005719353, 0.0434205309, -0.4615485072, 0.2214369923, -0.0417676605, 0.2390714288, -0.3056818843, -0.4851906896, -0.1789883077, 0.5870897174, -0.3731824756, -0.3623028994, -0.2752657533, -0.1022212803, -0.0483386815, 0.0446072742, -0.0661511794, 0.2078544796, 0.627822876, -0.0410968699, -0.0900360644, -0.2425193787, -0.3348078728, -0.0144504681, 0.2810927629, -0.00616467, 0.4593022466, -0.0634732544, 0.0371081196, -0.0783303976, 0.2297443449, 0.4838186204, 0.6397121549, -0.3236340284, -0.1313555539, -0.0015714876, -0.0112166144, -0.1275708079, 0.5054549575, 0.0100010093, 0.2311813384, 0.1936001033, -0.0822728127, 0.0042333677, -0.3318097293, 0.3296575546, 0.268466413, 0.0815629363, 0.3986718953, 0.013811158, -0.0815165192, 0.2123760879, 0.089623943, -0.2625279427, -0.1877288818, 0.4311320186, -0.4376078844, 0.1901279837, 0.1663140059, 0.0485993177, 0.1046997011, 0.3792053163, 0.3668212295, 0.0105576254, -0.2663842738, 
-0.121975109, -0.4246730506, 0.0550432429, -0.1286811382, 0.3264996409, 0.0281254649, -0.0015221611, -0.0064911507, 0.1175891012, -0.106641598, -0.1334382445, 0.387414366, 0.254484117, -0.5439997315, 0.3324020505, 0.0594326369, -0.0865357146, 0.0030171201, -0.2445366234, 0.4164735079, 0.0530777425, -0.1391580701, 0.1390383542, 0.1647679359, 0.0155451521, -0.0387470126, 0.0465330891, 0.3313429356, 0.5452201962, -0.0247946419, 0.0179174468, -0.2891092002, 0.0822054818, -0.1622479111, -0.0324380361, 0.0186816193, 0.4277775884, -0.1754867435, -0.0600117184, -0.0635538697, 0.1609639376, 0.0716932416, 0.0929154605, -0.4696355462, 0.3175767064, 0.1130193546, 0.085116595, -0.1704148203, 0.4451141357, 0.2408339977, 0.3864461482, -0.3660423756, -0.2012933791, 0.3071757257, -0.1852538884, -0.0721667185, -0.650763154, 0.1840497553, 0.0079315081, 0.0638970956, -0.4796607494, -0.3711153269, 0.1487099081, -0.2079420388, -0.534414351, 0.2810969651, -0.1004107744, -0.2441051304, -0.0879499912, -0.14139691, 0.1348971128, -0.0782703608, 0.2719237506, -0.0260523036 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
> Thanks for taking a look. I pulled the master but I still see the key error. I am no longer able to get the error since #659 was merged. Not sure why you still have it @timothyjlaurent Maybe it is a cache issue ? Could you try to use `load_from_cache_file=False` in your `.map()` calls ?
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
56
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` > Thanks for taking a look. I pulled the master but I still see the key error. I am no longer able to get the error since #659 was merged. Not sure why you still have it @timothyjlaurent Maybe it is a cache issue ? Could you try to use `load_from_cache_file=False` in your `.map()` calls ?
[ -0.4314344823, -0.1244709268, -0.0213455409, 0.2268494964, 0.1299646348, -0.1009751856, 0.2160617709, 0.4109105766, 0.0611690432, 0.1098549068, 0.0194625333, 0.4327998161, -0.3604689538, 0.3772414625, -0.3395301104, 0.0885623395, 0.0790842026, -0.0142772458, -0.3960186243, 0.2743344307, -0.3150269985, 0.2834355235, -0.2949821353, 0.1656471193, -0.5235918164, 0.0930625573, -0.130279839, 0.3110471964, -0.0348667316, -0.5802861452, 0.300783813, 0.2864477932, -0.0216590203, 0.440155834, -0.0001220014, 0.0999585092, 0.384760201, -0.0793479159, -0.0716060176, -0.3503718972, -0.2476752847, -0.2170802355, 0.2827694416, 0.1046236902, 0.2202924788, 0.0048063919, -0.1133265942, -0.1922501326, 0.2761891782, 0.3521155119, 0.1191017032, 0.1872770339, 0.3880855739, 0.0826244652, -0.0354131609, 0.0937606096, 0.0023207739, 0.0315019153, 0.3187670708, -0.425323844, -0.0663859844, 0.3480862975, -0.2237598002, -0.2809121907, -0.0422262251, -0.2567038536, 0.6990745664, -0.6044114828, 0.215970844, 0.0483473688, -0.2460377812, 0.0421751142, -0.4069693685, -0.2098710537, -0.2622978091, -0.4920372367, 0.2894101441, -0.0301968642, -0.2084700316, 0.1472872198, -0.4892230034, -0.3158852458, 0.0184610039, -0.1103787273, -0.0474435613, 0.533496201, 0.2437924147, 0.2392061055, 0.2650355399, 0.0902931318, 0.1246620417, -0.0066082831, -0.150888145, 0.1766753197, -0.4339981377, 0.0645185485, 0.1089490131, -0.0413446352, -0.2220737934, -0.0226178169, -0.2547250688, 0.1037644297, -0.0433887169, -0.0031187208, 0.4134055674, -0.002176486, 0.2312471271, 0.2805158794, 0.2089717984, -0.1722469032, -0.0612266809, 0.0473841354, 0.3236123621, -0.1497554481, -0.027468536, 0.3400016427, 0.1807361394, -0.2046601027, -0.0820393041, 0.1177493036, -0.0378101803, -0.1827214658, 0.0796592087, 0.2012062222, 0.0385182165, 0.7639833093, 0.1573565602, 0.2139507234, -0.2573188245, -0.0196507871, -0.0640081763, -0.072523132, -0.3861180544, 0.0672841147, 0.2665230036, -0.1440189481, 0.0057661161, 0.0307269879, 
-0.2962279618, -0.2732140422, 0.0389923602, -0.2264908254, 0.1082785949, 0.668219924, -0.0568311214, -0.0038680658, -0.0184369348, -0.18389979, 0.1021291018, 0.2575798035, -0.533706069, -0.2903420329, -0.1411549747, 0.1126206443, 0.0741788894, 0.3089142442, -0.1461549401, 0.2622053325, 0.4196607769, -0.3520434499, -0.3468494415, -0.2722084224, -0.3974719346, -0.3425092399, -0.0154753365, 0.2808100879, -0.4167695045, -0.0224718899, 0.020111654, -0.1394569278, 0.2266025692, 0.1803769767, -0.0676962212, -0.0393483527, -0.1244897097, 0.0694300383, 0.080917269, -0.2601239383, -0.2011921704, 0.284745723, 0.0258580297, 0.270403564, -0.0125878975, -0.2769046724, 0.3264493346, -0.2117159069, 0.2866228521, -0.097726211, -0.1917824, 0.0468077734, -0.4094305336, -0.0715952963, -0.0588161498, -0.1225054562, 0.4039651752, 0.1986883581, -0.1238543838, -0.5356957912, 0.2904837132, -0.0942864567, 0.3269189, 0.1106510386, 0.0971124917, 0.0743151978, 0.1996758878, -0.2014027387, -0.4499588907, 0.237993598, -0.2776957154, 0.0345757678, -0.0698411316, 0.0290762335, 0.0377739593, 0.1411242783, -0.2355705649, -0.1979336739, 0.0776841193, -0.1049175411, -0.2467699647, -0.0876590163, -0.2032499909, 0.6826040149, 0.1674461365, 0.1178387031, -0.0962088183, 0.1435915083, -0.2462818325, -0.3751712441, -0.2172573805, 0.2264215797, 0.0565495528, -0.0111119188, -0.2679610848, 0.4512028098, 0.3154534101, -0.0701539069, -0.0241987947, 0.1356796175, 0.2313728184, -0.2101559341, -0.1253692955, 0.0301835015, 0.0150994398, -0.213948667, 0.2164919674, 0.4391388893, 0.070073314, 0.4827991426, 0.1989905387, 0.2445271015, 0.2599721551, 0.0034954175, -0.0056858659, -0.1511244476, 0.213259086, -0.2349465042, 0.1760886461, 0.0235670563, -0.0262154639, -0.1778626442, 0.125843063, 0.0529481098, -0.1465938091, 0.0932001024, 0.0199285597, -0.1221552789, 0.0554957464, 0.0848884359, 0.4538977742, 0.0019586347, -0.1927703172, 0.2290839851, -0.2108613253, 0.0234584473, 0.2167140096, -0.0559657998, 0.3995962739, 
0.3053631485, 0.0496960431, -0.0101161506, -0.1340930015, -0.1709619164, 0.068992354, 0.3572277129, -0.4408535957, 0.179074645, -0.2723535895, 0.2788523436, 0.1270203292, -0.1176957339, -0.3178193569, -0.5795156956, -0.0903852656, 0.4781470597, -0.0622866005, 0.1386612952, -0.0254732221, 0.0500343367, -0.1990745217, 0.3770761192, -0.2009774148, -0.405900985, -0.1911112666, -0.0859188884, 0.3534835279, -0.336589247, 0.2275675237, 0.1413677037, -0.2084483951, 0.1989381313, -0.386902988, 0.0617677905, -0.1056495458, 0.0377924144, 0.1049742252, 0.0098569132, 0.0452488288, -0.2347850949, 0.1706170142, -0.2592057288, -0.2953842878, 0.2534519434, -0.1604864001, 0.1175034717, -0.2453966588, -0.3316819668, -0.3674534261, -0.1288785338, -0.0563550182, -0.2305461317, 0.3234849274, 0.1497077793, 0.0233157463, -0.0651253909, -0.0555440709, 0.0701148286, -0.1063602567, -0.0443348363, -0.197531566, 0.0210301094, -0.0967614725, 0.0295416638, 0.0621927828, 0.2017762363, 0.2890486419, -0.2750004828, 0.0504877865, 0.0505749434, -0.0595086291, 0.1866525859, -0.1573630869, 0.3616137505, 0.4519872367, 0.0313640088, -0.0434337482, -0.1598109454, 0.0565444231, 0.040626917, 0.0759693161, 0.0609644316, 0.3302604556, 0.1010791808, 0.6614119411, 0.3625735641, 0.1548634171, 0.3272960186, -0.0050969487, 0.1325636506, -0.2484613955, -0.42858693, -0.2518709898, -0.3223818541, 0.067650117, -0.2475496233, -0.083268851, -0.4537875652, -0.3686929643, 0.480954349, -0.4484337866, -0.2922632098, -0.0461663306, -0.4731538892, 0.2454739809, -0.0141624622, -0.0119091421, -0.0777524784, -0.0021200851, -0.219032228, 0.2599598765, 0.075097017, -0.3458836377, -0.2463167012, -0.3370692134, -0.4683032036, 0.3457070589, 0.1800268143, 0.6090080738, 0.0361391455, -0.0807520822, -0.0433469601, -0.1559735686, 0.8825862408, -0.6140750647, -0.3779459894, 0.2825071812, -0.3890219331, -0.2425139993, -0.1773127913, -0.09258724, 0.6058714986, 0.3386295438, 0.4422084689, -0.1263381094, -0.2924232483, 0.2054608762, 
-0.189968735, -0.0901448205, 0.0126368776, -0.3751437962, -0.0160287842, -0.2455599159, 0.0628633946, -0.2198966444, 0.1254926622, -0.294368118, -0.0085715503, -0.071398899, -0.1021112502, 0.164604038, 0.0390509814, 0.0290780365, -0.1818820089, 0.0904810056, 0.0928512812, 0.1154307872, 0.217958048, 0.2940435112, -0.3483686447, -0.1547675133, 0.1047960371, 0.0128384754, 0.2593702972, 0.287494719, 0.2066314369, -0.09997572, 0.1383374631, 0.1219354272, -0.2631671131, -0.0537207648, 0.3026965261, -0.0508229807, -0.5315157771, -0.1201045662, -0.057883136, 0.43633008, -0.1913838387, 0.5237947702, -0.4564825892, -0.2383583486, 0.3694997728, 0.0153907426, 0.8982656002, -0.4066663384, -0.0524985418, 0.0174783766, -0.0084767267, 0.1157461107, 0.0991767496, 0.043469023, -0.3183646202, -0.0740958676, -0.0107523911, 0.0392738432, 0.2415267825, 0.1067192703, -0.1916185617, 0.4355039001, 0.0005612522, -0.073670812, -0.01783216, 0.1360594034, 0.3495914042, -0.0522042252, 0.1100699604, -0.0046373047, -0.0166384093, -0.088445358, -0.0157272443, -0.0205742065, 0.2720364332, -0.2323772311, -0.4313637614, -0.0588857755, -0.1246226206, 0.1750725508, 0.1152962297, 0.0633786023, -0.0505628251, 0.0899280682, -0.0465703718, 0.1361657381, -0.0849652216, -0.0713882893, 0.1129712164, 0.3522194922, 0.144709155, 0.0274931528, 0.266585499, 0.0795497596, -0.2798361778, 0.0733122975, -0.2621833682, -0.2616204917, -0.0750689059, 0.065535292, -0.0213129204, 0.1660890132, 0.1111551076, 0.0022648796, -0.1481416821, -0.2251985371, 0.072707057, -0.0243668221, -0.0966151655, 0.5090482831, -0.0795827806, -0.3521083593, 0.0703075677, 0.4957982302, 0.2781364024, -0.1033784524, 0.1932802498, 0.1098286659, 0.0112322494, -0.2096006423, -0.2345389873, 0.1449983716, -0.1942836344, 0.2652486861, 0.0830779672, -0.1989104152, 0.2920132875, 0.0552811958, 0.079302296, 0.3829926848, -0.157069087, -0.2903128564, -0.5277740955, 0.2857147157, 0.0204579867, 0.0969084948, -0.1263669282, 0.1305976212, -0.0521105379, 
0.3383781016, -0.2322707325, 0.1291427612, -0.0689733997, 0.1807767898, -0.0222006682, -0.0191033762, -0.1341445446, -0.1393243372, 0.0646476746, 0.0359887667, -0.2040904164, -0.0415783413, 0.0906358808, 0.1251113564, 0.0482723117, -0.1485891044, -0.1163554862, -0.0774999782, -0.0856594294, -0.116111353, 0.1988503039, 0.2801805735, -0.1707324088, 0.1351584196, 0.1905764639, 0.1141843945, 0.1202028617, 0.3806436062, -0.2339487821, 0.0605881922, 0.0225693882, 0.371925354, 0.1211015582, -0.2014987469, 0.0049365535, 0.0348658115, -0.1434601992, 0.0187817104, 0.3351596296, -0.2365334332, -0.0248167366, 0.0341960602, 0.3707669675, 0.4022579789, -0.0860795975, 0.1595243812, 0.103833206, 0.1220132709, -0.0071794838, -0.2172744721, 0.4551475048, 0.0618548542, 0.0393966958, 0.2695763111, 0.274484992, -0.1645929366, 0.238414675, 0.2020526677, -0.1007003561, -0.0349233225, 0.2911988497, 0.2524306178, 0.1633521616, 0.1991354674, 0.2327963859, -0.2741602063, -0.1227261573, 0.2849214673, -0.0553045012, 0.4940329492, -0.0550817437, 0.093950361, -0.0180261508, -0.6432762742, 0.046890799, 0.1129793078, -0.5432622433, 0.2139879316, -0.0143762417, 0.2575622201, -0.329477787, -0.4762938619, -0.1783713102, 0.4282782972, -0.2955002487, -0.3060942888, -0.1915064454, -0.1794610918, -0.0360149369, 0.0738011301, -0.0777121335, 0.2517941892, 0.7757794261, 0.043746464, -0.0749631673, -0.1485755146, -0.3636281192, 0.0480314828, 0.3650007546, -0.0165604353, 0.3460434973, 0.0974887237, 0.0123903034, 0.0072353389, 0.2230303884, 0.5292611122, 0.6800274253, -0.4986773431, 0.0286409594, 0.174210012, -0.0936604738, -0.1186067015, 0.3716340363, 0.1848438531, 0.2645902932, 0.2285881788, -0.0288324095, 0.0111453608, -0.1483450383, 0.2338473201, 0.3835113347, 0.0782252997, 0.2400386781, 0.1224802881, -0.1629372388, 0.2473021001, 0.1672247499, -0.319960624, -0.2731570005, 0.5410602093, -0.3435688019, 0.188339144, 0.1087425947, 0.0782318711, 0.1328064501, 0.5063025355, 0.3657384515, 0.0652464628, 
-0.3326124549, -0.2014543414, -0.470719099, -0.0253028795, -0.205199033, 0.3350788057, 0.0426927879, 0.03329961, -0.05733715, 0.05228379, 0.0825945362, -0.295743525, 0.3538030386, 0.3220773041, -0.5336773396, 0.2130993605, 0.1855690181, -0.0482996032, 0.037356928, -0.3035820723, 0.3494935036, 0.0925251022, -0.0680239797, 0.0800443739, 0.154036954, 0.1159982681, 0.2276992202, 0.0035319459, 0.2739094794, 0.5416815281, -0.0236294977, -0.0608467162, -0.2412251234, 0.049302876, -0.0706123412, -0.0246937685, 0.0966273099, 0.3412882686, -0.2193376422, -0.0735442191, -0.0155455898, 0.1029609442, 0.0029366985, 0.1824572086, -0.4736461043, 0.1919771731, 0.2427234352, 0.0934371799, -0.1141402945, 0.4848996699, 0.1436389387, 0.3039217293, -0.3430655599, -0.2384909391, 0.3032130599, 0.0595473535, -0.1561746746, -0.5542378426, 0.2285562754, -0.1924532354, 0.1500560641, -0.5447404385, -0.3107990623, 0.1632045209, -0.1011643037, -0.5235198736, 0.235222578, -0.1159074679, -0.1001252159, -0.0555109791, -0.2853478193, 0.1060263067, -0.1125025079, 0.1802149713, -0.0616507158 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
> The parallelism is automatically disabled on `tokenizers` when the process gets forked, while we already used the parallelism capabilities of a tokenizer. We have to do it in order to avoid having the process hang, because we cannot safely fork a multithreaded process (cf [huggingface/tokenizers#187](https://github.com/huggingface/tokenizers/issues/187)). > So if possible, the tokenizers shouldn't be used before the fork, so that each process can then make use of the parallelism. Otherwise using `TOKENIZERS_PARALLELISM=false` is the way to go. Ok thanks :) Is there something we should do on the `datasets` side to avoid that the program hangs ? Also when doing `.map` with a tokenizer, the tokenizer is called once on the first examples of the dataset to check the function output before spawning the processes. Is that compatible with how tokenizers are supposed to be used with multiprocessing ?
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
140
> The parallelism is automatically disabled on `tokenizers` when the process gets forked, while we already used the parallelism capabilities of a tokenizer. We have to do it in order to avoid having the process hang, because we cannot safely fork a multithreaded process (cf [huggingface/tokenizers#187](https://github.com/huggingface/tokenizers/issues/187)).
> So if possible, the tokenizers shouldn't be used before the fork, so that each process can then make use of the parallelism. Otherwise using `TOKENIZERS_PARALLELISM=false` is the way to go.

Ok thanks :) Is there something we should do on the `datasets` side to avoid that the program hangs?

Also, when doing `.map` with a tokenizer, the tokenizer is called once on the first examples of the dataset to check the function output before spawning the processes.
Is that compatible with how tokenizers are supposed to be used with multiprocessing?
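The `TOKENIZERS_PARALLELISM=false` workaround mentioned above can be applied from Python before any tokenizer is constructed; a minimal sketch:

```python
import os

# Disable tokenizer-internal parallelism before any tokenizer is created,
# so forking worker processes (e.g. map(..., num_proc=...)) cannot hang
# on the tokenizer's thread pool.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

# Only after this point: build the tokenizer and call dataset.map(...).
```

The ordering matters: the variable must be set before the tokenizer is first used, since the hang comes from forking a process that already started the tokenizer's threads.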
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Hmmm I pulled the latest commit, `b93c5517f70a480533a44e0c42638392fd53d90`, and I'm still seeing both the hanging and the key error.
Hi @timothyjlaurent
The hanging fix just got merged; that's why you were still seeing it. For the key error, it's possible that the code you ran reused cached datasets from when the KeyError bug was still there. Could you try to clear your cache, or make sure it doesn't reuse cached data with `.map(..., load_from_cache_file=False)`? Let me know if it helps.
-0.0865824446, -0.5268971324, -0.0841360837, -0.2359144092, 0.1518826038, -0.0689223856, 0.1092917249, -0.0044612475, 0.1244384348, -0.0152090192, -0.3844551146, 0.3050990105, 0.4490069151, -0.490737915, 0.181388095, 0.3065486252, -0.0227140523, -0.030096367, -0.3004018664, 0.3344732821, 0.0385639258, 0.0250893161, 0.1233911887, 0.0685882717, 0.1777246296, 0.0547104441, 0.0470290259, 0.2941527963, 0.5331445336, -0.0617082119, -0.0615420863, -0.2617750168, -0.0059646405, -0.117256254, -0.0979967713, 0.1279918104, 0.4231702983, -0.08404807, -0.1946536899, -0.0175151359, 0.1063803807, -0.0377557054, 0.118232429, -0.404391557, 0.2538506389, 0.2120010108, 0.0213681869, -0.1611507833, 0.441686362, 0.1101303697, 0.1553226858, -0.2664438784, -0.247541368, 0.2882623672, -0.0436087698, -0.1493017673, -0.4830524325, 0.1681105793, -0.2236528695, 0.1611626446, -0.4133827984, -0.289352119, 0.2315832824, -0.1507373452, -0.5155611038, 0.2117444873, -0.0465030335, -0.1064362377, -0.0427053384, -0.2824116945, 0.0595691018, -0.1379958391, 0.1203346997, -0.0893133134 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Hi @lhoestq , Thanks for letting me know about the update. So I don't think it's the caching - because the hashing mechanism isn't stable for me -- but that's a different issue. In any case I ran `rm -rf ~/.cache/huggingface` to make a clean slate. I synced with master and I see the key error has gone away. I tried with and without the `TOKENIZERS_PARALLELISM` variable set, and see the log line for setting the value false before the map. Now I'm seeing an issue with `.train_test_split()` on datasets that are the product of a multiprocess map. Here is the stack trace ``` File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 451, in load_prodigy_arrow_datasets ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/arrow_dataset.py", line 168, in wrapper dataset.set_format(**new_format) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/fingerprint.py", line 163, in wrapper out = func(self, *args, **kwargs) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/arrow_dataset.py", line 794, in set_format list(filter(lambda col: col not in self._data.column_names, columns)), self._data.column_names ValueError: Columns ['train', 'test'] not in the dataset. Current columns in the dataset: ['_input_hash', '_task_hash', '_view_id', 'answer', 'encoding__ids', 'encoding__offsets', 'encoding__overflowing', 'encoding__tokens', 'encoding__words', 'ner_ids', 'ner_labels', 'relations', 'spans', 'text', 'tokens'] ```
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a DatasetDict that has rows but, if selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
174
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a DatasetDict that has rows but, if selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Hi @lhoestq , Thanks for letting me know about the update. So I don't think it's the caching - because the hashing mechanism isn't stable for me -- but that's a different issue. In any case I ran `rm -rf ~/.cache/huggingface` to make a clean slate. I synced with master and I see the key error has gone away. I tried with and without the `TOKENIZERS_PARALLELISM` variable set, and see the log line for setting the value false before the map. Now I'm seeing an issue with `.train_test_split()` on datasets that are the product of a multiprocess map. 
Here is the stack trace ``` File "/Users/timothy.laurent/src/inv-text2struct/text2struct/model/dataset.py", line 451, in load_prodigy_arrow_datasets ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/arrow_dataset.py", line 168, in wrapper dataset.set_format(**new_format) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/fingerprint.py", line 163, in wrapper out = func(self, *args, **kwargs) File "/Users/timothy.laurent/.virtualenvs/inv-text2struct/src/datasets/src/datasets/arrow_dataset.py", line 794, in set_format list(filter(lambda col: col not in self._data.column_names, columns)), self._data.column_names ValueError: Columns ['train', 'test'] not in the dataset. Current columns in the dataset: ['_input_hash', '_task_hash', '_view_id', 'answer', 'encoding__ids', 'encoding__offsets', 'encoding__overflowing', 'encoding__tokens', 'encoding__words', 'ner_ids', 'ner_labels', 'relations', 'spans', 'text', 'tokens'] ```
[ -0.3455715179, -0.1244764626, -0.0099465251, 0.2436583638, 0.1326489151, -0.1085882634, 0.2091189176, 0.3842710257, 0.1275978833, 0.0554182865, 0.0650869161, 0.2885192633, -0.4160085917, 0.3402270973, -0.2788421214, 0.0649457797, 0.1176878959, -0.1377786994, -0.3737221658, 0.2305666804, -0.2718409896, 0.3353239, -0.2338253707, 0.1210064441, -0.5046043992, 0.044989042, -0.1111249328, 0.3717215359, 0.0684829205, -0.5890561938, 0.336789906, 0.2819590271, -0.0480296612, 0.3480780423, -0.0001237825, 0.1063452661, 0.3847543299, -0.1089713722, -0.0622289218, -0.3263473511, -0.1737762839, -0.1924264878, 0.2838897109, 0.0872731358, 0.1946081668, 0.0350467674, -0.1431169361, -0.2287727296, 0.329803288, 0.3584725559, 0.1089559644, 0.2274537981, 0.261017561, 0.1091632098, 0.0079342443, 0.0144448383, 0.0120127052, 0.0105954334, 0.2656101584, -0.3017510176, -0.1495727003, 0.3722896278, -0.2551001906, -0.2173596025, -0.005880693, -0.2193848938, 0.6108902693, -0.5381213427, 0.2770036161, 0.0419183522, -0.2300307751, 0.0360987447, -0.4760312736, -0.2049937844, -0.2889748216, -0.5068157315, 0.348262459, -0.0966813415, -0.1204990447, 0.1607676297, -0.5080751181, -0.2501935363, 0.0702679083, -0.130907923, -0.0440388098, 0.451602906, 0.2043743879, 0.2419660389, 0.2371709645, 0.120609507, 0.082829982, 0.0368177854, -0.1755400896, 0.1539330184, -0.4410715997, 0.0727008134, 0.1501144469, -0.011230655, -0.2254720628, 0.0307364836, -0.1942614317, 0.0992030054, -0.0916409791, -0.0203568749, 0.3888457417, 0.0562040694, 0.1859546304, 0.1347612739, 0.2832922041, -0.1339153349, 0.0400833115, 0.0852167755, 0.3456961215, -0.0811673477, -0.012440525, 0.4133487046, 0.1129975691, -0.2294030786, -0.059984412, 0.150242269, -0.0300123915, -0.2148934901, 0.0187903345, 0.1588861495, -0.0029366314, 0.7559257746, 0.1216322631, 0.1799939275, -0.2985370457, -0.1033539623, -0.0879791975, -0.0783389211, -0.4462860227, 0.1121377945, 0.2708314955, -0.1769162416, 0.002922941, -0.0293805134, -0.3343161047, 
-0.2672931254, -0.0057172105, -0.1809576452, 0.1032913923, 0.6875528097, -0.1682295352, 0.0362339541, -0.0051767537, -0.07475283, 0.0804439783, 0.2214312404, -0.5367617607, -0.2646109164, -0.2048505992, 0.0789482445, 0.0069825947, 0.3434107006, -0.0932206511, 0.2554689646, 0.4977131188, -0.3562062979, -0.2382132262, -0.2570955157, -0.4370437562, -0.3161762953, 0.0009101853, 0.3135169744, -0.3315528333, -0.1214470491, 0.0876628906, -0.1581774205, 0.1792081296, 0.263336122, 0.0320779905, -0.0481659435, -0.1246595457, 0.0071642995, 0.0008243397, -0.2042134702, -0.2987397909, 0.1712993234, 0.0336492807, 0.3593226373, 0.0736716241, -0.2474052012, 0.3209020197, -0.1843068004, 0.3235271275, -0.1109331101, -0.1499822289, 0.0182688013, -0.4165968299, -0.0977839679, -0.1438875496, -0.1557714641, 0.4019340277, 0.2475281805, -0.092515856, -0.4669561982, 0.27581653, -0.1199948043, 0.3561440408, 0.1221893132, 0.0755645186, 0.0702305436, 0.1730287969, -0.1853505373, -0.6047537327, 0.3041416109, -0.2501341701, 0.0431422256, -0.1477177441, -0.0157316104, 0.1335426718, 0.089136079, -0.1490691304, -0.2978051603, 0.0218346417, -0.0204792954, -0.20459342, -0.0690911934, -0.2441113144, 0.7258079052, 0.1483658552, 0.1382078528, -0.1974516809, 0.1050305888, -0.194503203, -0.3886590004, -0.2837042809, 0.2788517177, 0.0128649687, -0.1279389858, -0.1980294138, 0.3907419443, 0.4001691341, -0.0490547046, -0.0479038134, 0.2991716266, 0.2560943365, -0.1387453824, -0.0958264172, 0.0244670324, -0.0276679639, -0.1636603773, 0.2050243914, 0.4712378383, 0.0942042768, 0.4115885794, 0.1363153458, 0.208451122, 0.3067333102, 0.0051695034, 0.0097786933, -0.1651754677, 0.1741487086, -0.1718352437, 0.2099789232, 0.0183889139, 0.0095013902, -0.1484383941, 0.1744220257, 0.02080293, -0.1467273831, 0.0808339566, 0.0518875085, -0.1830231845, 0.1379191726, 0.0632491335, 0.3419296443, -0.018310206, -0.1588672101, 0.2417916805, -0.1436591595, 0.030283954, 0.1401723921, -0.1044880897, 0.2664695978, 0.2962370813, 
0.0669785738, 0.0796570927, -0.1835866272, -0.1547775418, 0.087173976, 0.3825925887, -0.4576375782, 0.1936841458, -0.2836959362, 0.2762708962, 0.2124965191, -0.0819847658, -0.3219940662, -0.5981808305, -0.0910406411, 0.5325679779, -0.0922539756, 0.1659389287, 0.0012374669, 0.1561729908, -0.1698636264, 0.3994179368, -0.2692894638, -0.4219842553, -0.1239214391, -0.129248932, 0.3829291463, -0.4229316711, 0.2115244418, -0.0087404624, -0.259565413, 0.1289772391, -0.3988408446, 0.093348287, -0.1177260429, -0.0403881222, 0.1544982195, -0.0220290236, 0.0301759541, -0.1615736783, 0.1917391717, -0.3205644488, -0.3525018692, 0.2628384829, -0.1099283174, 0.1426185668, -0.30572173, -0.2488420606, -0.3357540965, -0.0755686387, -0.0051655602, -0.2657305002, 0.3179180026, 0.208411023, -0.0123671219, -0.0700817779, -0.1287320256, 0.0919009894, -0.1273195148, -0.1091574505, -0.2312929481, 0.0972521156, -0.1596215367, 0.0335745886, 0.0853755027, 0.1164718345, 0.3502860665, -0.3690615892, -0.0689096376, 0.0925488323, -0.0223241299, 0.2291084081, -0.1680189222, 0.3719294369, 0.4319144487, 0.0209557414, -0.056946028, -0.2369538248, 0.0805419907, 0.0453713275, 0.0599121936, 0.1212503016, 0.3780144453, 0.1420283467, 0.7747808099, 0.2866217494, 0.0910939574, 0.3499987125, -0.0283165313, 0.2063766718, -0.2522656024, -0.3935124874, -0.2120961845, -0.3562318981, 0.0119809434, -0.1822085977, -0.1161308065, -0.3244531751, -0.2922691107, 0.4655031264, -0.3842083514, -0.2641272545, 0.0236346442, -0.5137662888, 0.2076698393, -0.0357885398, 0.0154694319, -0.0819850564, -0.0316875502, -0.1261826158, 0.2762477696, 0.0724559799, -0.2466896772, -0.1857014, -0.2622129023, -0.5433812141, 0.3920851052, 0.2058962882, 0.5801012516, 0.0003781915, -0.0944948718, 0.0310258809, -0.1828489155, 0.9522314072, -0.4960357845, -0.3466889262, 0.2083468437, -0.5455588102, -0.0968325883, -0.1612380743, -0.1376574188, 0.5284411311, 0.3635910153, 0.4567364454, -0.1420131773, -0.3185375929, 0.1275020838, -0.2275506556, 
-0.1594370753, 0.0970045105, -0.3764307201, -0.0169396251, -0.2257393599, 0.0497246012, -0.2841276228, 0.0744782165, -0.2121138275, 0.0725717694, -0.0414902158, -0.1092956141, 0.0846452862, 0.012382865, 0.0425307862, -0.175103277, 0.0125153065, 0.1235080361, 0.2027813345, 0.2298823297, 0.3496077657, -0.4466883242, -0.0730662271, 0.1103928834, 0.0605575629, 0.1754443049, 0.3381383121, 0.1892447472, -0.040699549, 0.2022448927, 0.1370727718, -0.2748178542, -0.0869929492, 0.356679827, 0.0427996591, -0.4777279794, -0.1417575777, -0.0164193846, 0.3961030841, -0.2092303634, 0.4785841405, -0.3369475305, -0.2643270493, 0.3273870647, 0.0282705352, 0.850217104, -0.3952553868, 0.081100449, 0.0051098689, 0.0029543638, 0.234865129, 0.0672201514, 0.0788503885, -0.2991181314, -0.0682638064, -0.063596651, 0.0854784846, 0.2161599696, 0.0640436709, -0.1342248917, 0.4749642909, 0.0191980153, -0.009689427, -0.0696130767, 0.0654876977, 0.4006585479, -0.0159191433, 0.028745912, -0.0263754558, -0.0330135897, -0.0150102805, -0.0109568872, -0.031505499, 0.2435777187, -0.206092, -0.3993577063, -0.0928605646, -0.1005686969, 0.1672326624, 0.2263252139, 0.0712783933, -0.1681401134, -0.011466857, 0.0234826375, 0.0354609564, -0.0957324132, -0.1205085665, 0.0795379579, 0.3757039607, 0.1605423838, 0.1067802832, 0.3242851496, 0.0916086286, -0.2471614778, 0.0538607612, -0.2858959436, -0.316901207, 0.1146845818, 0.0705761537, 0.0022968058, 0.1854239404, 0.1068825945, -0.0222487189, -0.1199301854, -0.2102942914, 0.0326753259, 0.0129163703, -0.1103705615, 0.3514621258, -0.1062308699, -0.3452737629, 0.038797114, 0.5714832544, 0.1792283058, -0.1365398318, 0.170863077, 0.1084279865, 0.0061466396, -0.1767274439, -0.3147131801, 0.1856325567, -0.1824868619, 0.2496789545, 0.0504753217, -0.3122133017, 0.2068658322, 0.1525931656, 0.1529289186, 0.431165576, -0.1888730079, -0.3375234008, -0.5552377701, 0.3313269019, 0.0134998616, 0.0687700436, -0.0746567324, 0.0384716094, -0.0967091173, 0.3323039114, 
-0.2229069769, 0.0763912052, -0.0844006389, 0.1326911449, -0.0582742244, -0.0666018575, -0.0561767966, -0.135122329, 0.032747332, -0.039970547, -0.3238811493, -0.0252361465, 0.0375821814, 0.1396388113, 0.0788785517, -0.1851936281, -0.0814330131, -0.0081886202, -0.0455565378, -0.1865731776, 0.2877726257, 0.3394422531, -0.1544636488, 0.2089520991, 0.2331170589, 0.1577555537, 0.0578181595, 0.329200983, -0.189348802, -0.0028623044, -0.0109868385, 0.35879457, 0.0093155093, -0.1553909332, 0.1088604107, 0.1274763942, -0.1252472997, 0.0000301953, 0.3466127217, -0.1869860142, -0.0326903649, -0.0120867863, 0.4094203413, 0.349945724, -0.0911723748, 0.0539570153, 0.0707107931, 0.0874183178, -0.0165347382, -0.2206016034, 0.4769599438, 0.1111515164, 0.0399639457, 0.3331833482, 0.3857435584, -0.2357190102, 0.3037247658, 0.1749237478, -0.0326315574, -0.0436580107, 0.2259700596, 0.1887937784, 0.1417436153, 0.2702080607, 0.3282845616, -0.2589626014, -0.1217817366, 0.3115175366, 0.0219226144, 0.5094077587, -0.0482710227, 0.1318747699, -0.1260093451, -0.6924303174, 0.1770565957, 0.1194026098, -0.4325997531, 0.1958503425, -0.0499770977, 0.2462950051, -0.2728314698, -0.4944809973, -0.1602723151, 0.4801965952, -0.2971931994, -0.3529890776, -0.1438142657, -0.1685840487, 0.0178840458, 0.1029440016, -0.1220699996, 0.2627573907, 0.8705644011, 0.0013406202, -0.094067201, -0.1132532805, -0.4790522754, -0.0430616215, 0.3615501523, 0.0103331283, 0.4478522539, 0.1272517443, 0.0100145517, 0.0303876102, 0.2453125119, 0.4614395797, 0.5924247503, -0.4128951132, 0.0152745135, 0.195364058, -0.015819598, -0.1098345518, 0.4632718563, 0.127803117, 0.2036518306, 0.1503854096, -0.0878529176, 0.0157423168, -0.0646654814, 0.2157687545, 0.3244133592, 0.0953297466, 0.3323194087, 0.0612211451, -0.1504761577, 0.2467200458, 0.2314828336, -0.2850703299, -0.3031417131, 0.4654027522, -0.4126182795, 0.2292833626, 0.1793418974, 0.0684706122, 0.1095159203, 0.5616856217, 0.4272269905, 0.0376747698, -0.2462727129, 
-0.1341665834, -0.480601877, 0.0521212108, -0.1457831413, 0.3216113448, 0.0292443968, -0.0412232801, -0.0230939612, -0.0139864413, 0.0510888323, -0.2876896262, 0.324234128, 0.2073317468, -0.6163626313, 0.255011797, 0.0732990429, -0.0790689066, 0.0205273479, -0.246768564, 0.3759613037, 0.0571830198, -0.0936128423, 0.1096083224, 0.1519050598, 0.031431742, 0.1592656225, -0.0070394687, 0.2722210288, 0.5062954426, 0.0325846672, -0.1097388193, -0.2741949856, 0.0390066952, -0.0975884497, -0.0104547851, 0.0155984815, 0.3920154274, -0.2077858448, -0.088322565, -0.0510441214, 0.3311131299, 0.0445093513, 0.0752524883, -0.4925263524, 0.2965707481, 0.1937415004, 0.0449452661, -0.0557093211, 0.4923798442, 0.1975071728, 0.2907035947, -0.3964490294, -0.1536382586, 0.3064301014, -0.1346030384, -0.1018176302, -0.5227096677, 0.2753643095, -0.2127099633, 0.0281952452, -0.5799023509, -0.2829040885, 0.124026306, -0.1634377688, -0.6094950438, 0.2030109167, -0.1839464009, -0.1443094909, -0.0127473772, -0.2461770922, 0.1673191935, -0.1142710969, 0.109600082, -0.1046538353 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Thanks for reporting. I'm going to fix that and add a test case so that it doesn't happen again :) I'll let you know when it's done. In the meantime, if you could make a Google Colab that reproduces the issue it would be helpful! @timothyjlaurent
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a DatasetDict that has rows but, if selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
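The `Fatal Python error: PyCOND_WAIT(gil_cond) failed` line at the end of the log is characteristic of fork-based worker processes inheriting a parent that already has live threads (which is also why the `TOKENIZERS_PARALLELISM` variable comes up in this thread). A stdlib-only sketch of the safe pattern — a pure, module-level worker function and a guarded entry point, all names hypothetical — looks like this:

```python
import multiprocessing as mp

def add_length(example):
    # Pure, module-level function: picklable and safe to run in a
    # forked or spawned worker process.
    return {**example, "length": len(example["text"])}

def parallel_map(examples, num_proc=2):
    # Create the worker pool before starting any threads in the parent,
    # mirroring the kind of multiprocess map that num_proc does internally.
    with mp.Pool(num_proc) as pool:
        return pool.map(add_length, examples)

if __name__ == "__main__":
    data = [{"text": "a"}, {"text": "bb"}]
    print(parallel_map(data))
```

The `if __name__ == "__main__"` guard matters with the `spawn` start method (the default on macOS and Windows), where child processes re-import the main module.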
47
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns `train` and `test` splits that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a DatasetDict that has rows but, if selected from or sliced into, returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. OK, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Thanks for reporting. I'm going to fix that and add a test case so that it doesn't happen again :) I'll let you know when it's done. In the meantime, if you could make a Google Colab that reproduces the issue it would be helpful! @timothyjlaurent
[ -0.4121121764, -0.1406522393, -0.0332483165, 0.2547086179, 0.1527838558, -0.1575675607, 0.30054003, 0.2842244506, 0.1220512018, 0.1786370575, 0.1057422608, 0.4333916903, -0.4509958327, 0.4070869088, -0.353771776, 0.0146761471, 0.1161235347, -0.0957588106, -0.347450912, 0.2302013636, -0.286965698, 0.2845574915, -0.3517573476, 0.1021789312, -0.5433181524, 0.1019153222, -0.0670115277, 0.2673822045, -0.008146558, -0.5242772102, 0.3323917985, 0.2354477197, 0.0166101605, 0.4473762214, -0.0001247603, 0.1054001525, 0.4095133841, -0.0857338011, -0.0827230141, -0.3554014564, -0.2497745305, -0.1529172957, 0.3264889717, 0.1437780559, 0.1948262751, 0.1137632877, -0.10991963, -0.2486579418, 0.2269988656, 0.4095129669, 0.0959755555, 0.1484623253, 0.2599981725, 0.1074938178, 0.0507763103, 0.1133370996, -0.0529269055, 0.0704245865, 0.3206124604, -0.3827883601, -0.0285130106, 0.2651743591, -0.2205008864, -0.2288653851, -0.1529171467, -0.1964246631, 0.5769399405, -0.6709401608, 0.2063276619, 0.0575010553, -0.1395636052, 0.0107133985, -0.4084627628, -0.174831301, -0.2404251248, -0.5073004365, 0.2954609394, 0.0732230023, -0.1645456403, 0.1377579272, -0.5124663115, -0.2298982739, -0.0507230908, -0.0670411214, -0.1249320209, 0.5671178699, 0.1926735342, 0.250354439, 0.17252931, 0.1166618392, 0.2252906859, 0.0004574787, -0.1084870398, 0.172709167, -0.4143944383, 0.0668924972, 0.0559623949, -0.0301124789, -0.2576619387, 0.0049357992, -0.2148028314, 0.1003774405, -0.1257175207, 0.0608673804, 0.4142849147, -0.0701465607, 0.1900148541, 0.2748541236, 0.2734546363, -0.2022540718, 0.0313616395, 0.0276479311, 0.3244480193, -0.0992099717, -0.0978158042, 0.3083828092, 0.1657087356, -0.2255907208, -0.1998538077, 0.1551363915, -0.1435679793, -0.1698982716, 0.0583346225, 0.1578018963, -0.0012237132, 0.7389064431, 0.1697744429, 0.2071861327, -0.2773302794, -0.0702123418, -0.0462488644, -0.0458043106, -0.3989475369, 0.1445713192, 0.2839929163, 0.0619041361, -0.0171849802, 0.0610628277, -0.3289621472, 
-0.289817661, 0.1121147946, -0.2676971555, 0.0600719228, 0.6653482914, -0.0666880906, 0.0173053369, -0.0513037778, -0.1929861903, 0.1065103561, 0.27547279, -0.5094781518, -0.1835172474, -0.238602981, 0.0853221416, 0.0016957447, 0.2611216307, -0.1982410848, 0.2716642022, 0.3829973042, -0.3211612701, -0.2798916101, -0.3254051208, -0.3995459378, -0.3155678809, -0.0366555005, 0.3317926526, -0.4338263869, -0.0076312721, 0.018109411, -0.1475034952, 0.3232699335, 0.1963199526, 0.009014722, -0.033404883, -0.0844932199, 0.0571833998, 0.013590578, -0.1737549007, -0.0926609933, 0.2674303353, 0.0592566431, 0.3397593796, -0.0784275234, -0.2674648464, 0.2675085664, -0.1609187126, 0.3010760546, -0.0502837151, -0.1990885586, 0.058914125, -0.4480962455, -0.0225977786, -0.0723856837, -0.0604243577, 0.3800570369, 0.250186801, -0.1262992173, -0.5661560893, 0.2953266203, -0.1131383553, 0.3038972914, 0.0671970919, 0.172844559, 0.0754973739, 0.1271312535, -0.2823401093, -0.5102092624, 0.2077349871, -0.2473969758, 0.0669734925, -0.0537819937, 0.0269371122, 0.1027749181, 0.1855366379, -0.1827418953, -0.195130676, 0.0312554985, -0.0589821748, -0.337864846, -0.0385567658, -0.1812933683, 0.6933580041, 0.1202342883, 0.1226820946, -0.1053095385, 0.2600259483, -0.1650941521, -0.4068417549, -0.2427135408, 0.1782925576, -0.0046846326, -0.0679922551, -0.1938951612, 0.4690291584, 0.4765936732, -0.0957910269, -0.0227427837, 0.1581679285, 0.2636247873, -0.2274561524, -0.124952741, 0.0475178733, 0.0494187921, -0.190145269, 0.1769474894, 0.4724966884, 0.083590813, 0.4071724415, 0.1771290004, 0.2576021552, 0.2519412339, -0.002774328, -0.0431509279, -0.1274017543, 0.1881029159, -0.1360895932, 0.1354997754, 0.0153030138, -0.191150099, -0.1967400759, 0.0656924099, 0.0238860399, -0.1781354994, 0.0698436648, 0.0506458208, -0.0983655527, 0.1394211054, 0.091566205, 0.4157840014, -0.0159222167, -0.2074044943, 0.2007093877, -0.252638638, -0.0316540003, 0.1466476768, -0.0746338367, 0.4457933605, 0.2905402184, 
0.1170275137, 0.039616216, -0.1530811787, -0.2957580984, 0.0840764046, 0.3804650009, -0.5017094612, 0.2247674763, -0.2366140932, 0.3143703938, 0.0710459799, -0.1635519266, -0.3381691873, -0.5497385263, -0.1408771425, 0.5135914087, -0.0627145171, 0.074720934, -0.003652975, 0.0951412544, -0.2184603214, 0.3544716239, -0.1666019261, -0.3664503396, -0.1646608561, -0.0982068032, 0.3514309824, -0.26840204, 0.2071597129, 0.1605088413, -0.2898549736, 0.1963261962, -0.3250333667, 0.0755513459, -0.0649645925, 0.0667984486, 0.0770853758, -0.053613726, 0.0180142671, -0.1801130176, 0.1070768684, -0.1326915175, -0.2237789333, 0.2805934548, -0.152341187, 0.0745680779, -0.260884881, -0.3176511824, -0.4118557572, -0.088594012, -0.0634895414, -0.1984794438, 0.3380075395, 0.1842394918, 0.038436275, -0.0361017063, 0.0183381438, 0.0001206174, -0.0855408907, -0.0190587938, -0.1976471543, 0.0187300108, -0.1217799783, 0.0045979843, 0.0779644102, 0.0880388916, 0.4316267371, -0.3409131169, 0.0082468502, 0.0334049836, -0.1302685738, 0.2284379154, -0.2176263481, 0.3201876879, 0.4128196239, 0.07419388, -0.0453655086, -0.2277773619, 0.0828586519, 0.0422933958, 0.0735567585, 0.1102713197, 0.2954916954, 0.0971602201, 0.5779756904, 0.3713935018, 0.1077597439, 0.3553506434, -0.0479933359, 0.1098851264, -0.1997596323, -0.3643078506, -0.1469734013, -0.3254578114, -0.0515600443, -0.2147625536, -0.1002347171, -0.4813914299, -0.3708154559, 0.6098027825, -0.3687625229, -0.2261604816, -0.0518781096, -0.5061079264, 0.258155793, -0.1039816737, 0.0180869699, 0.0079532117, -0.0503299385, -0.2855007052, 0.1941619217, 0.1236154884, -0.3415811956, -0.1887843907, -0.2972683012, -0.4389138818, 0.3179172873, 0.2152687311, 0.6413888335, -0.0004073679, -0.0335091017, 0.0526657328, -0.1484398097, 0.8724443316, -0.6391437054, -0.3901263773, 0.3212070763, -0.3892723024, -0.2305483222, -0.1832405627, -0.2340836823, 0.5545515418, 0.3665709496, 0.4201633036, -0.0816423893, -0.2458472252, 0.2316584587, -0.1978904307, 
-0.1097210944, 0.0331493132, -0.3254032433, -0.0265205279, -0.1062934473, 0.0646465346, -0.2433365583, 0.092380181, -0.2778443098, -0.0833830461, -0.0638622418, -0.0533601195, 0.2049311399, 0.0473100916, -0.0129923932, -0.1951615959, 0.0472442806, 0.0210987851, 0.1435595751, 0.2934770882, 0.3029739261, -0.4163867533, -0.0815623105, 0.1727876365, -0.0032638945, 0.2762564421, 0.3233978748, 0.2592428625, -0.1373687387, 0.0007327478, 0.1996778995, -0.2637887299, -0.0252385326, 0.3556055427, -0.04554452, -0.5671121478, -0.1211097389, 0.0051766206, 0.419986546, -0.1998381168, 0.5653173923, -0.3575083613, -0.2795490324, 0.4127808213, 0.1079264432, 0.8483623862, -0.3829786181, -0.0730304718, 0.0671945661, 0.0268008411, 0.1179682016, 0.106417194, 0.0879062563, -0.3112837374, -0.0391734838, -0.0562888198, 0.027516827, 0.2479698807, 0.1572479457, -0.2271204591, 0.4100711346, -0.0520968959, -0.138106972, -0.008578999, 0.103373751, 0.3932399154, -0.0094014592, 0.1235440522, 0.002807226, -0.0387011543, -0.1200738475, 0.0052092783, 0.0790052414, 0.241916433, -0.1296008378, -0.5135248899, -0.0409996696, -0.1335324198, 0.0750276595, 0.2297870219, 0.0510988832, 0.0216665864, 0.1184398085, -0.0464342795, 0.0676526576, -0.1435887665, -0.0928344429, 0.0835634023, 0.3170044422, 0.1958111376, 0.0367336236, 0.3408839703, 0.093686074, -0.250216186, 0.0252610184, -0.2338946909, -0.2264738977, -0.0180615187, 0.0726364553, -0.0585123524, 0.1486911923, 0.1381905675, 0.0091720521, -0.11441347, -0.191119507, 0.0475962162, -0.0709196031, -0.0892757326, 0.5210028887, -0.0350486822, -0.3730114102, 0.011976013, 0.4310204387, 0.2444378734, -0.0667292476, 0.1065037251, 0.1603958309, 0.0003414825, -0.2310717106, -0.2428838313, 0.1716859937, -0.1189797372, 0.3094240427, 0.0381290726, -0.1944820881, 0.252836585, 0.0619287901, 0.0322726294, 0.3911169767, -0.1844839603, -0.3591222465, -0.4126454592, 0.2500077784, 0.0484349914, 0.1106884181, -0.1846469641, 0.1446647495, -0.0302458666, 0.2589549124, 
-0.212210983, 0.1527973264, -0.1018272489, 0.2051261365, -0.0535929427, -0.0008354424, -0.0993240774, -0.1431499869, 0.0514931343, -0.0185621697, -0.1838876754, -0.0198912565, 0.0312996656, 0.1376205832, 0.0847335607, -0.1993035227, -0.0601022989, -0.0665639937, -0.1236639023, -0.2029265165, 0.2158675194, 0.3959046602, -0.2078519613, 0.1407669187, 0.1893223822, 0.1952369511, 0.1439823359, 0.3243098855, -0.2149883807, 0.0778925046, 0.0814794227, 0.4155634642, 0.1775706112, -0.1909879446, 0.0620324612, 0.1037419587, -0.1445656866, -0.0004130667, 0.3068567514, -0.2390922159, 0.0455286056, -0.0586905591, 0.4123046398, 0.4355722964, -0.117985785, 0.1110971943, 0.1410022527, 0.1002048999, 0.0493219271, -0.1471813023, 0.4749157727, 0.1149629205, 0.0238514096, 0.3524883389, 0.2848345041, -0.1469901055, 0.1850553751, 0.1636670232, -0.0338793732, -0.0026761387, 0.2539981008, 0.0815501213, 0.0690929592, 0.2622005045, 0.1822578758, -0.3103899956, -0.1121326461, 0.2957110405, 0.0121205263, 0.4673572779, -0.0692836419, 0.0651765615, -0.0891272426, -0.6329889894, 0.0578953661, 0.1388345659, -0.5441830158, 0.2122362256, -0.0096857958, 0.2614073753, -0.3864083886, -0.4462727606, -0.1971578598, 0.4130581319, -0.2505245209, -0.3078999817, -0.2264650315, -0.1703546941, -0.12696518, 0.1096105874, -0.0965497792, 0.193041265, 0.825537622, 0.0335464701, -0.1240116805, -0.1953441948, -0.3356648088, 0.0009226287, 0.3389829397, -0.0118955225, 0.3929479122, -0.0182974003, 0.0001004692, -0.0541469418, 0.1687266827, 0.4923532009, 0.7186238766, -0.520319283, -0.0086600445, 0.1379936934, -0.0658949167, -0.0626947135, 0.3690218925, 0.1728775054, 0.2969073951, 0.2561486959, -0.066046834, 0.0455558822, -0.1087192297, 0.3275896609, 0.3738721013, 0.0244513676, 0.2912444472, 0.0596283898, -0.0805849135, 0.2058670223, 0.0789361745, -0.3390071988, -0.2918818593, 0.4837103486, -0.4562887251, 0.2314362079, 0.261420399, 0.0443422459, 0.1067173779, 0.4937182367, 0.3471800685, 0.1562868655, -0.2608990967, 
-0.2374809384, -0.3877944946, -0.0001012571, -0.196065858, 0.3422501981, -0.029125195, 0.0112806261, -0.0700282007, 0.0646047145, -0.0437067151, -0.2140721679, 0.3408593535, 0.3193296194, -0.5734247565, 0.205778718, 0.0849360675, -0.0592480823, -0.0117021389, -0.2920574546, 0.3183324337, 0.0405557081, -0.1181762516, 0.0914626718, 0.158492282, 0.0675963089, 0.164799884, 0.0278845616, 0.3071470857, 0.6216836572, 0.002416946, -0.0157923326, -0.2677760124, 0.0197451338, -0.0921238363, -0.0592221022, 0.0696673244, 0.3804355264, -0.158795774, -0.0497491658, 0.0476219691, 0.2289707065, 0.0303715467, 0.2470555604, -0.442496419, 0.1602878571, 0.194182232, 0.0280080549, -0.1409194767, 0.3804940283, 0.1256973445, 0.3354587555, -0.3165168166, -0.1907715499, 0.290499717, -0.00063866, -0.1257483214, -0.603508532, 0.2105589509, -0.1881111711, 0.1225868613, -0.526596427, -0.3557214439, 0.1513364911, -0.1979541481, -0.5773920417, 0.2608497441, -0.1053142399, -0.1814596951, -0.0679982156, -0.207314536, 0.1469495296, -0.1275481284, 0.1519544125, -0.0671169013 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Thanks @timothyjlaurent ! I just merged a fix on master. I also checked your notebook and it looks like it's working now. I added some tests to make sure it works as expected now :)
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
35
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Thanks @timothyjlaurent ! I just merged a fix on master. I also checked your notebook and it looks like it's working now. I added some tests to make sure it works as expected now :)
[ -0.4193555415, -0.1071408316, -0.0392602608, 0.2147694826, 0.1458775848, -0.1598081142, 0.2604387999, 0.3937073648, 0.0989543796, 0.1513859481, 0.019559443, 0.4504708052, -0.3637402356, 0.4398678541, -0.3191183209, 0.0345803462, 0.0604739562, -0.0270158499, -0.3903121352, 0.2088315934, -0.2636983693, 0.2989493608, -0.3305637836, 0.1506017596, -0.5216232538, 0.1386965364, -0.1130685359, 0.265317142, -0.1030746251, -0.4939133823, 0.2864599824, 0.2612055242, 0.0879325122, 0.468968153, -0.000114397, 0.1324707121, 0.3333234191, -0.1248082742, -0.079808563, -0.4356536269, -0.1579090655, -0.255066216, 0.28728351, 0.1007007509, 0.074927032, 0.0389388241, -0.0903253406, -0.1414097399, 0.2836946249, 0.4652613997, 0.193415761, 0.2085707486, 0.2621925473, 0.055845473, 0.0510042831, 0.0678697228, -0.0874516517, 0.0078278296, 0.3059770763, -0.4177973866, -0.1298086345, 0.3312831223, -0.1521879137, -0.2131038308, -0.0349630751, -0.1924247593, 0.5803085566, -0.5957254171, 0.212593466, 0.0949260443, -0.1266431063, -0.0087136794, -0.3816847205, -0.1348067224, -0.2131177187, -0.4743328691, 0.2450662255, 0.0600353256, -0.2198232114, 0.1639678031, -0.4446832538, -0.1741558909, -0.0448269024, -0.1304738075, -0.1257173717, 0.5451285839, 0.1489353031, 0.2062474936, 0.2068863958, 0.1561336815, 0.2369425297, 0.0344260596, -0.0569591001, 0.189568013, -0.4668981433, 0.0611613467, 0.0921529979, -0.0954743326, -0.2770741582, 0.0083476994, -0.088118732, 0.1364299357, -0.001621753, 0.0187216774, 0.3237476647, -0.0298316609, 0.1760365665, 0.2339706868, 0.2529139817, -0.1339114606, 0.0106225461, -0.0320672914, 0.2401254326, -0.161889568, -0.123891905, 0.2203140557, 0.1702691168, -0.2124933004, -0.2123427689, 0.1235535294, -0.0652695075, -0.1341297626, 0.088326253, 0.2103798091, -0.0055051781, 0.636483252, 0.2363898754, 0.1668307483, -0.3084967136, -0.0742961466, -0.1081770658, -0.0752083361, -0.4380477667, 0.0819579586, 0.3093464971, 0.0654765666, 0.0247547477, 0.0464761406, -0.3424580097, 
-0.1757464409, 0.1699208766, -0.2142256349, 0.0935266539, 0.6487028003, -0.0624322779, -0.07298778, -0.0194962285, -0.1211730465, 0.1072039604, 0.2887681425, -0.4147420824, -0.1756030321, -0.144833073, 0.1784070581, 0.0624779835, 0.1968681663, -0.1354860514, 0.2495195568, 0.341288656, -0.2765167654, -0.295017153, -0.2627282739, -0.3219246864, -0.2649621367, 0.008839421, 0.1723242551, -0.4067886472, -0.0042886063, 0.0150059462, -0.1794582009, 0.2717071772, 0.1691128612, -0.0282547325, 0.0070829652, -0.0883457884, 0.0955987573, 0.0768796951, -0.1691039503, -0.1678663641, 0.2952839136, 0.03390266, 0.2496965826, -0.0105179474, -0.2364145815, 0.3208428621, -0.0797642469, 0.2444716096, 0.0074710958, -0.1422508359, 0.0394447669, -0.4914547801, -0.0142362304, -0.0799177289, -0.0029364079, 0.4351004064, 0.1397290826, -0.1918193102, -0.5129802227, 0.3801808655, -0.1166737452, 0.3654145002, 0.0820591003, 0.1434986591, 0.1009434238, 0.1733540297, -0.2495147288, -0.357304275, 0.2332868576, -0.2872868776, 0.1226685867, -0.0040783696, 0.0095777735, 0.0459843352, 0.0689832121, -0.2158462107, -0.2534332573, 0.1461217254, -0.0497325249, -0.2649740875, -0.0695729852, -0.2448035181, 0.6643337607, 0.1081317663, 0.128610909, -0.120522432, 0.2059647143, -0.2012666017, -0.413680464, -0.2300349176, 0.17719993, -0.0337787345, -0.0788786411, -0.1963766366, 0.4693377614, 0.4146350622, -0.2028422654, -0.0868469775, 0.1255910844, 0.19121252, -0.150628686, -0.1443392932, 0.1170539707, 0.0710577965, -0.1876677573, 0.13430655, 0.5086151361, 0.0340315327, 0.3584800363, 0.2193986326, 0.2494550049, 0.3088573515, -0.0572911277, -0.0569905378, -0.1041018367, 0.1401895583, -0.165710479, 0.1885277927, 0.0321425721, -0.1542482972, -0.2205115557, 0.0744972825, 0.0366966128, -0.2033706754, 0.0992500484, 0.0222153999, -0.07387124, 0.1572188437, 0.0868036672, 0.4352721572, 0.0814120397, -0.1514408886, 0.2476586103, -0.268573761, -0.0336245336, 0.1898120642, -0.1886326075, 0.3337744474, 0.3131486177, 
0.0016618632, -0.0645069852, -0.2526162863, -0.2312239408, 0.1290822476, 0.3988621235, -0.4570884109, 0.1193112656, -0.2042875886, 0.2391608655, 0.0630440563, -0.1325521916, -0.2879110575, -0.491448909, -0.0683127046, 0.5388637185, -0.0008213818, 0.1363488734, 0.0390865505, 0.1202902794, -0.1639821976, 0.3817958236, -0.1218740642, -0.3393019736, -0.234685272, 0.0099810399, 0.3615125716, -0.15585123, 0.2152870744, 0.089822121, -0.2461387366, 0.1259488016, -0.3414518535, 0.0274592191, -0.1188349649, 0.0474602953, 0.0826992467, -0.0263414606, -0.0158583745, -0.1403869092, 0.1464484185, -0.1827645302, -0.2847910523, 0.337277621, -0.101634182, 0.0589414872, -0.2651586533, -0.358774811, -0.3944588006, -0.178994447, -0.0360177718, -0.203137964, 0.2997395396, 0.159422949, -0.0117755383, 0.0170342401, -0.0087830611, 0.0573238134, -0.1083472073, 0.010376364, -0.1428324431, -0.0703979582, -0.2448673248, 0.0341879502, 0.1254528463, 0.1972051114, 0.3103442192, -0.3343865573, 0.0510899425, -0.0141257625, -0.0532011501, 0.2020153105, -0.1372795999, 0.4295085669, 0.3030901253, -0.0185518377, -0.0289914533, -0.1327069551, 0.0661372989, -0.0121596139, 0.111738272, 0.026613418, 0.2794806957, 0.1757087857, 0.5256997347, 0.3726459146, 0.1225583553, 0.3527097702, -0.0770343393, 0.1140256673, -0.2462856174, -0.422267139, -0.1674268842, -0.3445953131, -0.030351501, -0.1905118823, -0.1223336905, -0.4686016738, -0.3261253238, 0.4740754068, -0.2929596305, -0.2703279853, -0.0009389278, -0.3815760911, 0.2553584576, -0.0103184208, 0.0509468094, -0.0073611997, 0.0065026581, -0.2341532111, 0.2373699695, 0.0275295079, -0.3430078924, -0.2495887876, -0.2013951242, -0.4698580503, 0.3255583346, 0.0736431181, 0.5758504868, 0.0664781928, -0.1557085663, 0.0528419316, -0.1451818943, 0.8551934361, -0.5291438699, -0.4163533151, 0.3530488312, -0.4446541071, -0.2986303568, -0.2293095887, -0.2401957363, 0.4804591537, 0.3630556464, 0.4405976236, -0.0835955292, -0.3271422088, 0.2073356509, -0.2588737011, 
-0.0956456587, -0.0193275623, -0.3289504349, 0.0026641563, -0.1439436078, 0.0806324631, -0.2556238472, 0.1615919769, -0.2231429964, -0.1014842242, -0.13607198, -0.1502159685, 0.1863696277, 0.1315479428, 0.0114302561, -0.2366319299, 0.0945368409, 0.0136123803, 0.1998774707, 0.2887660861, 0.2445152104, -0.4175809324, -0.1409907192, 0.1858454496, -0.0981284082, 0.2522604167, 0.3151223361, 0.2014823109, -0.1213719249, 0.0429505706, 0.1488639265, -0.2819695771, -0.0143474974, 0.3474960029, -0.0276893992, -0.4713940024, -0.1396847665, -0.0640942603, 0.3640648127, -0.1863229275, 0.4500079453, -0.424341917, -0.3039737344, 0.4157575369, 0.0357804149, 0.8078531027, -0.3515240252, -0.008246691, 0.0785526186, -0.0528229885, 0.0444092602, 0.1244348437, 0.1305841208, -0.3808038235, -0.1677160859, -0.0221061185, 0.0908226222, 0.3004826605, 0.159740746, -0.2783437073, 0.3571473956, 0.0343681574, -0.132884413, -0.0370978862, 0.161580503, 0.3707353175, -0.0322962962, 0.053435754, 0.0826027468, 0.0088986661, -0.1095329523, 0.0225269236, 0.0060695149, 0.2187006474, -0.1285784394, -0.3923016489, -0.1209193766, -0.1604750454, 0.1002306491, 0.1534647346, 0.0861689001, -0.0856872723, 0.0591870546, -0.0495352037, 0.1008598357, -0.1607716084, -0.0901115015, 0.0909275264, 0.3196434975, 0.1687978804, -0.0326879993, 0.3603780866, 0.0404225066, -0.2821493149, 0.0309959948, -0.1847841144, -0.230251357, 0.092438817, 0.0527892187, 0.0068404037, 0.0545469113, 0.1035414562, -0.0089265108, -0.1424870789, -0.2472642809, 0.1489106417, -0.1039117575, -0.2403705269, 0.4878769815, -0.0406043418, -0.3871282935, 0.0060363654, 0.5196285844, 0.2512221634, -0.0421018787, 0.1161166131, 0.1134921312, -0.0052114502, -0.260379076, -0.1053169668, 0.3555202782, -0.1472781897, 0.269765377, 0.0772921741, -0.232399255, 0.2793720663, 0.1730700582, 0.1864338964, 0.343708545, -0.2330507934, -0.2502617836, -0.5016089678, 0.2002421618, 0.0906714201, 0.1555781215, -0.1567182243, 0.1033435166, 0.0227642283, 0.2380064428, 
-0.3144082427, 0.161868602, -0.1278698593, 0.1995832473, -0.0874242485, -0.0639707968, -0.1013249978, -0.1288786083, 0.1278576404, 0.0110289138, -0.218310982, -0.1269756407, 0.0414432064, 0.0959979519, 0.0903681517, -0.1801793575, -0.082994394, -0.0894428492, -0.1912934929, -0.1733567417, 0.2130331993, 0.3244562149, -0.2227032185, 0.1553201377, 0.2211142778, 0.052062057, 0.1144619361, 0.212403357, -0.2451877743, -0.0374416932, 0.0753423348, 0.33918643, 0.1671212763, -0.156478405, 0.069988966, 0.0722217858, -0.1351472437, 0.0325766467, 0.3185409009, -0.2506427169, -0.0078419149, 0.0133409798, 0.4798088372, 0.4465433061, -0.0354574844, 0.0300006121, 0.1677161455, 0.2217456251, -0.0637998134, -0.1488430798, 0.4273239374, 0.0264336504, 0.0895142555, 0.2835230231, 0.2823503613, -0.1218490377, 0.1831268966, 0.1957114786, -0.1121931225, -0.0428236574, 0.2549168468, 0.1816566139, 0.0929491818, 0.2404346615, 0.2450755686, -0.3198013306, -0.1236823499, 0.3709717691, -0.0579793118, 0.4727439284, -0.0739560127, 0.1659109294, -0.1547361463, -0.603544116, 0.046697855, 0.0995773524, -0.5765849352, 0.1664244831, -0.0589707606, 0.2135978043, -0.3429758847, -0.4492861331, -0.2299191654, 0.3815017939, -0.3193085194, -0.2784167528, -0.2968037426, -0.1927313209, -0.1144059375, 0.0155420899, -0.0980922803, 0.2728919983, 0.7926215529, 0.0167064518, -0.0766708404, -0.1761129349, -0.3733614683, -0.0104629733, 0.2841352224, -0.0526120812, 0.3761050701, -0.0376657434, 0.007854471, -0.0527077988, 0.2472107112, 0.4495351613, 0.7589015961, -0.4396674931, 0.0291391835, 0.1385424137, -0.0769143328, -0.1264016032, 0.3904379904, 0.2272908688, 0.3436400294, 0.2906417847, 0.0186777562, -0.0276865475, -0.0531581268, 0.1893318743, 0.2868371308, -0.059209343, 0.2951443195, 0.1460801363, -0.0888968706, 0.1683427393, 0.1084668785, -0.3899185658, -0.2188167125, 0.5099228621, -0.4190783203, 0.2469061613, 0.14334324, 0.0981122106, 0.0651601106, 0.5027341247, 0.2623903155, 0.0760018528, -0.3008479476, 
-0.2206921875, -0.4730150104, -0.0301168561, -0.2012248039, 0.293489188, 0.0006073602, 0.0507427976, -0.0133373551, 0.0263470709, -0.0222062021, -0.3289546967, 0.2724011838, 0.3368607163, -0.5882843137, 0.2075415254, 0.1327597499, -0.055043906, 0.0063747093, -0.3181928694, 0.3201174438, 0.0733074695, -0.0068844855, 0.1621212661, 0.1773039103, 0.1711472422, 0.2259214818, 0.0468775779, 0.2819206417, 0.4912547171, 0.0361806601, -0.0009608865, -0.3127284646, 0.1062096655, -0.1507224292, -0.0046324059, 0.1694762111, 0.4342520237, -0.1230453029, -0.1056579426, -0.0087014101, 0.2750090957, 0.0663077012, 0.1348061562, -0.3960787356, 0.2369553149, 0.231428802, 0.0715241283, -0.1251009256, 0.4445699155, 0.1168591231, 0.2360393405, -0.3626089096, -0.234695673, 0.2616027594, -0.0078331307, -0.1696406454, -0.6140576005, 0.181479156, -0.2047001123, 0.1055036187, -0.5150936842, -0.3300754726, 0.1696690619, -0.1691147983, -0.5518654585, 0.2802895606, -0.0358732194, -0.1203281134, -0.0870615989, -0.2423839569, 0.1596477032, -0.1952756941, 0.0383123904, -0.1134012863 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
Great, @lhoestq. I'm trying to verify in the colab: changed ``` !pip install datasets ``` to ``` !pip install git+https://github.com/huggingface/datasets@master ``` But I'm still seeing the error - I wonder why?
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```
32
map/filter multiprocessing raises errors and corrupts datasets After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ``` Great, @lhoestq. I'm trying to verify in the colab: changed ``` !pip install datasets ``` to ``` !pip install git+https://github.com/huggingface/datasets@master ``` But I'm still seeing the error - I wonder why?
[ -0.4416671991, -0.1638825536, -0.0333689637, 0.2726552784, 0.1398596019, -0.148854062, 0.2888737917, 0.2428043783, 0.1233454719, 0.2103497386, 0.0198595077, 0.5049684048, -0.3794002235, 0.3664865196, -0.3114084303, 0.0422242917, 0.08237423, -0.016495131, -0.3751585782, 0.2027817667, -0.1774682105, 0.2914340198, -0.3229135573, 0.1249254793, -0.5289020538, 0.0481471792, -0.0232044049, 0.2264145166, -0.1429691613, -0.5102816224, 0.441221714, 0.2774358988, 0.0068001151, 0.4950698316, -0.0001267765, 0.1621521711, 0.4178574085, -0.1281794757, -0.1998688579, -0.4242437482, -0.2842195034, -0.2430697381, 0.3733288944, 0.0529887676, 0.1932427436, 0.1888235509, -0.0460893139, -0.2160017639, 0.268132776, 0.3944464624, 0.0823459849, 0.2147491425, 0.2474288195, 0.1046159044, 0.0297514889, 0.1779660732, -0.0468290858, 0.0709810406, 0.3152909875, -0.275164187, -0.0240558311, 0.2880619168, -0.258749485, -0.0838412344, -0.2140990645, -0.1705379337, 0.4943110645, -0.6263380647, 0.1955991834, 0.0815576464, -0.1406649798, -0.0748994723, -0.4533850849, -0.2244769931, -0.2576600909, -0.5030207038, 0.2255760282, 0.0383145958, -0.2167300731, 0.1187202483, -0.5351098776, -0.2030428499, 0.0226367861, -0.0981147438, -0.1145921052, 0.5615175962, 0.0796509162, 0.3092216849, 0.2317219079, 0.2045358568, 0.2542182505, -0.0397199541, -0.0713265464, 0.1538145244, -0.3840870857, 0.0325843394, 0.066913344, 0.1032205373, -0.1992351711, -0.0264024623, -0.2357918322, 0.0283500291, -0.1383862048, 0.0962060988, 0.4001835883, 0.004329592, 0.138394475, 0.2871504128, 0.2604064643, -0.1034701616, 0.0161072165, 0.0157794692, 0.3302516043, -0.0691926554, -0.1319175661, 0.3151137233, 0.2251547426, -0.2654106617, -0.2431906909, 0.0499480814, -0.0663371533, -0.2566159964, 0.0794135705, 0.1817842722, -0.04913323, 0.6901753545, 0.0495948344, 0.1465943158, -0.3301030695, -0.0150012039, -0.0359649435, -0.0239536818, -0.3363149166, 0.0713681355, 0.2595930099, -0.0516899973, 0.0886393636, 0.0327408873, -0.3011176586, 
-0.2104311436, 0.1050219163, -0.3483008742, 0.0540351197, 0.7567945123, -0.0330231152, 0.1069522575, 0.0345673449, -0.1978489012, 0.0832137764, 0.214373529, -0.5603201389, -0.231054157, -0.2536770403, 0.032403674, -0.0642948672, 0.2737278342, -0.3409032226, 0.2217782736, 0.4327419698, -0.394970268, -0.2841384113, -0.3484717011, -0.379827857, -0.2361612171, -0.0399759598, 0.3014242649, -0.4129751921, -0.0434270725, -0.0027414076, -0.2266345173, 0.4127742052, 0.208297044, -0.0585957244, 0.0484886914, -0.1754722446, -0.0414130688, 0.1345537454, -0.2448754907, -0.1387611926, 0.2229837775, 0.0678845793, 0.3529626727, 0.0010163337, -0.2929704785, 0.2706998289, -0.1585669667, 0.2899982631, -0.1811328232, -0.2223001271, 0.0575291738, -0.4279884398, -0.0444759279, -0.1362227052, -0.0758407861, 0.3197179139, 0.1904803962, -0.0917356312, -0.4803736806, 0.2914342582, -0.0974826068, 0.274980247, 0.0469413511, 0.1833523065, 0.0583573729, 0.086080566, -0.2779420316, -0.5560168028, 0.1758854389, -0.1128364801, 0.073460713, -0.181999594, -0.0033156723, -0.0047647581, 0.2145401239, -0.2078026533, -0.1872451305, 0.0109828748, 0.0678199008, -0.2397843897, -0.0476465188, -0.1803410053, 0.7802556157, 0.0948858932, 0.1407018602, -0.1968694627, 0.2188931108, -0.177064985, -0.3737702966, -0.2619193196, 0.2147623599, -0.0545190312, -0.1538221836, -0.220083341, 0.3961576223, 0.4616065025, -0.1174414754, 0.0055228844, 0.1043646038, 0.2650129795, -0.1628607959, -0.1947048753, -0.0392285883, 0.0907205939, -0.1257075965, 0.1508947015, 0.3900073171, 0.0292395391, 0.3952741027, 0.1306958199, 0.1663777381, 0.3614394367, 0.0658567622, -0.0800637901, -0.1253149211, 0.1883156002, -0.0916731805, 0.1900224835, 0.0967750922, -0.1025896966, -0.1702996343, 0.1979155689, 0.0816695541, -0.2172454596, 0.0747726411, 0.1999749243, -0.1187729165, 0.2165573388, 0.1139416397, 0.497181952, -0.0219782367, -0.1903030872, 0.1335119903, -0.2798820138, -0.016076684, 0.1259056926, -0.0240723044, 0.3989959061, 
0.2772563398, 0.1201241389, 0.0499582551, -0.1512105912, -0.3175324798, 0.0898073465, 0.4272669554, -0.4888765812, 0.2710613608, -0.2688697577, 0.2669653296, 0.0520053357, -0.2310134172, -0.2184807956, -0.5585610867, -0.1386850625, 0.4793066978, -0.0087357759, 0.1128477901, 0.0227434337, 0.0532077551, -0.1948924661, 0.3044201136, -0.2504920959, -0.3013235033, -0.1368225813, -0.1110468656, 0.3174394667, -0.2189760953, 0.19033584, 0.0410232246, -0.2808038592, 0.1614136696, -0.3627012968, 0.149413228, -0.1711052358, 0.2128979266, 0.189096868, 0.02449641, -0.0382205024, -0.1943717599, 0.104463622, -0.1192779765, -0.2969189286, 0.1680941135, -0.194264695, 0.0368096158, -0.2712196708, -0.3120831251, -0.4605070651, -0.0989578441, -0.0250401832, -0.1841690242, 0.2989398539, 0.1648742706, 0.0333092846, -0.0042152107, -0.0330411084, 0.082272172, -0.1137122214, -0.2012419105, -0.2248910367, 0.0750419572, -0.0986691564, -0.0048673451, 0.055027023, 0.1905470639, 0.3111855388, -0.3943759203, -0.1212724447, 0.023776656, -0.0814214125, 0.2374363244, -0.0792717785, 0.3408547044, 0.3825592995, 0.1019416898, -0.0926320553, -0.2722113132, 0.0375121124, 0.0219185725, 0.1019781232, 0.0665002167, 0.3042420149, 0.1053646058, 0.6797085404, 0.4086299837, 0.0721284077, 0.3929720223, -0.1085143834, 0.256846875, -0.2528786063, -0.5017324686, -0.1630623937, -0.3097388744, -0.0505200848, -0.1380235553, -0.1240748316, -0.4375979006, -0.3680297732, 0.6295223236, -0.4248672128, -0.2594051659, -0.102857776, -0.3583773077, 0.2096515596, 0.028438732, 0.0086304247, -0.0358936079, 0.0070898347, -0.2760525048, 0.3160624504, 0.0908877775, -0.2581807971, -0.183015123, -0.3940322995, -0.497841686, 0.3102759123, 0.2822065949, 0.5873110294, -0.0067520812, 0.0540216789, 0.0516174287, -0.1117564291, 0.8583498001, -0.5854833722, -0.4201729, 0.2404562831, -0.3516247272, -0.2097999156, -0.1688064039, -0.2396242321, 0.5918942094, 0.418576926, 0.3985381424, -0.2252148241, -0.3151472211, 0.3050109446, -0.1486114264, 
-0.0294340849, 0.0809883997, -0.4181890488, -0.1203430519, -0.2003677338, 0.0380933769, -0.214343369, 0.0938273519, -0.2853964269, -0.0301858112, -0.0720046461, 0.0014541224, 0.2060997784, 0.1112165749, -0.0534039922, -0.0441467352, 0.0129580051, -0.0271880273, 0.1129582822, 0.2813892365, 0.4119316339, -0.4542776048, -0.1205613017, 0.1787271649, 0.120381996, 0.2637926638, 0.2700822353, 0.1726485789, -0.1320697218, -0.0085809696, 0.2180731297, -0.1424890459, 0.1053131223, 0.3505743444, 0.0558370613, -0.5953501463, -0.15591681, 0.0195550695, 0.4437177777, -0.157619521, 0.5006453395, -0.2792439759, -0.2009060383, 0.3072484732, 0.0240256153, 0.7952353954, -0.3570335507, -0.0737250894, 0.1223780885, 0.0380155444, 0.183730185, 0.1191254705, 0.0349604785, -0.2109851539, -0.0451514311, -0.0598008558, 0.0279015601, 0.3046394885, 0.0712547153, -0.2756307125, 0.4917840064, 0.1518163085, -0.1268106103, 0.0093978234, 0.0813793167, 0.3517756462, -0.0840344876, 0.0386387557, -0.0358655117, -0.0800582618, -0.0291141029, -0.0343142711, 0.0512453616, 0.1818653494, -0.1677497625, -0.4614765346, -0.1657683253, -0.2538039684, 0.069696188, 0.1904787421, 0.0104360282, -0.0155544318, 0.1314590573, -0.0035093017, 0.0700658113, -0.1248801351, -0.1123268753, 0.0263515972, 0.3060842752, 0.1026720852, 0.0255046003, 0.3119353354, 0.1221000627, -0.2198853493, -0.0286990777, -0.3010326922, -0.2245678902, 0.2098783702, 0.1740581393, 0.0645140633, 0.0788518935, 0.0688025653, 0.029461652, -0.075245209, -0.21824117, 0.0113634923, -0.0564786904, -0.0776329562, 0.5352756977, -0.0748245567, -0.4235488772, -0.0209018961, 0.4053625166, 0.2002406418, -0.030287452, 0.1166048571, 0.0365924835, -0.0573126525, -0.1606635153, -0.2089416087, 0.0354487747, -0.2038573921, 0.2944544256, 0.0864200369, -0.2173196077, 0.2485120595, 0.1789841503, 0.0880026072, 0.3290329576, -0.193305403, -0.3450644612, -0.4518954456, 0.2483119518, 0.0230610576, 0.1391381621, -0.0045210794, 0.1524776071, -0.0317644142, 0.2493467182, 
-0.1979569346, 0.1260659993, -0.1137506664, 0.2610044479, -0.0130677177, -0.01834598, -0.0460456163, -0.096503213, 0.0362859592, 0.0321510881, -0.2552456856, -0.0227033179, 0.0939757079, 0.1423315704, 0.0806937516, -0.1288042963, -0.0636430383, -0.0062539726, -0.0939161032, -0.2363377362, 0.2578297853, 0.2937583327, -0.1800922006, 0.1154792011, 0.2336886227, 0.2067111135, 0.1157898605, 0.3178339303, -0.158494696, 0.0974223614, 0.0603501499, 0.4072880447, 0.0689703152, -0.150117144, 0.1449743658, 0.0592921712, 0.0115349144, 0.0536512323, 0.2454898655, -0.2691058815, 0.1378957778, -0.0138419941, 0.3978849947, 0.3953822851, -0.0577861071, 0.113966696, 0.1548535824, 0.0471148752, 0.0295292214, -0.0768274516, 0.5202013254, 0.1809115112, 0.1295008063, 0.4198644161, 0.3191752136, -0.1101615503, 0.3200749159, 0.1504368782, 0.0289671123, 0.0775547773, 0.2483991981, 0.1680109501, 0.0022479817, 0.2887739241, 0.2247948349, -0.2614194155, -0.1135329604, 0.3078895211, -0.0695222914, 0.5286961794, -0.0701678544, 0.1286159903, -0.0728626177, -0.6161950827, 0.1335552037, 0.0361780301, -0.4510497749, 0.182756722, 0.0080106687, 0.2531335354, -0.3910348713, -0.3703324199, -0.2195661068, 0.4271660149, -0.2437701672, -0.3405459225, -0.1598484814, -0.1082893312, -0.0566914305, 0.0873433575, -0.1157142371, 0.1754128039, 0.7584129572, 0.0234504454, -0.1393941939, -0.2420701981, -0.3078983426, 0.0167938657, 0.2915056348, 0.0562273115, 0.5065603256, -0.062696144, -0.0322506055, -0.1617340297, 0.28365767, 0.5438069701, 0.6444223523, -0.3587298691, -0.102691181, 0.1454257816, -0.0588431582, -0.1368187964, 0.3955861032, 0.0659538135, 0.3027462065, 0.2671024203, -0.0589363798, 0.0205622762, -0.1275094599, 0.2445566058, 0.2430924177, -0.0351937041, 0.3081419468, 0.02295205, -0.0472321883, 0.2128296793, 0.1174854636, -0.3980330229, -0.174776718, 0.4242063165, -0.5149040818, 0.2643514276, 0.1827849299, 0.0437713899, 0.0698602051, 0.4881623983, 0.3512018025, 0.1873044968, -0.2638375759, 
-0.2302778959, -0.4505084455, -0.0517943874, -0.2045983374, 0.2934899032, -0.0522415303, -0.0020608902, -0.0752916336, 0.1396797746, -0.0422461107, -0.3041801453, 0.46819067, 0.3093081117, -0.5451641083, 0.1230487451, 0.0579693168, -0.1336420327, 0.0283420533, -0.3188731074, 0.289842695, -0.0201226063, -0.1442963183, 0.139299497, 0.1771519184, 0.0797943175, 0.0918296054, -0.0014131069, 0.2944307625, 0.5707868934, -0.0234764032, 0.0339960754, -0.2556169331, -0.018579673, -0.1071840078, 0.0360855833, 0.0612752363, 0.4165353775, -0.1944234371, 0.0195666719, 0.0290838853, 0.1536455899, 0.1593731195, 0.2374945432, -0.3469746709, 0.1436147988, 0.1193897948, 0.0425019898, -0.1771375239, 0.3741258383, 0.1736967862, 0.3392373323, -0.3373233378, -0.2500855625, 0.3955123723, -0.0497906022, -0.1436158419, -0.5925133824, 0.1727861613, -0.1346154064, -0.0207736716, -0.5948375463, -0.2907794118, 0.1976027936, -0.2115630656, -0.5768406391, 0.3014443517, -0.0852305591, -0.0988465548, -0.0711721703, -0.0490695946, 0.1106261536, -0.103562437, 0.2021355778, -0.0847771764 ]
https://github.com/huggingface/datasets/issues/620
map/filter multiprocessing raises errors and corrupts datasets
It works on my side @timothyjlaurent on Google Colab. Did you try to uninstall datasets first, before updating it to master's version?
After upgrading to 1.0 I started seeing errors in my data loading script after enabling multiprocessing. ```python ... ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) ner_ds_dict["validation"] = ner_ds_dict["test"] rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed) rel_ds_dict["validation"] = rel_ds_dict["test"] return ner_ds_dict, rel_ds_dict ``` The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable. The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows, but selecting from or slicing into it returns an empty dictionary, e.g. `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`. Ok, I think I know the problem -- the rel_ds was mapped through a mapper with `num_proc=12`. If I remove `num_proc`, the dataset loads. I also see errors with other map and filter functions when `num_proc` is set. ``` Done writing 67 indices in 536 bytes . Done writing 67 indices in 536 bytes . Fatal Python error: PyCOND_WAIT(gil_cond) failed ```