Does anyone know a (free) method for importing large datasets (multiple GB) into Google Colab? GitHub is severely size-limited, and uploading a folder to Google Drive takes a long time.
A better option would be the `wget` command. I have discussed various approaches [here](https://towardsdatascience.com/4-awesome-ways-of-loading-ml-data-in-google-colab-9a5264c61966). – Shubham Panchal Nov 02 '20 at 14:33
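For example, a minimal sketch of the wget approach in a Colab cell (the URL below is a placeholder; substitute the direct download link to your dataset):

# Download straight to the Colab VM's local disk; nothing is uploaded
# from your own machine. Replace the placeholder URL with your link.
!wget -q --show-progress https://example.com/dataset.zip -O /content/dataset.zip
# For archived datasets, unpack into a working directory.
!unzip -q /content/dataset.zip -d /content/data

This only works when the dataset is reachable via a direct HTTP(S) link, but it avoids both the GitHub size limit and a slow upload from your local connection.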
1 Answer
One option is to download the dataset onto your own system, save it in an easily accessible directory, and then run the following code:
from google.colab import files
# Opens a file picker in the notebook; returns a dict of filename -> bytes.
data = files.upload()
After running the lines above, you will get a Choose File button that lets you browse your system and select your file.
[Screenshot of the upload button attached for reference.]
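As a small illustrative sketch, you can inspect what was uploaded via the returned dict (`files.upload()` maps each uploaded filename to its raw bytes, and the files are also written to the current working directory, so they can be opened by name afterwards):

# 'data' maps each uploaded filename to its raw bytes.
for name, content in data.items():
    print(f'{name}: {len(content)} bytes')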
– Mujeebur Rahman
Unfortunately, this method also takes a lot of time. A 25 MB file takes me about 10 minutes to upload, so forget about multi-GB files. – spectre Dec 12 '21 at 15:03
