# Storing Datasets
A few methods of storing datasets are outlined below. The choice of method depends on your preference and the size of the dataset. Keep in mind that, regardless of dataset size, each DataHub account is provided with roughly 1 GB of RAM, which limits how much data you can read into memory at any one time. If you need a temporary increase to this RAM limit, please raise a GitHub issue.
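One way to stay within this memory budget is to process a large file in chunks rather than loading it all at once. A minimal sketch with pandas (the file name and the `value` column are placeholders for your own dataset):

```python
import pandas as pd

# Process a large CSV in chunks so the whole file never has to fit in
# the ~1 GB of RAM available on a DataHub account.
# "data/large_dataset.csv" and the "value" column are placeholders.
total = 0
for chunk in pd.read_csv("data/large_dataset.csv", chunksize=100_000):
    total += chunk["value"].sum()  # aggregate each chunk, then discard it

print(total)
```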
## Small Datasets (a few MBs)

### GitHub
Datasets and their corresponding Jupyter notebooks can be stored together in a folder in a GitHub repository. You can then create an nbgitpuller link for that folder; when students click the link, the folder's contents are pulled into their DataHub account.
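For reference, an nbgitpuller link is just a URL with `repo`, `branch`, and `urlpath` query parameters. A minimal sketch of assembling one in Python (the repository, branch, and notebook path below are placeholders; the nbgitpuller link generator builds the same structure interactively):

```python
from urllib.parse import urlencode

# Assemble an nbgitpuller link by hand. All values below are
# placeholders for your own repository and notebook path.
params = {
    "repo": "https://github.com/your-org/your-course-materials",
    "branch": "main",
    "urlpath": "tree/your-course-materials/lab01/lab01.ipynb",
}
link = "https://datahub.berkeley.edu/hub/user-redirect/git-pull?" + urlencode(params)
print(link)
```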
### Outside Hosts

You can store the data with an online host such as Box, Google Drive, or even GitHub, and have notebooks read it directly over HTTP.
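A minimal sketch of reading hosted data with pandas (the URL is a placeholder; note that Box and Google Drive links must be converted to direct-download form before they can be read this way):

```python
import pandas as pd

# Read a hosted CSV directly into memory. The URL below is a
# placeholder for a direct-download link to your dataset.
url = "https://raw.githubusercontent.com/your-org/your-repo/main/data/example.csv"
df = pd.read_csv(url)
print(df.head())
```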
### Direct Upload

Students can upload data files directly to their DataHub account. This method can get messy when notebooks expect data at a specific file path but students upload the files elsewhere. We therefore recommend the other methods listed on this page.
## Larger Datasets (tens of MBs to several GBs)

Our current recommendation is to keep dataset files below 100 MB. For instructors and students who plan to teach or learn with larger datasets, we recommend the approaches below.
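Before adopting one of these approaches, it can help to confirm whether a dataset actually exceeds the guideline, and whether simple compression brings it under. A minimal sketch (the file paths are placeholders; pandas can read the resulting `.gz` file directly with `read_csv`):

```python
import gzip
import os
import shutil

# Check a dataset's size against the ~100 MB guideline; the paths
# below are placeholders for your own files.
size_mb = os.path.getsize("data/big_dataset.csv") / 1e6
print(f"{size_mb:.1f} MB")

if size_mb > 100:
    # Stream-compress the file without loading it into memory.
    with open("data/big_dataset.csv", "rb") as src, \
            gzip.open("data/big_dataset.csv.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
```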