How to load image data efficiently, so memory isn't exhausted

Currently I'm trying to load some images for training purposes. Here is what I'm doing:

```python
from PIL import Image  # Image.open is implied by the .convert('RGB') call
import numpy as np

# data_dir stands in for the path variable stripped from the original post
sats = [np.array(Image.open(data_dir + "/x/" + name).convert('RGB'), dtype="float32") for name in names]
masks = [np.array(Image.open(data_dir + "/y/" + name), dtype="float32") for name in names]
```

But this takes almost all the memory in Colab when running on the full dataset. So my question is: is there a better API that will load the data partially (lazily), so I don't run out of memory?
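One common answer is to load images on demand instead of all at once: TensorFlow's `tf.data.Dataset.from_generator` or `tf.keras.utils.image_dataset_from_directory` both stream batches from disk. The core idea can be sketched framework-free with a plain generator; here `load_fn` is a hypothetical stand-in for the poster's `Image.open(...)` + `np.array(...)` step, and the fake loader exists only so the sketch runs without image files:

```python
import numpy as np

def lazy_batches(paths, batch_size, load_fn):
    """Yield batches lazily: only `batch_size` images are in memory at once,
    instead of materializing the whole dataset as one big list."""
    for start in range(0, len(paths), batch_size):
        chunk = paths[start:start + batch_size]
        yield np.stack([load_fn(p) for p in chunk])

# Fake loader for demonstration; in practice it would open and decode an image.
fake_load = lambda p: np.zeros((4, 4, 3), dtype="float32")

batches = list(lazy_batches(["a.png", "b.png", "c.png"], batch_size=2,
                            load_fn=fake_load))
```

Passed as a generator to `model.fit` (or wrapped in `tf.data.Dataset.from_generator`), this keeps peak memory proportional to the batch size rather than the dataset size.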


submitted by /u/maifee
