Finding the memory required for a neural network to load on an embedded device?

I'd like to determine how much memory my saved neural network model requires. The reason I'm asking is that I want to test it on an embedded device: first I'd like to measure how much memory my current model takes, then how much my downsampled model requires, and compare the performance trade-offs. I also have an SVM performing the same classification task, so I'm trying to figure out which is best suited for embedded devices.
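One framework-agnostic back-of-envelope approach is to estimate weight memory as parameter count times bytes per parameter, and compare that against the serialized file size on disk (`os.path.getsize` on the saved model file). The sketch below assumes a hypothetical small MLP's layer shapes; substitute your own model's shapes and dtype:

```python
import numpy as np

def param_memory_bytes(shapes, dtype=np.float32):
    """Estimate memory needed to hold model weights in RAM:
    total parameter count times bytes per parameter."""
    itemsize = np.dtype(dtype).itemsize
    total_params = sum(int(np.prod(s)) for s in shapes)
    return total_params * itemsize

# Hypothetical MLP, 784 -> 128 -> 10, weights plus biases.
shapes = [(784, 128), (128,), (128, 10), (10,)]

print(param_memory_bytes(shapes))               # float32 estimate: 407080 bytes
print(param_memory_bytes(shapes, np.float16))   # half-precision estimate: 203540 bytes

# On-disk size of the serialized model (hypothetical path):
# import os; print(os.path.getsize("model.h5"))
```

Note this only counts weights; at inference time you also need memory for activations and any framework runtime overhead, so treat it as a lower bound when sizing for an embedded target.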

submitted by /u/Mother-Beyond9493
