
Is there a way to get insight into a model’s performance pre-inference?

Hi.

Given a TFLite model, is there a way to get information about how it will perform before actually running inference? For example, how can I estimate how long the model will take to run on a given device, and how it will consume CPU/GPU resources?
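For context, here is a minimal sketch (the `model.tflite` path is a placeholder) of the static information that `tf.lite.Interpreter` exposes without running inference: tensor shapes, dtypes, and a rough memory footprint. It doesn't tell me anything about latency on a device, which is what I'm really after.

```python
import numpy as np
import tensorflow as tf

# Placeholder path -- replace with the actual .tflite file.
MODEL_PATH = "model.tflite"

interpreter = tf.lite.Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()

# Input/output signatures: names, shapes, and dtypes.
for detail in interpreter.get_input_details():
    print("input :", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])

# Rough static memory footprint: sum the sizes of all tensor buffers.
total_bytes = 0
for detail in interpreter.get_tensor_details():
    shape = detail["shape"]
    if shape.size == 0 or np.any(shape < 0):
        continue  # skip tensors with unknown or dynamic shapes
    total_bytes += int(np.prod(shape)) * np.dtype(detail["dtype"]).itemsize
print(f"approximate tensor memory: {total_bytes / 1e6:.2f} MB")
```

My understanding is that actual latency still requires running the model on the target hardware (e.g. with the TFLite benchmark tooling), since static inspection doesn't capture delegate or hardware behaviour, so any way to estimate this up front would be very helpful.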

submitted by /u/janissary2016
