A new on-device embedding-based search library has been announced that lets applications find similar images, text, or audio among millions of data samples in just a few milliseconds. It works by embedding the search query into a high-dimensional vector that semantically represents the query's meaning, then using ScaNN (Scalable Nearest Neighbors) to retrieve similar items from a predefined database. To apply it to a dataset, one builds a custom TFLite Searcher model with the Model Maker Searcher API (tutorial) and then deploys it to devices with the Task Library Searcher API (vision/text).

submitted by /u/No_Coffee_4638
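The snippet below is a rough sketch of that two-step workflow in Python, based on the publicly documented Model Maker Searcher and Task Library Searcher APIs. The embedder model file, dataset path, column names, and ScaNN tuning values are illustrative placeholders, not details from the announcement.

```python
# Step 1: build a custom TFLite Searcher model with the Model Maker Searcher API.
from tflite_model_maker import searcher

# Embed the dataset with an off-the-shelf text embedder
# ("universal_sentence_encoder.tflite" is an illustrative model file).
data_loader = searcher.TextDataLoader.create(
    "universal_sentence_encoder.tflite", l2_normalize=True)
data_loader.load_from_csv(
    "corpus.csv",              # placeholder dataset
    text_column="text",        # column holding the text to embed
    metadata_column="url")     # column returned with each search hit

# Configure ScaNN, the on-device approximate nearest-neighbor index
# (the tree and quantization values here are illustrative tuning knobs).
scann_options = searcher.ScaNNOptions(
    distance_measure="dot_product",
    tree=searcher.Tree(num_leaves=144, num_leaves_to_search=4),
    score_ah=searcher.ScoreAH(
        dimensions_per_block=2, anisotropic_quantization_threshold=0.2))

model = searcher.Searcher.create_from_data(data_loader, scann_options)
model.export(
    export_filename="searcher.tflite",
    userinfo="",
    export_format=searcher.ExportFormat.TFLITE)

# Step 2: query the exported model on device with the Task Library Searcher API.
from tflite_support.task import text

text_searcher = text.TextSearcher.create_from_file("searcher.tflite")
result = text_searcher.search("query text goes here")
for neighbor in result.nearest_neighbors:
    print(neighbor.metadata, neighbor.distance)
```

The vision variant follows the same pattern, with ImageSearcher in place of TextSearcher and an image embedder in place of the text embedder.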