Describe the photo you want in a single sentence, and matching images can be retrieved from your photo library.

Published: January 5, 2024

Introduction

Repository: https://github.com/mazzzystar/Queryable

The open-source code of Queryable, an iOS app, leverages OpenAI's CLIP model to conduct offline searches in the 'Photos' album. Unlike the category-based search model built into the iOS Photos app, Queryable allows you to use natural-language statements, such as "a brown dog sitting on a bench", to search your album. Since it's offline, your album privacy won't be compromised by any company, including Apple or Google.

How does it work?

  • Encode all album photos using the CLIP Image Encoder, compute image vectors, and save them.
  • For each new text query, compute the corresponding text vector using the Text Encoder.
  • Compare the similarity between this text vector and each image vector.
  • Rank and return the top K most similar results (a minimal sketch of this ranking step follows the list).
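
The comparison and ranking steps amount to a cosine-similarity search of the query's text vector against the saved image vectors. The Swift sketch below illustrates the idea under stated assumptions; the function names and the photo-ID-to-vector dictionary are hypothetical and are not taken from the Queryable codebase.

```swift
// Minimal sketch of the last two steps: score saved image vectors
// against a text query vector and keep the top-K results.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    precondition(a.count == b.count, "vectors must have the same dimension")
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in 0..<a.count {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (normA.squareRoot() * normB.squareRoot() + 1e-8)
}

// photo ID -> precomputed image vector (hypothetical storage format)
func topKMatches(textVector: [Float],
                 imageVectors: [String: [Float]],
                 k: Int) -> [(id: String, score: Float)] {
    let scored = imageVectors.map { entry in
        (id: entry.key, score: cosineSimilarity(textVector, entry.value))
    }
    return Array(scored.sorted { $0.score > $1.score }.prefix(k))
}
```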

The process is illustrated in the pipeline diagram in the project README.

For more details, please refer to my blog post: Run CLIP on iPhone to Search Photos.

Run on Xcode

Download ImageEncoder_float32.mlmodelc and TextEncoder_float32.mlmodelc from Google Drive. Clone this repo, put the downloaded models under the CoreMLModels/ path, and run the project in Xcode; it should work.
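
As a rough illustration of what happens with those files at runtime, here is a hedged Swift sketch of loading the two compiled Core ML encoders. The model file names come from the paragraph above; the bundle lookup under CoreMLModels/, the configuration, and the loadCLIPEncoders function are assumptions for illustration, not Queryable's actual code.

```swift
import CoreML
import Foundation

// Sketch only: load the two compiled CLIP encoders bundled under
// CoreMLModels/. The lookup path and error handling are assumptions.
func loadCLIPEncoders() throws -> (imageEncoder: MLModel, textEncoder: MLModel) {
    guard
        let imageURL = Bundle.main.url(forResource: "ImageEncoder_float32",
                                       withExtension: "mlmodelc",
                                       subdirectory: "CoreMLModels"),
        let textURL = Bundle.main.url(forResource: "TextEncoder_float32",
                                      withExtension: "mlmodelc",
                                      subdirectory: "CoreMLModels")
    else {
        throw CocoaError(.fileNoSuchFile)
    }

    let config = MLModelConfiguration()
    config.computeUnits = .all   // let Core ML choose CPU, GPU, or Neural Engine

    let imageEncoder = try MLModel(contentsOf: imageURL, configuration: config)
    let textEncoder = try MLModel(contentsOf: textURL, configuration: config)
    return (imageEncoder, textEncoder)
}
```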


Source: https://blog.csdn.net/weixin_46771779/article/details/135402601