[TOC]

## ollama

1. Supports image parsing (multimodal models can take images as input)

## Installation

Download the build for your operating system from https://ollama.com/download/

### Specify the model storage directory

```
export OLLAMA_MODELS=xxx
```

## Commands

```
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
```

## API documentation

* [Generate a completion](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion)
* [Generate a chat completion](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion)
* [Create a Model](https://github.com/ollama/ollama/blob/main/docs/api.md#create-a-model)
* [List Local Models](https://github.com/ollama/ollama/blob/main/docs/api.md#list-local-models)
* [Show Model Information](https://github.com/ollama/ollama/blob/main/docs/api.md#show-model-information)
* [Copy a Model](https://github.com/ollama/ollama/blob/main/docs/api.md#copy-a-model)
* [Delete a Model](https://github.com/ollama/ollama/blob/main/docs/api.md#delete-a-model)
* [Pull a Model](https://github.com/ollama/ollama/blob/main/docs/api.md#pull-a-model)
* [Push a Model](https://github.com/ollama/ollama/blob/main/docs/api.md#push-a-model)
* [Generate Embeddings](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings)

## Examples

### Running from the command line

Once a model is deployed, it can be used interactively from the command line or through the API:

```
>>> """Hello,
... world!
... """
I'm a basic program that prints the famous "Hello, world!" message to the console.
```

After starting the service, the HTTP API can be called (Python equivalents of these calls are sketched at the end of this page):

1. Generate a completion

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'
```

2. Generate a chat completion

```
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'
```

### GPU

1. Download [CUDA](https://developer.nvidia.com/cuda-downloads)
2. Run `ollama run <model>` first and check whether GPU memory usage goes up
3. Restart Ollama; when the Ollama service handles API calls, the model is loaded onto the GPU

```
curl -X POST http://localhost:11434/api/chat -d '{
  "model": "llama2-chinese",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" },
    { "role": "assistant", "content": "due to rayleigh scattering." },
    { "role": "user", "content": "how is that different than mie scattering?" }
  ]
}'
```
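### Calling the API from Python (sketch)

The same `/api/generate` endpoint used in the curl examples above can be reached from any HTTP client. Below is a minimal, non-streaming sketch, assuming the Ollama service is running on the default port 11434, the `llama3` model has already been pulled, and the third-party `requests` package is installed.

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama port; adjust if you changed it

def generate(prompt: str, model: str = "llama3") -> str:
    """Call /api/generate with streaming disabled and return the full completion."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,  # generation can be slow on CPU
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

With `"stream": false` the server returns a single JSON object whose `response` field holds the whole completion; without it, the reply arrives as a stream of JSON lines, as in the chat sketch below.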
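The multi-turn `/api/chat` call from the GPU section can also be consumed as a stream. The sketch below reads the newline-delimited JSON chunks Ollama emits when streaming is enabled; the model name and message history are taken from the curl example above and are only illustrative.

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434"

def chat_stream(messages, model="llama2-chinese"):
    """Call /api/chat with streaming and yield assistant text chunks as they arrive."""
    with requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={"model": model, "messages": messages, "stream": True},
        stream=True,
        timeout=300,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):  # final object marks the end of the stream
                break
            yield chunk["message"]["content"]

if __name__ == "__main__":
    history = [
        {"role": "user", "content": "why is the sky blue?"},
        {"role": "assistant", "content": "due to rayleigh scattering."},
        {"role": "user", "content": "how is that different than mie scattering?"},
    ]
    for piece in chat_stream(history):
        print(piece, end="", flush=True)
    print()
```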
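The API documentation linked above also lists an embeddings endpoint. A minimal sketch, assuming `/api/embeddings` accepts a `model` and a `prompt` as described in those docs and that the chosen model supports embeddings:

```python
import requests

OLLAMA_URL = "http://localhost:11434"

def embed(text: str, model: str = "llama3") -> list[float]:
    """Call /api/embeddings and return the embedding vector for the given text."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]

if __name__ == "__main__":
    vec = embed("The sky is blue because of Rayleigh scattering.")
    print(len(vec), vec[:5])  # vector length and first few dimensions
```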