https://github.com/LanQian528/chat2api
💥 Supports AccessToken-based accounts; supports GPT-4, GPT-4o, and GPTs. 🔍 Responses are formatted exactly like the real API, so it works with almost every client. https://t.me/chat2api (please read the repository introduction before asking questions). GPT-3.5 chat: if the model name passed in does not contain "gpt-4", gpt-3.5 (i.e. text-davinci-002-render-sha) is used by default. [...] Can be used as a gateway and deployed across multiple machines. None yet; issues are welcome. https://zeabur.com/templates/6HEGIZ?referralCode=LanQian528
git clone https://github.com/LanQian528/chat2api
cd chat2api
pip install -r requirements.txt
python app.py
[...]
Docker Compose deployment (recommended; GPT-4.0 available):
Create a new directory, e.g. chat2api, and enter it:
mkdir chat2api
cd chat2api
Download the docker-compose.yml file from the repository into this directory:
wget https://raw.githubusercontent.com/LanQian528/chat2api/main/docker-compose.yml
Edit the environment variables in docker-compose.yml, save, then: [...]
Getting an AccessToken: log in on the ChatGPT website, then open https://chatgpt.com/api/auth/session and copy the accessToken value. Login-free gpt-3.5 needs no token. Deployment via Docker Compose is recommended; the Arkose service is built in.
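Because the responses match the real API format, an OpenAI-compatible client can point its base URL at a chat2api deployment and pass the AccessToken where it would normally pass an API key. A minimal sketch with curl, assuming the service is exposed locally on port 5005 and accepts the standard chat completions route; the host, port, and model name here are assumptions, so adjust them to your deployment:
# ACCESS_TOKEN is the accessToken value copied from https://chatgpt.com/api/auth/session
curl http://127.0.0.1:5005/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'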
https://github.com/j178/chatgpt/releases
A CLI for ChatGPT, powered by the GPT-3.5-turbo and GPT-4 models. Get or create your OpenAI API key here: https://platform.openai.com/account/api-keys
💬 Start in chat mode [...]
💻 Use it in a pipeline:
cat config.yaml | chatgpt -p 'convert this yaml to json'
echo "Hello, world" | chatgpt -p translator | say
[...]
You can add more prompts in the config file, for example:
{"api_key": "sk-xxxxxx", "endpoint": "https://api.openai.com/v1", "prompts": {"default": "You are ChatGPT, a large language model trained by OpenAI. [...] "}, "conversation": {"prompt": "default", "context_length": 6, "model": "gpt-3.5-turbo", "stream": true, "max_tokens": 1024}}
then use the -p flag to switch prompts.
Note: the prompt can be a predefined one, or one you come up with on the fly.
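A quick illustration of both uses of -p, assuming a "translator" prompt has been added to the config's "prompts" map as in the pipeline example above; the file name and ad-hoc prompt text here are illustrative, not taken from the project docs:
# select a predefined prompt by its name in the config
echo "Hello, world" | chatgpt -p translator
# or pass a one-off prompt string directly
cat notes.md | chatgpt -p 'summarize this file in three bullet points'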
[Header image: stacks of books and a GPU. Source: Midjourney]
Research in artificial intelligence is increasing at an exponential rate. [...]
https://jalammar.github.io/illustrated-stable-diffusion/: Introduction to latent diffusion models, the most common type of generative AI model for images. [...]
New models
https://arxiv.org/abs/1706.03762 (2017): The original transformer work and research paper from Google Brain that started it all. [...]
Code generation
https://arxiv.org/abs/2107.03374 (2021): This is OpenAI's research paper for Codex, the code-generation model behind the GitHub Copilot product.