
Commit e8b12a5

Merge pull request #578 from alexrudall/docs/deepseek
Add Deepseek to docs
2 parents 695f0a3 + 63901db commit e8b12a5


3 files changed: +113 -0 lines changed


README.md

Lines changed: 23 additions & 0 deletions
````diff
@@ -29,6 +29,7 @@ Stream text with GPT-4, transcribe and translate audio with Whisper, or create i
 - [Errors](#errors)
 - [Faraday middleware](#faraday-middleware)
 - [Azure](#azure)
+- [Deepseek](#deepseek)
 - [Ollama](#ollama)
 - [Groq](#groq)
 - [Counting Tokens](#counting-tokens)
@@ -228,6 +229,28 @@ end
 
 where `AZURE_OPENAI_URI` is e.g. `https://custom-domain.openai.azure.com/openai/deployments/gpt-35-turbo`
 
+#### Deepseek
+
+[Deepseek](https://api-docs.deepseek.com/) is compatible with the OpenAI chat API. Get an access token from [here](https://platform.deepseek.com/api_keys), then:
+
+```ruby
+client = OpenAI::Client.new(
+  access_token: "deepseek_access_token_goes_here",
+  uri_base: "https://api.deepseek.com/"
+)
+
+client.chat(
+  parameters: {
+    model: "deepseek-chat", # Required.
+    messages: [{ role: "user", content: "Hello!"}], # Required.
+    temperature: 0.7,
+    stream: proc do |chunk, _bytesize|
+      print chunk.dig("choices", 0, "delta", "content")
+    end
+  }
+)
+```
+
 #### Ollama
 
 Ollama allows you to run open-source LLMs, such as Llama 3, locally. It [offers chat compatibility](https://github.com/ollama/ollama/blob/main/docs/openai.md) with the OpenAI API.
````
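The README hunk above only documents the streamed form. For reference, a non-streaming call against the same Deepseek endpoint would look roughly like this; the sketch below is not part of the commit, reuses the placeholder token from the snippet above, and simply reads the reply out of the completed response.

```ruby
# Minimal sketch, not part of this commit: same client setup as the README
# example, but without streaming.
require "openai"

client = OpenAI::Client.new(
  access_token: "deepseek_access_token_goes_here", # placeholder token
  uri_base: "https://api.deepseek.com/"
)

response = client.chat(
  parameters: {
    model: "deepseek-chat",
    messages: [{ role: "user", content: "Hello!" }]
  }
)

# Non-streamed responses follow the OpenAI chat schema, so the reply text
# sits under choices[0].message.content.
puts response.dig("choices", 0, "message", "content")
```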

spec/fixtures/cassettes/deepseek_deepseek-chat_streamed_chat.yml

Lines changed: 62 additions & 0 deletions
Some generated files are not rendered by default.
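The cassette is a generated VCR fixture, so its 62 added lines are not shown. As a rough, hypothetical sketch of the kind of VCR configuration that records and scrubs fixtures like this one (the repo's actual spec helper may differ, and the `<OPENAI_ACCESS_TOKEN>` placeholder name is an assumption):

```ruby
# Hypothetical VCR setup for recording cassettes like the one above; the real
# spec helper in this repository may configure things differently.
require "vcr"

VCR.configure do |config|
  config.cassette_library_dir = "spec/fixtures/cassettes" # where the .yml fixtures live
  config.hook_into :webmock                               # intercept HTTP requests made by Faraday
  # Keep the real API key out of committed cassettes.
  config.filter_sensitive_data("<OPENAI_ACCESS_TOKEN>") { ENV["OPENAI_ACCESS_TOKEN"] }
end
```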

spec/openai/client/chat_spec.rb

Lines changed: 28 additions & 0 deletions
```diff
@@ -185,6 +185,34 @@ def call(chunk)
       end
     end
 
+    context "with Deepseek + model: deepseek-chat" do
+      let(:uri_base) { "https://api.deepseek.com/" }
+      let(:provider) { "deepseek" }
+      let(:model) { "deepseek-chat" }
+      let(:response) do
+        OpenAI::Client.new({ uri_base: uri_base }).chat(
+          parameters: parameters
+        )
+      end
+      let(:chunks) { [] }
+      let(:stream) do
+        proc do |chunk, _bytesize|
+          chunks << chunk
+        end
+      end
+
+      it "succeeds" do
+        VCR.use_cassette(cassette) do
+          tap do
+            response
+          rescue Faraday::UnauthorizedError
+            pending "This test needs the `OPENAI_ACCESS_TOKEN` to be a Deepseek API key"
+          end
+          expect(chunks.dig(0, "choices", 0, "index")).to eq(0)
+        end
+      end
+    end
+
     context "with Groq + model: llama3" do
       let(:uri_base) { "https://api.groq.com/openai" }
       let(:provider) { "groq" }
```
