Commit cde9db1

Update README.md
Punctuation changes
1 parent 968db1d

1 file changed, +3 -3 lines changed

‎README.md

3 additions & 3 deletions
@@ -44,7 +44,7 @@ puts(chat_completion)
 
 We provide support for streaming responses using Server-Sent Events (SSE).
 
-**coming soon:** `openai.chat.completions.stream` will soon come with Python SDK style higher level streaming responses support.
+**coming soon:** `openai.chat.completions.stream` will soon come with Python SDK-style higher-level streaming responses support.
 
 ```ruby
 stream = openai.chat.completions.stream_raw(
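
The hunk above ends at the start of the README's `stream_raw` example. As a minimal sketch of consuming the raw SSE stream — the message content, model name, and the `.each` iteration are assumptions here, not the README's exact example — it might look like:

```ruby
require "openai"

# Assumed setup: `openai` is an OpenAI::Client, as in the README's other examples.
openai = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

# `stream_raw` is assumed to return an enumerable of raw SSE events.
stream = openai.chat.completions.stream_raw(
  messages: [{role: "user", content: "Say this is a test"}], # illustrative message
  model: :"gpt-4.1"                                          # illustrative model name
)

# Print each event as it arrives over the SSE connection.
stream.each do |completion|
  puts(completion)
end
```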
@@ -224,7 +224,7 @@ puts(chat_completion[:my_undocumented_property])
 
 #### Undocumented request params
 
-If you want to explicitly send an extra param, you can do so with the `extra_query`, `extra_body`, and `extra_headers` under the `request_options:` parameter when making a request as seen in examples above.
+If you want to explicitly send an extra param, you can do so with the `extra_query`, `extra_body`, and `extra_headers` under the `request_options:` parameter when making a request, as seen in the examples above.
 
 #### Undocumented endpoints

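A minimal sketch of the `request_options:` usage described in the changed line — the surrounding call, the model name, and the specific query/body/header names are illustrative assumptions, not documented parameters:

```ruby
# Sketch only: `openai` is assumed to be an OpenAI::Client, and the extra_*
# keys shown are hypothetical undocumented parameters.
chat_completion = openai.chat.completions.create(
  messages: [{role: "user", content: "Say this is a test"}],
  model: :"gpt-4.1",
  request_options: {
    extra_query: {my_query_param: "value"},    # appended to the URL query string
    extra_body: {my_body_param: "value"},      # merged into the JSON request body
    extra_headers: {"x-my-header" => "value"}  # merged into the request headers
  }
)
puts(chat_completion)
```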
@@ -242,7 +242,7 @@ response = client.request(
 
 ### Concurrency & connection pooling
 
-The `OpenAI::Client` instances are threadsafe, but only are fork-safe when there are no in-flight HTTP requests.
+The `OpenAI::Client` instances are threadsafe, but are only fork-safe when there are no in-flight HTTP requests.
 
 Each instance of `OpenAI::Client` has its own HTTP connection pool with a default size of 99. As such, we recommend instantiating the client once per application in most settings.

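Since the changed paragraph recommends one client per application, with each instance owning its own pool of up to 99 connections, a minimal sketch of sharing a single thread-safe instance might look like this (the constant name, message content, and model name are assumptions):

```ruby
require "openai"

# One shared client for the whole application; all threads reuse its connection pool.
OPENAI_CLIENT = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"]) # hypothetical constant name

# Threads may share the instance; each request checks a connection out of the
# pool and returns it once the response has been read.
threads = 4.times.map do |i|
  Thread.new do
    OPENAI_CLIENT.chat.completions.create(
      messages: [{role: "user", content: "Request #{i}"}], # illustrative message
      model: :"gpt-4.1"                                     # illustrative model name
    )
  end
end
threads.each(&:join)
```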