@@ -5,6 +5,8 @@ API](https://platform.openai.com/docs/api-reference) for Lua. Compatible with
 any HTTP library that supports LuaSocket's http request interface. Compatible
 with OpenResty using
 [`lapis.nginx.http`](https://leafo.net/lapis/reference/utilities.html#making-http-requests).
+This project implements both the classic Chat Completions API and the modern
+Responses API.
 
 <details>
 <summary>AI Generated Disclaimer</summary>
@@ -64,7 +66,6 @@ if status == 200 then
 end
 ```
 
-
 ## Chat Session Example
 
 A chat session instance can be created to simplify managing the state of a back
@@ -103,100 +104,37 @@ local response = chat:send("What's the most boring color?", function(chunk)
 end)
 ```
 
-## Chat Session With Functions
-
-OpenAI allows [sending a list of function
-declarations](https://openai.com/blog/function-calling-and-other-api-updates)
-that the LLM can decide to call based on the prompt. The function calling
-interface must be used with chat completions and the `gpt-4-0613` or
-`gpt-3.5-turbo-0613` models or later.
-
-> See <https://github.com/leafo/lua-openai/blob/main/examples/example5.lua> for
-> a full example that implements basic math functions to compute the standard
-> deviation of a list of numbers
-
-Here's a quick example of how to use functions in a chat exchange. First you
-will need to create a chat session with the `functions` option containing an
-array of available functions.
 
-> The functions are stored on the `functions` field on the chat object. If the
-> functions need to be adjusted for future message, the field can be modified.
+## Streaming Response Example
 
-```lua
-local chat = openai:new_chat_session({
-  model = "gpt-3.5-turbo-0613",
-  functions = {
-    {
-      name = "add",
-      description = "Add two numbers together",
-      parameters = {
-        type = "object",
-        properties = {
-          a = { type = "number" },
-          b = { type = "number" }
-        }
-      }
-    }
-  }
-})
-```
+Under normal circumstances the API will wait until the entire response is
+available before returning the response. Depending on the prompt this may take
+some time. The streaming API can be used to read the output one chunk at a
+time, allowing you to display content in real time as it is generated.
 
-Any prompt you send will be aware of all available functions, and may request
-any of them to be called. If the response contains a function call request,
-then an object will be returned instead of the standard string return value.
+Using the Responses API:
 
 ```lua
-local res = chat:send("Using the provided function, calculate the sum of 2923 + 20839")
+local openai = require("openai")
+local client = openai.new(os.getenv("OPENAI_API_KEY"))
 
-if type(res) == "table" and res.function_call then
-  -- The function_call object has the following fields:
-  --   function_call.name --> name of function to be called
-  --   function_call.arguments --> A string in JSON format that should match the parameter specification
-  -- Note that res may also include a content field if the LLM produced a textual output as well
+client:create_response({
+  {role = "system", content = "You work for Streak.Club, a website to track daily creative habits"},
+  {role = "user", content = "Who do you work for?"}
+}, {
+  stream = true
+}, function(chunk)
+  if chunk.text_delta then
+    io.stdout:write(chunk.text_delta)
+    io.stdout:flush()
+  end
+end)
 
-  local cjson = require "cjson"
-  local name = res.function_call.name
-  local arguments = cjson.decode(res.function_call.arguments)
-  -- ... compute the result and send it back ...
-end
+print() -- print a newline
 ```
 
-You can evaluate the requested function & arguments and send the result back to
-the client so it can resume operation with a `role=function` message object:
+Using the Chat Completions API:
 
-> Since the LLM can hallucinate every part of the function call, you'll want to
-> do robust type validation to ensure that function name and arguments match
-> what you expect. Assume every stage can fail, including receiving malformed
-> JSON for the arguments.
-
-```lua
-local name, arguments = ... -- the name and arguments extracted from above
-
-if name == "add" then
-  local value = arguments.a + arguments.b
-
-  -- send the response back to the chat bot using a `role = function` message
-
-  local cjson = require "cjson"
-
-  local res = chat:send({
-    role = "function",
-    name = name,
-    content = cjson.encode(value)
-  })
-
-  print(res) -- Print the final output
-else
-  error("Unknown function: " .. name)
-end
-```
-
-## Streaming Response Example
-
-Under normal circumstances the API will wait until the entire response is
-available before returning the response. Depending on the prompt this may take
-some time. The streaming API can be used to read the output one chunk at a
-time, allowing you to display content in real time as it is generated.
 
 ```lua
 local openai = require("openai")
@@ -455,3 +393,98 @@ and the raw request response.
 - `stream_callback`: (optional) A function to enable streaming output.
 
 See `chat:send` for details on the `stream_callback`
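+
+A minimal `stream_callback` for `chat:send` might look like the sketch below.
+It assumes each chunk is a table whose `content` field holds the next fragment
+of the reply, matching the streaming chat example earlier in this document:
+
+```lua
+local response = chat:send("What's the most boring color?", function(chunk)
+  -- write each fragment as it arrives, without waiting for the full reply
+  if chunk.content then
+    io.stdout:write(chunk.content)
+    io.stdout:flush()
+  end
+end)
+```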
+
+
+## Appendix
+
+### Chat Session With Functions
+
+> Note: functions are the legacy format for what are now known as tools; this
+> example is kept here only as a reference.
+
+OpenAI allows [sending a list of function
+declarations](https://openai.com/blog/function-calling-and-other-api-updates)
+that the LLM can decide to call based on the prompt. The function calling
+interface must be used with chat completions and the `gpt-4-0613` or
+`gpt-3.5-turbo-0613` models or later.
+
+> See <https://github.com/leafo/lua-openai/blob/main/examples/example5.lua> for
+> a full example that implements basic math functions to compute the standard
+> deviation of a list of numbers
+
+Here's a quick example of how to use functions in a chat exchange. First you
+will need to create a chat session with the `functions` option containing an
+array of available functions.
+
+> The functions are stored on the `functions` field of the chat object. If the
+> functions need to be adjusted for future messages, the field can be modified.
+
+```lua
+local chat = openai:new_chat_session({
+  model = "gpt-3.5-turbo-0613",
+  functions = {
+    {
+      name = "add",
+      description = "Add two numbers together",
+      parameters = {
+        type = "object",
+        properties = {
+          a = { type = "number" },
+          b = { type = "number" }
+        }
+      }
+    }
+  }
+})
+```
+
+Any prompt you send will be aware of all available functions, and may request
+any of them to be called. If the response contains a function call request,
+then an object will be returned instead of the standard string return value.
+
+```lua
+local res = chat:send("Using the provided function, calculate the sum of 2923 + 20839")
+
+if type(res) == "table" and res.function_call then
+  -- The function_call object has the following fields:
+  --   function_call.name --> name of function to be called
+  --   function_call.arguments --> A string in JSON format that should match the parameter specification
+  -- Note that res may also include a content field if the LLM produced a textual output as well
+
+  local cjson = require "cjson"
+  local name = res.function_call.name
+  local arguments = cjson.decode(res.function_call.arguments)
+  -- ... compute the result and send it back ...
+end
+```
+
+You can evaluate the requested function & arguments and send the result back to
+the client so it can resume operation with a `role=function` message object:
+
+> Since the LLM can hallucinate every part of the function call, you'll want to
+> do robust type validation to ensure that function name and arguments match
+> what you expect. Assume every stage can fail, including receiving malformed
+> JSON for the arguments.
+
+```lua
+local name, arguments = ... -- the name and arguments extracted from above
+
+if name == "add" then
+  local value = arguments.a + arguments.b
+
+  -- send the response back to the chat bot using a `role = function` message
+
+  local cjson = require "cjson"
+
+  local res = chat:send({
+    role = "function",
+    name = name,
+    content = cjson.encode(value)
+  })
+
+  print(res) -- Print the final output
+else
+  error("Unknown function: " .. name)
+end
+```
+