
Commit 27754f4

committed
Merge remote-tracking branch 'template/main'
2 parents 56c179d + fb07084

File tree

3 files changed: +144 −12 lines changed

README.md

Lines changed: 45 additions & 12 deletions
@@ -95,13 +95,20 @@ Your agent can be based on an LLM hosted anywhere, you have available currently O
 main.yml # deploys the STAGING function to Lambda Feedback
 test-report.yml # gathers Pytest Report of function tests
+docs/ # docs for devs and users
 src/module.py # chat_module function implementation
 src/module_test.py # chat_module function tests
 src/agents/ # find all agents developed for the chat functionality
 src/agents/utils/test_prompts.py # allows testing of any LLM agent on a couple of example inputs containing Lambda Feedback Questions and synthetic student conversations
 ```

-## Run the Chat Script
+## Testing the Chat Function
+
+To test your function, you can either call the code directly through a Python script, or build the chat function's Docker container locally and call it through an API request. Details on both processes follow.
+
+### Run the Chat Script

 You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.
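As a hypothetical sketch (the actual `chat_module` signature and return schema in this repository may differ), a minimal main guard in `src/module.py` could look like:

```python
# Hypothetical sketch of a main guard for src/module.py.
# The real chat_module signature and return schema may differ.
def chat_module(message: str, params: dict) -> dict:
    # Placeholder body; the deployed function would call the configured LLM agent.
    return {"chatbot_response": f"Echo: {message}"}

if __name__ == "__main__":
    result = chat_module("hi", {"conversation_id": "12345Test"})
    print(result["chatbot_response"])  # → Echo: hi
```

With a guard like this in place, `python src/module.py` runs the function directly without the Lambda runtime.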

@@ -114,25 +121,25 @@ You can also use the `testbench_agents.py` script to test the agents with example inputs from Lambda Feedback questions and synthetic conversations.
 python src/agents/utils/testbench_agents.py
 ```

-### Building the Docker Image
+### Calling the Docker Image Locally

 To build the Docker image, run the following command:

 ```bash
 docker build -t llm_chat .
 ```

-### Running the Docker Image
+#### Running the Docker Image

 To run the Docker image, use the following command:

-#### Without .env file:
+##### A. Without .env file:

 ```bash
 docker run -e OPENAI_API_KEY={your key} -e OPENAI_MODEL={your chosen LLM model name} -p 8080:8080 llm_chat
 ```

-#### With container name (for interaction, e.g. copying file from inside the docker container):
+##### B. With a container name (for interaction, e.g. copying a file from inside the Docker container):

 ```bash
 docker run --env-file .env -it --name my-lambda-container -p 8080:8080 llm_chat
 ```
@@ -143,18 +150,23 @@ This will start the chat function, expose it on port `8080`, and leave it open to be called with curl:
 ```bash
 curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' \
 --header 'Content-Type: application/json' \
 --data '{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}'
 ```

-### Call Docker Container From Postman
+#### Call Docker Container
+
+##### A. Call Docker with Python Requests
+
+In the `src/agents/utils` folder you can find the `requests_testscript.py` script, which calls the POST URL of the running Docker container. It reads input files that follow the expected schema, so you can use it to test your curl calls to the chatbot.
+
+##### B. Call Docker Container through an API Request

 POST URL:

 ```bash
 http://localhost:8080/2015-03-31/functions/function/invocations
 ```

-Body:
+Body (stringified within `body` for the API request):

 ```JSON
 {"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}
@@ -176,10 +188,6 @@ Body with optional Params:
 }
 ```

-### Call Docker with Python Requests
-
-In the `src/agents/utils` folder you can find the `requests_test.py` script that calls the POST URL of the running docker container. It reads any kind of input files with the expected schema. You can use this to test your curl calls of the chatbot.
-
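As an alternative to curl or Postman, the same invocation can be sketched in plain Python using only the standard library. This is a hypothetical example; it assumes the container started above is listening on port `8080`, and the helper names are illustrative:

```python
# Hypothetical sketch: invoke the locally running chat function container.
# Assumes the docker run command above exposed the function on port 8080.
import json
import urllib.request

URL = "http://localhost:8080/2015-03-31/functions/function/invocations"

def build_invocation_payload(message: str, history: list) -> dict:
    """Wrap the message and params in the stringified body the endpoint expects."""
    inner = {
        "message": message,
        "params": {
            "conversation_id": "12345Test",
            "conversation_history": history,
        },
    }
    # The Lambda endpoint expects the inner JSON as a string under "body".
    return {"body": json.dumps(inner)}

def invoke_local(payload: dict) -> str:
    """POST the payload to the local container and return the raw response body."""
    request = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")
```

With the container running, `invoke_local(build_invocation_payload("hi", [{"type": "user", "content": "hi"}]))` returns the function's JSON response as a string.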
### Deploy to Lambda Feedback
Deploying the chat function to Lambda Feedback is simple and straightforward, as long as the repository is within the [Lambda Feedback organization](https://github.com/lambda-feedback).
@@ -206,3 +214,28 @@ Make sure that all run-time dependencies are installed in the Docker image.
 - System packages: If you need to install system packages, add the installation command to the Dockerfile.
 - ML models: If your chat function depends on ML models, make sure to include them in the Docker image.
 - Data files: If your chat function depends on data files, make sure to include them in the Docker image.
+
+### Pull Changes from the Template Repository
+
+If you want to pull changes from the template repository into your repository, follow these steps:
+
+1. Add the template repository as a remote:
+
+```bash
+git remote add template https://github.com/lambda-feedback/chat-function-boilerplate.git
+```
+
+2. Fetch changes from all remotes:
+
+```bash
+git fetch --all
+```
+
+3. Merge changes from the template repository:
+
+```bash
+git merge template/main --allow-unrelated-histories
+```
+
+> [!WARNING]
+> Make sure to resolve any conflicts and keep only the changes you want.

docs/dev.md

Lines changed: 96 additions & 0 deletions
@@ -0,0 +1,96 @@
# YourFunctionName

*Brief description of what this chat function does, from the developer perspective*

## Inputs

*Specific input parameters which can be supplied when calling this chat function.*

## Outputs

*Output schema/values for this function*

## Examples

*List of example inputs and outputs for this function, each under a different sub-heading*

## Testing the Chat Function

To test your function, you can either call the code directly through a Python script, or build the chat function's Docker container locally and call it through an API request. Details on both processes follow.

### Run the Chat Script

You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.

```bash
python src/module.py
```

You can also use the `testbench_agents.py` script to test the agents with example inputs from Lambda Feedback questions and synthetic conversations.

```bash
python src/agents/utils/testbench_agents.py
```

### Calling the Docker Image Locally

To build the Docker image, run the following command:

```bash
docker build -t llm_chat .
```

#### Running the Docker Image

To run the Docker image, use the following command:

##### A. Without .env file:

```bash
docker run -e OPENAI_API_KEY={your key} -e OPENAI_MODEL={your chosen LLM model name} -p 8080:8080 llm_chat
```

##### B. With a container name (for interaction, e.g. copying a file from inside the Docker container):

```bash
docker run --env-file .env -it --name my-lambda-container -p 8080:8080 llm_chat
```

This will start the chat function, expose it on port `8080`, and leave it open to be called with curl:

```bash
curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' \
--header 'Content-Type: application/json' \
--data '{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}'
```

#### Call Docker Container

##### A. Call Docker with Python Requests

In the `src/agents/utils` folder you can find the `requests_testscript.py` script, which calls the POST URL of the running Docker container. It reads input files that follow the expected schema, so you can use it to test your curl calls to the chatbot.

##### B. Call Docker Container through an API Request

POST URL:

```bash
http://localhost:8080/2015-03-31/functions/function/invocations
```

Body (stringified within `body` for the API request):

```JSON
{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}
```

Body with optional Params:

```JSON
{
  "message": "hi",
  "params": {
    "conversation_id": "12345Test",
    "conversation_history": [{"type": "user", "content": "hi"}],
    "summary": " ",
    "conversational_style": " ",
    "question_response_details": "",
    "include_test_data": true,
    "agent_type": "{agent_name}"
  }
}
```
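The stringification the endpoint expects can be reproduced in Python. This is a hypothetical sketch; the agent name `"example_agent"` is a placeholder for whatever agent your repository defines:

```python
# Hypothetical sketch: build the "Body with optional Params" example
# in the wrapped, stringified form the Lambda endpoint expects.
# "example_agent" is a placeholder agent name.
import json

inner = {
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
        "summary": " ",
        "conversational_style": " ",
        "question_response_details": "",
        "include_test_data": True,
        "agent_type": "example_agent",
    },
}

# The inner object must be JSON-encoded as a string under the "body" key.
wrapped = {"body": json.dumps(inner)}
print(json.dumps(wrapped))
```

Printing `wrapped` yields the escaped one-line form shown in the simple body example above.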

docs/user.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
# YourChatFunctionName

Teacher- & Student-facing documentation for this function.
