content/en/docs/marketplace/genai/concepts/_index.md (4 changes: 2 additions & 2 deletions)

@@ -88,7 +88,7 @@ Often, you can use prompt engineering, RAG, and ReAct to build your use case and

## Prompt Engineering {#prompt-engineering}

- Prompt engineering is the activity of designing the input text that will be send to the LLM. This typically contains input from the end-user, enriched with instructions from the developer / administrator. A prompt typically contains:
+ Prompt engineering is the activity of designing the input text that will be sent to the LLM. This typically contains input from the end-user, enriched with instructions from the developer / administrator. A prompt typically contains:

* instructions on what the model should do
* context and information that the model needs to follow the instructions
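
To illustrate how these components come together, here is a minimal sketch that assembles a prompt from developer-supplied instructions and context plus the end-user input. The strings, variable names, and template layout are hypothetical examples, not something taken from the GenAI modules themselves.

```python
# Illustrative sketch: combining prompt components before sending them to an LLM.
# The instruction, context, and question strings below are made-up examples.

instructions = "Answer the customer's question using only the provided context."
context = "Return policy: items can be returned within 30 days with a receipt."
user_input = "Can I return a jacket I bought three weeks ago?"

# The developer/administrator supplies the instructions and context;
# the end-user supplies the question. The combined text is the prompt.
prompt = (
    f"{instructions}\n\n"
    f"Context:\n{context}\n\n"
    f"Question: {user_input}"
)

print(prompt)
```
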
@@ -122,7 +122,7 @@ For example, Amazon Bedrock has the concept of [knowledge bases for Amazon Bedro

### PgVector Knowledge Base {#pgvectorknowledgebase}

- If your chosen architecture doesn't have fully-integrated RAG capabilities, or if you want tighter control of the RAG process, you can create and use your own knowledge base.
+ If your chosen architecture does not have fully-integrated RAG capabilities, or if you want tighter control of the RAG process, you can create and use your own knowledge base.

In this case you will have to index and store your knowledge yourself, and index your input data in order to retrieve the information with which you want to augment your prompt. For this you can use the [PgVector Knowledge Base module](/appstore/modules/genai/pgvector/) in combination with an embeddings model, to maintain and use your knowledge base.
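
As a rough illustration of that retrieve-and-augment flow, the sketch below queries a PostgreSQL table that uses the pgvector extension and appends the closest chunks to a prompt. The table name, column names, connection string, and the embed() helper are hypothetical, and the sketch does not describe the actual API of the PgVector Knowledge Base module.

```python
# Minimal retrieve-and-augment sketch against PostgreSQL with the pgvector
# extension. Table, columns, and embed() are hypothetical placeholders.
import psycopg2


def embed(text: str) -> list[float]:
    # Placeholder: call your embeddings model here and return its vector.
    raise NotImplementedError


def retrieve_context(question: str, top_k: int = 3) -> str:
    query_vector = embed(question)
    conn = psycopg2.connect("dbname=knowledge user=app")
    with conn, conn.cursor() as cur:
        # "<->" is pgvector's Euclidean distance operator; the chunks whose
        # embeddings are closest to the question vector are returned first.
        cur.execute(
            """
            SELECT chunk_text
            FROM knowledge_chunks
            ORDER BY embedding <-> %s::vector
            LIMIT %s
            """,
            (str(query_vector), top_k),
        )
        rows = cur.fetchall()
    return "\n".join(row[0] for row in rows)


def build_augmented_prompt(question: str) -> str:
    # Augment the prompt with the retrieved knowledge before calling the LLM.
    context = retrieve_context(question)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
```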
