> Installing Nitro will add new files and configurations to your system to enable it to run.

For a manual installation process, see: [Install from Source](install.md)
## Step 2: Downloading a Model
Next, we need to download a model. For this example, we'll use the [Llama2 7B chat model](https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/tree/main).
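The download commands themselves aren't shown in this excerpt; a minimal sketch might look like the following, assuming a `model/` working directory and the `Q5_K_M` quantization from that repository (both are illustrative choices, not fixed requirements):

```bash
# Create a directory for the model and move into it
mkdir model && cd model

# Fetch one of the GGUF files from the Llama2 7B chat repository;
# the Q5_K_M quantization is only an example, any variant listed on the page works
wget -O llama-2-7b-chat.gguf \
  "https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q5_K_M.gguf"
```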
Once Nitro is running with the model loaded, you can send it an inference request, for example asking it about the 2020 World Series winner.
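A request of that shape might look like the following minimal sketch; the host, port, and endpoint path here are placeholders rather than confirmed Nitro defaults, so substitute the values from your own setup:

```bash
# An OpenAI-style chat completion request; replace the host, port, and
# path with the values used by your Nitro instance.
curl http://localhost:3928/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Who won the World Series in 2020?"}
    ]
  }'
```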
As you can see, a key benefit of Nitro is its alignment with [OpenAI's API structure](https://platform.openai.com/docs/guides/text-generation?lang=curl). Its inference call syntax closely mirrors that of OpenAI's API, facilitating an easier shift for those accustomed to OpenAI's framework.