
Conversation

@derrekcoleman
Contributor

@derrekcoleman derrekcoleman commented Oct 23, 2025

- Update landing page to an AI chat
- Suggest questions to help users discover what to ask
- Offer follow-up questions relevant to the conversation as it unfolds
- Update existing "Ask AI" pop-up styling and functionality to match

Landing page:
Screenshot 2025-10-23 at 9 03 28 AM

Ask AI button pop-up:
Screenshot 2025-10-23 at 9 03 38 AM

2025-10-23.09-02-39.mp4

@vercel
Contributor

vercel bot commented Oct 23, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| docs | Ready | Preview | Comment | Nov 10, 2025 10:34pm |

Comment on lines +5 to +16
export const SUGGESTED_QUESTIONS = [
"Where can I buy RECALL tokens?",
"How do AI competitions work on Recall?",
"What are skill markets?",
"How do I enter my first competition?",
"How can I stake RECALL tokens?",
"What is boosting and how does it work?",
"What rewards can I earn on Recall?",
"How do agents prove their performance?",
"What makes Recall decentralized?",
"How does staking relate to boosting?",
] as const;
Contributor Author


This is where we can hard-code the questions we want the initial suggestions and follow-up suggestions to draw from.
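A minimal sketch of how initial suggestions could be sampled from this hard-coded list; the `pickSuggestions` helper and the shortened array below are illustrative, not the PR's actual code:

```typescript
// Mirrors the shape of the SUGGESTED_QUESTIONS array in the review comment
// above (abbreviated here). The helper name is hypothetical.
const SUGGESTED_QUESTIONS = [
  "Where can I buy RECALL tokens?",
  "How do AI competitions work on Recall?",
  "What are skill markets?",
] as const;

// Return up to `count` distinct questions in random order.
function pickSuggestions(count: number): string[] {
  // Copy, then Fisher-Yates shuffle, so the source array stays untouched.
  const pool: string[] = [...SUGGESTED_QUESTIONS];
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, Math.min(count, pool.length));
}
```

Keeping the list `as const` lets downstream code treat each question as a string literal type while the sampling helper works on a plain mutable copy.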

@andrewxhill
Contributor

it's looking great!

Minor suggestion on the interface itself: bigger chat interface, smaller text around it, to "get eyes to the text box". If you have control over the color of the text box, a small delta from the background color would make it obvious what to do next. Note both cases here: chat box more primary, and color differentiation.
Screenshot 2025-11-05 at 3 48 08 PM
Screenshot 2025-11-05 at 3 47 59 PM

@derrekcoleman
Contributor Author

@andrewxhill wdyt? I'm reluctant to double the height of the box when we don't have model toggle, photo upload, or other buttons asking for it.

Screenshot 2025-11-06 at 3 57 33 PM Screenshot 2025-11-06 at 3 57 41 PM

@andrewxhill
Contributor

LGTM

update Ask AI styling and functionality to match
The generateContextualSuggestions function now receives the complete conversation history including the assistant's streamed response, making suggestions truly contextual to what was just answered.
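One way such a function could work is to rank the hard-coded questions by keyword overlap with the assistant's latest streamed reply. The sketch below is an assumption about the approach, not the PR's implementation; the `Message` shape, the abbreviated question list, and the scoring logic are all illustrative:

```typescript
// Hypothetical contextual-suggestion sketch: score each candidate question by
// how many of its words appear in the assistant's most recent message.
interface Message {
  role: "user" | "assistant";
  content: string;
}

const QUESTIONS = [
  "How can I stake RECALL tokens?",
  "What is boosting and how does it work?",
  "How does staking relate to boosting?",
];

function generateContextualSuggestions(history: Message[], limit = 2): string[] {
  // The PR notes the history now includes the assistant's streamed response,
  // so the last assistant message is what we match against.
  const lastAssistant = [...history].reverse().find((m) => m.role === "assistant");
  if (!lastAssistant) return QUESTIONS.slice(0, limit);

  const words = new Set(
    lastAssistant.content.toLowerCase().split(/\W+/).filter(Boolean)
  );
  return QUESTIONS
    .map((q) => ({
      q,
      score: q.toLowerCase().split(/\W+/).filter(Boolean)
        .filter((w) => words.has(w)).length,
    }))
    .sort((a, b) => b.score - a.score) // stable sort keeps original order on ties
    .slice(0, limit)
    .map((x) => x.q);
}
```

A real implementation might instead ask the model itself to propose follow-ups, but keyword ranking over a fixed pool keeps suggestions on-brand and free of hallucinated topics.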

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Contributor

@vercel vercel bot left a comment


Additional Suggestion:

The client-side stream parser will crash with a JSON parsing error when it encounters the data: [DONE] marker that the OpenAI API sends to signal the end of the stream, causing the entire response to fail with "Sorry, something went wrong."

📝 Patch Details
diff --git a/components/ai/engines/openai.ts b/components/ai/engines/openai.ts
index 8e47d93..92f7b5d 100644
--- a/components/ai/engines/openai.ts
+++ b/components/ai/engines/openai.ts
@@ -89,7 +89,15 @@ export async function createOpenAIEngine(): Promise<Engine> {
         for (const line of lines) {
           if (aborted || !line.trim()) continue;
 
-          const json = JSON.parse(line);
+          // Handle Server-Sent Events format
+          let jsonData = line;
+          if (line.startsWith("data: ")) {
+            const data = line.slice(6);
+            if (data === "[DONE]") continue;
+            jsonData = data;
+          }
+
+          const json = JSON.parse(jsonData);
           if ("choices" in json && Array.isArray(json.choices)) {
             const delta = (json as OpenAIResponse).choices[0]?.delta?.content || "";
             if (delta) {

Analysis

Client-side SSE parser crashes on OpenAI's data: [DONE] marker

What fails: The JSON.parse(line) call in components/ai/engines/openai.ts:92 attempts to parse Server-Sent Events formatted lines directly as JSON, causing crashes on both valid SSE messages and the data: [DONE] termination marker.

How to reproduce:

# A streamed content line arrives as: data: {"choices":[{"delta":{"content":"..."}}]}
# The stream terminator arrives as:  data: [DONE]
# Both cause JSON.parse to fail since they include the "data: " SSE prefix

Result: JSON.parse("data: [DONE]") throws SyntaxError: Unexpected token 'd', caught by the try-catch block at line 114, replacing successful AI responses with "Sorry, something went wrong."

Expected: Client should handle SSE format correctly by parsing only the content after "data: " prefix, as the server-side transform already does at app/api/chat/route.ts:179-182
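The patched parse loop can be lifted into a standalone function for testing; `extractDelta` is an illustrative name, but the prefix-stripping and `[DONE]` handling match the diff above:

```typescript
// Minimal sketch of the patched SSE line handling: strip the "data: " prefix,
// skip the [DONE] terminator, then JSON.parse the remainder. Returns the
// delta text, or null for lines that carry no content.
function extractDelta(line: string): string | null {
  if (!line.trim()) return null;

  // Handle Server-Sent Events format.
  let jsonData = line;
  if (line.startsWith("data: ")) {
    const data = line.slice(6);
    if (data === "[DONE]") return null; // termination marker, not JSON
    jsonData = data;
  }

  const json = JSON.parse(jsonData);
  if ("choices" in json && Array.isArray(json.choices)) {
    return json.choices[0]?.delta?.content ?? null;
  }
  return null;
}
```

Because non-`data:` lines fall through to `JSON.parse` unchanged, this also keeps working if the server-side transform at app/api/chat/route.ts already stripped the SSE framing.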

@derrekcoleman derrekcoleman merged commit 790d150 into main Nov 10, 2025
4 checks passed
@derrekcoleman derrekcoleman deleted the derrek/chat-landing-page branch November 10, 2025 23:10