
Fake smile #1222


Description

@cashmeretwist5-lgtm

import React, { useState } from "react";

export default function AIApp() {
  const [messages, setMessages] = useState([]);
  const [input, setInput] = useState("");
  const [loading, setLoading] = useState(false);

  async function sendMessage() {
    if (!input.trim()) return;
    const newMsg = { role: "user", content: input };
    setMessages([...messages, newMsg]);
    setInput("");
    setLoading(true);

    // Dummy AI reply (replace with API call to your backend/LLM)
    setTimeout(() => {
      setMessages((prev) => [...prev, { role: "ai", content: `You said: ${newMsg.content}` }]);
      setLoading(false);
    }, 800);
  }

  return (
    <div className="flex flex-col h-screen max-w-md mx-auto p-4">
      {/* Message list: user messages align right, AI messages align left */}
      <div className="flex flex-col flex-1 gap-2 overflow-y-auto mb-4">
        {messages.map((m, i) => (
          <div
            key={i}
            className={`p-2 rounded-xl max-w-xs ${
              m.role === "user"
                ? "bg-blue-500 text-white self-end"
                : "bg-gray-200 text-black self-start"
            }`}
          >
            {m.content}
          </div>
        ))}
        {loading && <div className="text-gray-500 self-start">AI is thinking…</div>}
      </div>

      {/* Input row: text field plus Send button */}
      <div className="flex">
        <input
          className="flex-1 border rounded-l-xl p-2"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
        />
        <button className="bg-blue-500 text-white px-4 rounded-r-xl" onClick={sendMessage}>
          Send
        </button>
      </div>
    </div>
  );
}

I built you a starter AI chat app (React + Tailwind). It runs locally in the browser, with a simple chat interface and a dummy AI reply.
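To run it locally, here is a minimal entry-file sketch. It assumes the component above is saved as AIApp.jsx in a Vite-style React project with Tailwind already configured; the file names and import paths are assumptions, not part of the original snippet.

// main.jsx — hypothetical entry point for this sketch
import React from "react";
import { createRoot } from "react-dom/client";
import AIApp from "./AIApp";
import "./index.css"; // assumed to contain the Tailwind directives

createRoot(document.getElementById("root")).render(
  <React.StrictMode>
    <AIApp />
  </React.StrictMode>
);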

Next step options for you:

Hook it up to a real LLM API (like OpenAI, Anthropic, or a local model server); see the sketch after this list.

Add document upload + Q&A.

Deploy as a web app (Vercel/Netlify) or mobile app (React Native).
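For the first option, here is a minimal sketch of replacing the dummy setTimeout reply with a request to a backend. The "/api/chat" endpoint and the { reply } response field are assumptions; point them at whatever server you actually run (which in turn calls OpenAI, Anthropic, or a local model).

// Sketch: swap the setTimeout block inside sendMessage for a real request.
async function sendMessage() {
  if (!input.trim()) return;
  const newMsg = { role: "user", content: input };
  setMessages((prev) => [...prev, newMsg]);
  setInput("");
  setLoading(true);

  try {
    // Hypothetical backend route; adjust URL and payload to your server.
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages: [...messages, newMsg] }),
    });
    const data = await res.json();
    setMessages((prev) => [...prev, { role: "ai", content: data.reply }]);
  } catch (err) {
    // Surface failures in the chat instead of silently dropping them.
    setMessages((prev) => [...prev, { role: "ai", content: "Request failed: " + err.message }]);
  } finally {
    setLoading(false);
  }
}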

Want me to wire this starter app to a real backend so it becomes a functioning AI assistant?
