A Hawkings tutor is a chat that knows the lesson. It has read the
lesson content and the activities, and it answers in the language and
tone you configure on the parent course.
This guide shows you how to wire one up in ~15 lines of client code.
What you’ll build
A chat panel next to a lesson. The student types a question; the tutor
streams a grounded answer; the student can keep going.
1. Start a thread
A thread is the unit of conversation. You start one per
(student, activity) pair:
const thread = await hk.activities.tutor.start({
  activity_id: "act_123",
  student_id: "usr_42",
});

// thread.id → "thr_01HX9..."
// thread.activity_id
// thread.created_at
Persist thread.id somewhere keyed on (student, activity). Reuse it
across page loads.
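One way to do this is a small keyed store. This is a hypothetical in-memory sketch (the `threadIds` Map and helper names are illustrative, not part of the SDK); in a real app you would back it with localStorage, a cookie, or a database column.

```typescript
// Hypothetical thread-id store keyed on (student, activity).
// Swap the Map for localStorage or a DB table in production.
const threadIds = new Map<string, string>();

function threadKey(studentId: string, activityId: string): string {
  return `${studentId}:${activityId}`;
}

// Remember a freshly started thread for this pair.
function rememberThread(studentId: string, activityId: string, threadId: string): void {
  threadIds.set(threadKey(studentId, activityId), threadId);
}

// Look up an existing thread; undefined means you should start a new one.
function findThread(studentId: string, activityId: string): string | undefined {
  return threadIds.get(threadKey(studentId, activityId));
}
```

On page load, call `findThread` first and only fall back to `tutor.start` on a miss.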
2. Send a message
const reply = await hk.activities.tutor.send(thread.id, {
  message: "Why does time dilate at high speeds?",
});
console.log(reply.text);
For streaming responses (recommended for UI):
const stream = hk.activities.tutor.sendStream(thread.id, {
  message: "Why does time dilate at high speeds?",
});

for await (const chunk of stream) {
  process.stdout.write(chunk.text_delta);
}
The chunk object also carries citations — pointers into the lesson
content the tutor used. Render them as footnote-style links next to the
streamed text.
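The exact citation fields are not shown in this guide, so the shape below (`id`, `title`) is an assumption. With that caveat, footnote-style rendering can be sketched as a pure accumulator over the chunks:

```typescript
// Assumed chunk shape — citation field names are placeholders,
// not the documented SDK types.
interface TutorChunk {
  text_delta: string;
  citations?: { id: string; title: string }[];
}

// Accumulates streamed deltas and appends footnote markers ([1], [2], ...)
// numbered by each citation's first appearance.
function renderWithFootnotes(chunks: TutorChunk[]): { text: string; footnotes: string[] } {
  let text = "";
  const seen = new Map<string, number>();
  const footnotes: string[] = [];
  for (const chunk of chunks) {
    text += chunk.text_delta;
    for (const c of chunk.citations ?? []) {
      if (!seen.has(c.id)) {
        seen.set(c.id, footnotes.push(c.title)); // push returns the 1-based index
      }
      text += `[${seen.get(c.id)}]`;
    }
  }
  return { text, footnotes };
}
```

In a real UI you would render `footnotes` as links below the message rather than appending plain markers.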
3. Read the history
When the student returns later, replay the thread:
const history = await hk.activities.tutor.history("act_123", {
  student_id: "usr_42",
});

for (const msg of history.messages) {
  console.log(msg.role, msg.text);
}
4. Wire the UI
Minimal React example:
"use client";
import { useEffect, useMemo, useState } from "react";
import { Hawkings } from "@hawkings/sdk";

export function TutorChat({ activityId, studentId }: { activityId: string; studentId: string }) {
  // Instantiate the client once, not on every render.
  const hk = useMemo(
    () => new Hawkings({ api_key: process.env.NEXT_PUBLIC_HAWKINGS_SESSION_KEY! }),
    [],
  );
  const [threadId, setThreadId] = useState<string | null>(null);
  const [messages, setMessages] = useState<{ role: string; text: string }[]>([]);

  useEffect(() => {
    hk.activities.tutor
      .start({ activity_id: activityId, student_id: studentId })
      .then(t => setThreadId(t.id));
  }, [hk, activityId, studentId]);

  async function ask(input: string) {
    if (!threadId) return;
    // Append the user message plus an empty assistant message to fill in.
    setMessages(m => [...m, { role: "user", text: input }, { role: "assistant", text: "" }]);
    const stream = hk.activities.tutor.sendStream(threadId, { message: input });
    for await (const chunk of stream) {
      setMessages(m => {
        const next = [...m];
        next[next.length - 1] = { role: "assistant", text: next.at(-1)!.text + chunk.text_delta };
        return next;
      });
    }
  }

  return (
    <div className="tutor">
      {messages.map((m, i) => <div key={i} data-role={m.role}>{m.text}</div>)}
      <input
        onKeyDown={e => {
          if (e.key === "Enter") {
            ask((e.target as HTMLInputElement).value);
            (e.target as HTMLInputElement).value = "";
          }
        }}
      />
    </div>
  );
}
How tutor grounding works
The tutor’s context window is built from:
- The parent course’s ai.instructions (tone, voice, scope).
- The lesson content and activities.
- The student’s prior thread messages (truncated by token budget).
It does not see other students’ submissions, other lessons in the
course, or the rest of the platform. That isolation is intentional —
it’s what makes the tutor predictable.
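The truncation step above can be pictured as keeping the newest messages that fit a budget. This is illustrative only: the real token budget and tokenizer are internal to Hawkings, and the ~4-characters-per-token estimate below is a rough assumption.

```typescript
// Illustrative sketch of token-budget truncation, newest-first.
// The 4-chars-per-token ratio is a crude approximation, not the SDK's.
interface Msg {
  role: string;
  text: string;
}

function fitToBudget(messages: Msg[], tokenBudget: number): Msg[] {
  const kept: Msg[] = [];
  let used = 0;
  // Walk from newest to oldest; stop once the budget is exhausted.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = Math.ceil(messages[i].text.length / 4);
    if (used + cost > tokenBudget) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}
```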
For grounded research-augmented tutoring, attach a research artefact:
await hk.activities.tutor.start({
  activity_id: activityId,
  student_id: studentId,
  research_id: "res_...", // optional
});
Use the token flow so that the session key in the browser is scoped to a single student:
// Server
const { token } = await hk.auth.tokenStart({ email: student.email });
res.json({ token });
// Client
const { api_key } = await hk.auth.tokenFinish(token);
const hk = new Hawkings({ api_key });
Cost & limits
- ~$0.02 per student-question on default settings.
- Threads have a soft cap of 50 messages; over that, the SDK summarises
the oldest messages automatically.
- A tutor.send call returns within 5–15 seconds; sendStream starts emitting within 1 second.
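The soft-cap behaviour can be sketched as folding the oldest messages into one summary message. This models what the SDK is described as doing, not its actual implementation; the `summarise` callback is a placeholder.

```typescript
// Sketch of the described soft-cap behaviour: once a thread exceeds the
// cap, collapse the oldest messages into a single summary message.
// `summarise` is a hypothetical callback, not an SDK function.
interface Msg {
  role: string;
  text: string;
}

const SOFT_CAP = 50;

function compactThread(messages: Msg[], summarise: (old: Msg[]) => string): Msg[] {
  if (messages.length <= SOFT_CAP) return messages;
  const overflow = messages.length - SOFT_CAP + 1; // +1 leaves room for the summary itself
  const oldest = messages.slice(0, overflow);
  return [{ role: "system", text: summarise(oldest) }, ...messages.slice(overflow)];
}
```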