Templates & Starter Kits
Jumpstart your consciousness interface projects with professionally crafted templates and starter kits
Template Preview
Consciousness Dashboard
starter-kit: Complete dashboard template with AI reasoning primitives, voice integration, and data visualization
Features:
- Complete dashboard
- Voice integration
- AI reasoning
- Responsive design
Template Code
// consciousness-dashboard-template
import { CognitiveGraph, ThoughtStack, RealtimeVoiceSession } from "@/components"
import { Brain } from "lucide-react"
import { useState } from "react"

export default function ConsciousnessDashboard() {
  const [thoughts, setThoughts] = useState([])
  const [isVoiceActive, setIsVoiceActive] = useState(false)
  // Graph data for the cognitive map; start empty and populate from your own source
  const [nodes, setNodes] = useState([])
  const [edges, setEdges] = useState([])

  return (
    <div className="min-h-screen bg-gradient-to-br from-background to-muted/20">
      <header className="border-b bg-background/95 backdrop-blur supports-[backdrop-filter]:bg-background/60">
        <div className="container flex h-14 items-center">
          <div className="mr-4 hidden md:flex">
            <a className="mr-6 flex items-center space-x-2" href="/">
              <Brain className="h-6 w-6" />
              <span className="hidden font-bold sm:inline-block">
                Consciousness Dashboard
              </span>
            </a>
          </div>
        </div>
      </header>
      <main className="container py-6">
        <div className="grid gap-6 lg:grid-cols-2">
          <CognitiveGraph
            nodes={nodes}
            edges={edges}
            onNodeClick={(node) => console.log(node)}
          />
          <ThoughtStack
            thoughts={thoughts}
            editable
            onThoughtsChange={setThoughts}
          />
        </div>
        <div className="mt-6">
          <RealtimeVoiceSession
            config={{
              model: "gpt-4o-realtime-preview",
              voice: "alloy",
              vadEnabled: true
            }}
            onConnect={() => setIsVoiceActive(true)}
            onDisconnect={() => setIsVoiceActive(false)}
          />
        </div>
      </main>
    </div>
  )
}
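The dashboard template above renders `CognitiveGraph` from `nodes` and `edges` state but leaves populating them to you. One plain-JavaScript way to derive a simple chain graph from an ordered list of thoughts is sketched below; the `{ id, label }` node shape and `{ source, target }` edge shape are assumptions about what `CognitiveGraph` expects, not a documented contract.

```javascript
// Build a linear graph from an ordered thought list: one node per thought,
// one edge connecting each thought to the next.
function thoughtsToGraph(thoughts) {
  const nodes = thoughts.map((text, i) => ({ id: String(i), label: text }))
  // Each node after the first gets an edge from its predecessor
  const edges = nodes.slice(1).map((node, i) => ({ source: String(i), target: node.id }))
  return { nodes, edges }
}
```

In the template you could call this inside `onThoughtsChange` and feed the result to `setNodes`/`setEdges`.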
Voice Chat Application
component-template: Real-time voice chat application with multi-agent support and conversation history
Features:
- Multi-agent support
- Real-time voice
- Conversation history
- Audio visualization
Template Code
// voice-chat-app-template
import {
  RealtimeVoiceSession,
  VoiceAgentOrchestrator,
  VoiceConversationHistory,
  VoiceAudioVisualizer
} from "@/components/voice"
import { useState } from "react"

export default function VoiceChatApp() {
  const [isConnected, setIsConnected] = useState(false)
  const [currentAgent, setCurrentAgent] = useState("general")
  const [messages, setMessages] = useState([])

  const agents = [
    {
      id: "general",
      name: "General Assistant",
      description: "Handles general queries",
      instructions: "You are a helpful assistant."
    },
    {
      id: "technical",
      name: "Technical Support",
      description: "Specialized in technical help",
      instructions: "You are a technical support specialist."
    }
  ]

  return (
    <div className="min-h-screen bg-background">
      <div className="container mx-auto px-4 py-8">
        <div className="grid gap-8 lg:grid-cols-2">
          <div className="space-y-6">
            <RealtimeVoiceSession
              config={{
                model: "gpt-4o-realtime-preview",
                voice: "alloy",
                vadEnabled: true
              }}
              onConnect={() => setIsConnected(true)}
              onDisconnect={() => setIsConnected(false)}
              onTranscript={(text, isFinal) => {
                if (isFinal) {
                  setMessages(prev => [...prev, {
                    id: Date.now(),
                    role: "user",
                    content: text,
                    timestamp: new Date()
                  }])
                }
              }}
            />
            <VoiceAgentOrchestrator
              agents={agents}
              currentAgentId={currentAgent}
              onAgentChange={setCurrentAgent}
            />
          </div>
          <div className="space-y-6">
            <VoiceConversationHistory messages={messages} />
            <VoiceAudioVisualizer
              isActive={isConnected}
              variant="bars"
            />
          </div>
        </div>
      </div>
    </div>
  )
}
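The voice chat template above keeps `currentAgent` in state but never switches it automatically. A small routing rule like the following could feed `VoiceAgentOrchestrator`'s `onAgentChange` from the final transcript; the keyword table is purely illustrative and not part of the template's API.

```javascript
// Pick an agent id based on keywords in the user's transcript,
// falling back to the general assistant when nothing matches.
function routeAgent(transcript, fallback = "general") {
  const rules = [
    { id: "technical", keywords: ["error", "bug", "install", "crash"] }
  ]
  const text = transcript.toLowerCase()
  const match = rules.find(rule => rule.keywords.some(k => text.includes(k)))
  return match ? match.id : fallback
}
```

Wiring it in would look like `onTranscript={(text, isFinal) => { if (isFinal) setCurrentAgent(routeAgent(text)) }}`.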
Data Insights Widget
widget-template: Embeddable data insights widget with semantic analysis and pattern recognition
Features:
- Semantic analysis
- Pattern detection
- Temporal visualization
- Embeddable
Template Code
// data-insights-widget-template
import { SemanticTable, PatternSynth, TemporalMirror } from "@/components"
import { useState, useEffect } from "react"

export default function DataInsightsWidget({ data, onInsight }) {
  const [patterns, setPatterns] = useState([])
  const [insights, setInsights] = useState([])

  useEffect(() => {
    if (!data) return
    let cancelled = false
    // Analyze data for patterns; discard results if data changed meanwhile
    const analyzeData = async () => {
      const detectedPatterns = await PatternSynth.analyze(data)
      if (cancelled) return
      setPatterns(detectedPatterns)
      const generatedInsights = await SemanticTable.generateInsights(data)
      if (!cancelled) setInsights(generatedInsights)
    }
    analyzeData()
    return () => { cancelled = true }
  }, [data])

  return (
    <div className="w-full max-w-4xl mx-auto p-6 bg-card rounded-lg border">
      <div className="mb-6">
        <h2 className="text-2xl font-bold mb-2">Data Insights</h2>
        <p className="text-muted-foreground">
          AI-powered analysis of your data patterns and trends
        </p>
      </div>
      <div className="grid gap-6 lg:grid-cols-2">
        <SemanticTable
          data={data}
          columns={[
            { key: "metric", label: "Metric", type: "text" },
            { key: "value", label: "Value", type: "number" },
            { key: "trend", label: "Trend", type: "badge" }
          ]}
          onRowClick={(row) => onInsight?.(row)}
        />
        <PatternSynth
          data={data}
          patterns={patterns}
          onPatternSelect={(pattern) => onInsight?.(pattern)}
        />
      </div>
      <div className="mt-6">
        <TemporalMirror
          data={data}
          timeRange="6months"
          onTimeChange={(time) => console.log('Time changed:', time)}
        />
      </div>
    </div>
  )
}
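The widget above delegates pattern detection to `PatternSynth.analyze`, whose output feeds the `trend` badge column. As a minimal stand-in for the kind of classification such an analyzer might produce, here is a tiny pure function (our addition, not the library's algorithm) that labels a numeric series by its overall direction:

```javascript
// Classify a numeric series as "rising", "falling", or "flat"
// based on the difference between its last and first values.
function detectTrend(values) {
  if (values.length < 2) return "flat"
  const delta = values[values.length - 1] - values[0]
  if (delta > 0) return "rising"
  if (delta < 0) return "falling"
  return "flat"
}
```

A real analyzer would of course consider noise, seasonality, and intermediate points, but this shape of output (a label per series) is all the `trend` column needs.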
Emotive Profile Card
component-template: Interactive profile card that responds to emotional context and user mood
Features:
- Mood detection
- Empathy levels
- Interactive elements
- Responsive design
Template Code
// emotive-profile-card-template
import { MoodMesh, EmpathyBar, InnerVoice } from "@/components"
import { useState } from "react"

export default function EmotiveProfileCard({ user, onMoodChange }) {
  const [mood, setMood] = useState(user.mood || "neutral")
  const [empathy, setEmpathy] = useState(0.7)

  const handleMoodChange = (newMood) => {
    setMood(newMood)
    onMoodChange?.(newMood)
  }

  return (
    <div className="w-full max-w-md mx-auto bg-card rounded-lg border p-6">
      <div className="text-center mb-6">
        <div className="w-20 h-20 mx-auto mb-4 rounded-full bg-gradient-to-br from-primary/20 to-purple-500/20 flex items-center justify-center">
          <span className="text-2xl font-bold text-primary">
            {user.name?.charAt(0) || "U"}
          </span>
        </div>
        <h3 className="text-xl font-semibold mb-1">{user.name}</h3>
        <p className="text-muted-foreground text-sm">{user.role}</p>
      </div>
      <div className="space-y-4">
        <div className="flex items-center justify-between">
          <span className="text-sm font-medium">Current Mood</span>
          <MoodMesh
            mood={mood}
            onMoodChange={handleMoodChange}
            size="sm"
          />
        </div>
        <div className="space-y-2">
          <div className="flex items-center justify-between">
            <span className="text-sm font-medium">Empathy Level</span>
            <span className="text-sm text-muted-foreground">
              {Math.round(empathy * 100)}%
            </span>
          </div>
          <EmpathyBar
            level={empathy}
            onLevelChange={setEmpathy}
          />
        </div>
        <div className="pt-4 border-t">
          <InnerVoice
            emotion={mood}
            onEmotionChange={handleMoodChange}
            showSuggestions
          />
        </div>
      </div>
    </div>
  )
}
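The card above treats empathy as a 0–1 level and displays it as `Math.round(empathy * 100)%`. If you drive `setEmpathy` programmatically (e.g. from mood detection), a small guard keeps the value in range; the clamp and formatter below are our additions, not part of `EmpathyBar`'s API.

```javascript
// Keep an empathy level within the 0–1 range EmpathyBar expects.
function clampEmpathy(level) {
  return Math.min(1, Math.max(0, level))
}

// Format a level the same way the card renders it, e.g. 0.7 -> "70%".
function formatEmpathy(level) {
  return `${Math.round(clampEmpathy(level) * 100)}%`
}
```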
Predictive Search Interface
component-template: Smart search interface that predicts user intent and provides intelligent suggestions
Features:
- Intent prediction
- Smart suggestions
- Outcome simulation
- Real-time updates
Template Code
// predictive-search-template
import { IntentPredictorField, SmartPlaceholder, OutcomeSimulator } from "@/components"
import { useState, useEffect } from "react"

export default function PredictiveSearchInterface({ onSearch, suggestions = [] }) {
  const [query, setQuery] = useState("")
  const [predictions, setPredictions] = useState([])
  const [outcomes, setOutcomes] = useState([])
  const [isLoading, setIsLoading] = useState(false)

  useEffect(() => {
    if (query.length > 2) {
      setIsLoading(true)
      // Simulate a prediction API call; clear the timer if the query changes
      // before it fires, so stale results never overwrite fresh ones
      const timer = setTimeout(() => {
        const mockPredictions = [
          { id: "1", text: "Search for components", confidence: 0.9 },
          { id: "2", text: "Find documentation", confidence: 0.8 },
          { id: "3", text: "Browse examples", confidence: 0.7 }
        ]
        setPredictions(mockPredictions)
        setIsLoading(false)
      }, 500)
      return () => clearTimeout(timer)
    }
    setPredictions([])
  }, [query])

  const handleSearch = (searchQuery) => {
    setQuery(searchQuery)
    onSearch?.(searchQuery)
  }

  return (
    <div className="w-full max-w-2xl mx-auto">
      <div className="relative">
        <IntentPredictorField
          value={query}
          onChange={setQuery}
          onIntentChange={setPredictions}
          placeholder="What are you looking for?"
          className="w-full"
        />
        {isLoading && (
          <div className="absolute right-3 top-1/2 transform -translate-y-1/2">
            <div className="animate-spin rounded-full h-4 w-4 border-b-2 border-primary"></div>
          </div>
        )}
      </div>
      {predictions.length > 0 && (
        <div className="mt-2 space-y-1">
          {predictions.map((prediction) => (
            <button
              key={prediction.id}
              onClick={() => handleSearch(prediction.text)}
              className="w-full text-left p-2 rounded-md hover:bg-muted transition-colors"
            >
              <div className="flex items-center justify-between">
                <span className="text-sm">{prediction.text}</span>
                <span className="text-xs text-muted-foreground">
                  {Math.round(prediction.confidence * 100)}%
                </span>
              </div>
            </button>
          ))}
        </div>
      )}
      <div className="mt-6">
        <OutcomeSimulator
          data={{ query, predictions }}
          onOutcomesChange={setOutcomes}
        />
      </div>
    </div>
  )
}
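The mock predictions above happen to arrive pre-sorted, but a real intent predictor may return results in any order and at any confidence. Before rendering the suggestion list, you might filter and rank them with a helper like this (our addition; the threshold value is an arbitrary choice):

```javascript
// Drop low-confidence predictions and sort the rest highest-first,
// matching the order the suggestion buttons should appear in.
function rankPredictions(predictions, minConfidence = 0.5) {
  return predictions
    .filter(p => p.confidence >= minConfidence)
    .sort((a, b) => b.confidence - a.confidence)
}
```

In the template, wrap the callback: `onIntentChange={(preds) => setPredictions(rankPredictions(preds))}`.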
Meta Component Library
library-template: Self-monitoring component library with performance tracking and accessibility scanning
Features:
- Performance monitoring
- Usage tracking
- Accessibility scanning
- Cognitive load analysis
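Before the full template, the cognitive-load idea can be reduced to a tiny scoring sketch: combine a few normalized signals into the single 0–1 figure the status panel displays. The signal names, weights, and normalization ceilings below are all assumptions for illustration, not `CognitiveLoadMonitor`'s actual model.

```javascript
// Weighted, clamped average of hypothetical load signals.
// Each signal is normalized against an assumed "saturation" ceiling,
// then combined so the result stays in [0, 1].
function cognitiveLoadScore(metrics) {
  const weights = { renderMs: 0.5, interactionMs: 0.3, a11yIssues: 0.2 }
  const normalized = {
    renderMs: Math.min(1, (metrics.renderMs ?? 0) / 100),
    interactionMs: Math.min(1, (metrics.interactionMs ?? 0) / 200),
    a11yIssues: Math.min(1, (metrics.a11yIssues ?? 0) / 10)
  }
  return Object.keys(weights).reduce((sum, key) => sum + weights[key] * normalized[key], 0)
}
```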
Template Code
// meta-component-library-template
import {
  UXOracle,
  ComponentArchaeologist,
  AccessibilityOracle,
  CognitiveLoadMonitor
} from "@/components"
import { useState, useEffect } from "react"

export default function MetaComponentLibrary() {
  const [metrics, setMetrics] = useState({})
  const [componentHistory, setComponentHistory] = useState([])
  const [a11yIssues, setA11yIssues] = useState([])
  const [cognitiveLoad, setCognitiveLoad] = useState(0)

  useEffect(() => {
    // Initialize performance monitoring
    const observer = new PerformanceObserver((list) => {
      const entries = list.getEntries()
      entries.forEach((entry) => {
        if (entry.entryType === 'measure') {
          setMetrics(prev => ({
            ...prev,
            [entry.name]: entry.duration
          }))
        }
      })
    })
    observer.observe({ entryTypes: ['measure'] })

    // Initialize accessibility scanning
    const scanAccessibility = () => {
      const issues = document.querySelectorAll('[aria-invalid="true"]')
      setA11yIssues(Array.from(issues))
    }
    scanAccessibility()
    const interval = setInterval(scanAccessibility, 5000)

    return () => {
      observer.disconnect()
      clearInterval(interval)
    }
  }, [])

  return (
    <div className="min-h-screen bg-background">
      <div className="container mx-auto px-4 py-8">
        <div className="mb-8">
          <h1 className="text-3xl font-bold mb-2">Meta Component Library</h1>
          <p className="text-muted-foreground">
            Self-monitoring components that track their own performance and usage
          </p>
        </div>
        <div className="grid gap-6 lg:grid-cols-2">
          <UXOracle
            metrics={metrics}
            onRecommendations={(recs) => console.log('UX Recommendations:', recs)}
          />
          <ComponentArchaeologist
            onHistoryChange={setComponentHistory}
            maxHistory={100}
          />
          <AccessibilityOracle
            onIssuesFound={setA11yIssues}
            autoScan={true}
            scanInterval={5000}
          />
          <CognitiveLoadMonitor
            onLoadChange={setCognitiveLoad}
            threshold={0.8}
          />
        </div>
        <div className="mt-8 p-6 bg-muted/50 rounded-lg">
          <h3 className="text-lg font-semibold mb-4">System Status</h3>
          <div className="grid gap-4 sm:grid-cols-2 lg:grid-cols-4">
            <div className="text-center">
              <div className="text-2xl font-bold text-green-500">
                {Object.keys(metrics).length}
              </div>
              <div className="text-sm text-muted-foreground">Metrics Tracked</div>
            </div>
            <div className="text-center">
              <div className="text-2xl font-bold text-blue-500">
                {componentHistory.length}
              </div>
              <div className="text-sm text-muted-foreground">Components Used</div>
            </div>
            <div className="text-center">
              <div className="text-2xl font-bold text-orange-500">
                {a11yIssues.length}
              </div>
              <div className="text-sm text-muted-foreground">A11y Issues</div>
            </div>
            <div className="text-center">
              <div className="text-2xl font-bold text-purple-500">
                {Math.round(cognitiveLoad * 100)}%
              </div>
              <div className="text-sm text-muted-foreground">Cognitive Load</div>
            </div>
          </div>
        </div>
      </div>
    </div>
  )
}

Can't Find What You're Looking For?
Create your own template or request a custom one from our community.