# Examples & Demos

See consciousness interfaces in action with interactive examples and real-world use cases.
## Cognitive Dashboard

A complete dashboard showcasing AI reasoning primitives with real-time data visualization.

**Difficulty:** Intermediate · **Time:** 15 min
**Components:** CognitiveGraph, ThoughtStack, IntentWeaver
Features:
- Real-time updates
- Interactive graphs
- Editable thoughts
- Intent management
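The dashboard's thought records carry a `confidence` score, and a stack view typically shows the most confident thoughts first. The helper below is a hypothetical sketch of that ordering logic; the `Thought` shape is inferred from the example's data, not taken from the library's actual types.

```typescript
// Hypothetical shape of a thought record, inferred from the example data.
interface Thought {
  id: string;
  content: string;
  timestamp: Date;
  confidence: number; // assumed range: 0..1
}

// Return thoughts sorted most-confident first, without mutating the input.
function byConfidence(thoughts: Thought[]): Thought[] {
  return [...thoughts].sort((a, b) => b.confidence - a.confidence);
}

const sample: Thought[] = [
  { id: "2", content: "Prefer dark mode?", timestamp: new Date(), confidence: 0.6 },
  { id: "1", content: "User wants to build a responsive dashboard", timestamp: new Date(), confidence: 0.95 },
];
```

Keeping the sort pure (copying before sorting) matters in React, where mutating state arrays in place can suppress re-renders.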
### Code Example

```tsx
import { CognitiveGraph, ThoughtStack, IntentWeaver } from "@/components/ui"
import { useState } from "react"

export function CognitiveDashboard() {
  const [thoughts, setThoughts] = useState([
    {
      id: "1",
      content: "User wants to build a responsive dashboard",
      timestamp: new Date(),
      confidence: 0.95
    }
  ])

  const nodes = [
    { id: "1", label: "User Input", type: "data", confidence: 1.0 },
    { id: "2", label: "Parse Intent", type: "reasoning", confidence: 0.95 }
  ]

  const edges = [
    { from: "1", to: "2", strength: 1.0 }
  ]

  const intents = [
    { id: "1", text: "Create responsive layout", priority: 3, color: "#8b5cf6" }
  ]

  return (
    <div className="grid gap-6 lg:grid-cols-2">
      <CognitiveGraph nodes={nodes} edges={edges} />
      <ThoughtStack thoughts={thoughts} editable />
      <IntentWeaver intents={intents} />
    </div>
  )
}
```
## Voice Assistant Interface

Complete voice interaction system with OpenAI Realtime API integration.

**Difficulty:** Advanced · **Time:** 25 min
**Components:** RealtimeVoiceSession, VoiceAgentOrchestrator, VoiceToolExecutor
Features:
- Real-time voice
- Multi-agent support
- Tool execution
- Conversation history
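When the model issues a tool call, something has to route it to a matching handler. The dispatcher below is a minimal sketch of that pattern, using a stub `search` handler to match the tool declared in the example; the names and shapes are illustrative, not the component library's actual API.

```typescript
// Hypothetical tool handler signature; real handlers would likely be async.
type ToolHandler = (args: Record<string, unknown>) => string;

// Registry mapping tool names to their implementations.
const handlers: Record<string, ToolHandler> = {
  // Stub "search" handler matching the tool declared in the example below.
  search: (args) => `results for: ${String(args.query)}`,
};

// Route a model-issued tool call to its registered handler by name.
function executeToolCall(name: string, args: Record<string, unknown>): string {
  const handler = handlers[name];
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(args);
}
```

Failing loudly on unknown tool names is deliberate: silently ignoring a call the model believes succeeded tends to derail the conversation.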
### Code Example

```tsx
import {
  RealtimeVoiceSession,
  VoiceAgentOrchestrator,
  VoiceToolExecutor
} from "@/components/voice"
import { useState } from "react"

export function VoiceAssistantInterface() {
  const [isConnected, setIsConnected] = useState(false)
  const [currentAgent, setCurrentAgent] = useState("general")

  const agents = [
    {
      id: "general",
      name: "General Assistant",
      description: "Handles general queries",
      instructions: "You are a helpful assistant."
    }
  ]

  const tools = [
    {
      name: "search",
      description: "Search the web",
      parameters: {
        type: "object",
        properties: {
          query: { type: "string", description: "Search query" }
        }
      }
    }
  ]

  return (
    <div className="space-y-6">
      <RealtimeVoiceSession
        config={{
          model: "gpt-4o-realtime-preview",
          voice: "alloy",
          vadEnabled: true
        }}
        onConnect={() => setIsConnected(true)}
        onDisconnect={() => setIsConnected(false)}
      />
      <VoiceAgentOrchestrator
        agents={agents}
        currentAgentId={currentAgent}
        onAgentChange={setCurrentAgent}
      />
      <VoiceToolExecutor
        tools={tools}
        autoExecute={false}
        requireApproval={true}
      />
    </div>
  )
}
```
## Data Insights Platform

Intelligent data visualization with pattern recognition and semantic understanding.

**Difficulty:** Intermediate · **Time:** 20 min
**Components:** SemanticTable, PatternSynth, TemporalMirror
Features:
- Semantic analysis
- Pattern detection
- Temporal visualization
- Interactive exploration
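Detected patterns carry a confidence score, and a UI usually surfaces only those above some threshold. The filter below is a hypothetical sketch of that step; the `Pattern` shape is inferred from the example's data, and the default threshold of 0.8 is an illustrative choice, not a library default.

```typescript
// Hypothetical pattern record, inferred from the example data.
interface Pattern {
  id: string;
  name: string;
  confidence: number; // assumed range: 0..1
  type: string;
}

// Keep only patterns at or above a confidence threshold.
// The 0.8 default is illustrative; tune it per use case.
function significantPatterns(patterns: Pattern[], threshold = 0.8): Pattern[] {
  return patterns.filter((p) => p.confidence >= threshold);
}
```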
### Code Example

```tsx
import { SemanticTable, PatternSynth, TemporalMirror } from "@/components/ui"
import { useState } from "react"

export function DataInsightsPlatform() {
  const [data, setData] = useState([
    { id: 1, name: "Sales Q1", value: 150000, trend: "up" },
    { id: 2, name: "Sales Q2", value: 180000, trend: "up" }
  ])

  const columns = [
    { key: "name", label: "Metric", type: "text" },
    { key: "value", label: "Value", type: "currency" },
    { key: "trend", label: "Trend", type: "badge" }
  ]

  const patterns = [
    { id: "1", name: "Seasonal Growth", confidence: 0.85, type: "trend" }
  ]

  return (
    <div className="space-y-6">
      <div className="grid gap-6 lg:grid-cols-2">
        <SemanticTable
          data={data}
          columns={columns}
          onRowClick={(row) => console.log(row)}
        />
        <PatternSynth
          data={data.map(d => d.value)}
          patterns={patterns}
          onPatternSelect={(pattern) => console.log(pattern)}
        />
      </div>
      <TemporalMirror
        data={data}
        timeRange="6months"
        onTimeChange={(time) => console.log(time)}
      />
    </div>
  )
}
```
## Emotive Chat Interface

Chat interface that understands and responds to emotional context.

**Difficulty:** Beginner · **Time:** 10 min
**Components:** MoodMesh, EmpathyBar, InnerVoice
Features:
- Mood detection
- Empathy levels
- Emotional responses
- Context awareness
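The example stores empathy as a fraction (initially `0.7`), and sliders or programmatic updates can easily push such a value out of range. A small clamp like this keeps the level valid before handing it to the UI; the 0..1 range is an assumption based on the example's initial value, not a documented contract.

```typescript
// Clamp an empathy level into the assumed 0..1 range before passing it
// to a component like EmpathyBar. (Range is inferred from the example.)
function clampEmpathy(level: number): number {
  return Math.min(1, Math.max(0, level));
}
```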
### Code Example

```tsx
import { MoodMesh, EmpathyBar, InnerVoice } from "@/components/ui"
import { useState } from "react"

export function EmotiveChatInterface() {
  const [mood, setMood] = useState("curious")
  const [empathy, setEmpathy] = useState(0.7)
  const [messages, setMessages] = useState([
    { id: "1", text: "Hello! How are you feeling today?", emotion: "friendly" }
  ])

  return (
    <div className="space-y-6">
      <div className="flex items-center gap-4">
        <MoodMesh
          mood={mood}
          onMoodChange={setMood}
        />
        <EmpathyBar
          level={empathy}
          onLevelChange={setEmpathy}
        />
      </div>
      <div className="space-y-4">
        {messages.map((message) => (
          <div key={message.id} className="p-4 rounded-lg bg-muted">
            <p>{message.text}</p>
            <InnerVoice
              emotion={message.emotion}
              onEmotionChange={(emotion) =>
                setMessages(prev => prev.map(m =>
                  m.id === message.id ? { ...m, emotion } : m
                ))
              }
            />
          </div>
        ))}
      </div>
    </div>
  )
}
```
## Predictive Form Builder

Smart forms that predict user intent and adapt in real-time.

**Difficulty:** Advanced · **Time:** 30 min
**Components:** IntentPredictorField, SmartPlaceholder, OutcomeSimulator
Features:
- Intent prediction
- Smart placeholders
- Outcome simulation
- Real-time adaptation
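A smart placeholder ultimately has to pick one suggestion out of a list of scored predictions. The helper below sketches that selection; the `Prediction` shape (a `value`/`score` pair) is a hypothetical assumption for illustration, since the example never shows what the prediction objects look like.

```typescript
// Hypothetical prediction record; the library's actual shape may differ.
interface Prediction {
  value: string;
  score: number; // assumed: higher means more likely
}

// Pick the best placeholder suggestion from a prediction list, if any.
function topPrediction(predictions: Prediction[]): string | undefined {
  if (predictions.length === 0) return undefined;
  return [...predictions].sort((a, b) => b.score - a.score)[0].value;
}
```

Returning `undefined` for an empty list lets the caller fall back to a static placeholder instead of rendering an empty suggestion.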
### Code Example

```tsx
import {
  IntentPredictorField,
  SmartPlaceholder,
  OutcomeSimulator
} from "@/components/ui"
import { useState } from "react"

export function PredictiveFormBuilder() {
  const [formData, setFormData] = useState({})
  const [predictions, setPredictions] = useState([])
  const [outcomes, setOutcomes] = useState([])

  return (
    <div className="space-y-6">
      <IntentPredictorField
        onIntentChange={(intent) => setPredictions(intent.predictions)}
        placeholder="What would you like to build?"
      />
      <div className="grid gap-4">
        <SmartPlaceholder
          type="text"
          predictions={predictions}
          onValueChange={(value) => setFormData(prev => ({ ...prev, name: value }))}
        />
        <SmartPlaceholder
          type="email"
          predictions={predictions}
          onValueChange={(value) => setFormData(prev => ({ ...prev, email: value }))}
        />
      </div>
      <OutcomeSimulator
        data={formData}
        onOutcomesChange={setOutcomes}
      />
    </div>
  )
}
```
## Meta Component Monitor

Self-monitoring components that track their own performance and usage.

**Difficulty:** Expert · **Time:** 45 min
**Components:** UXOracle, ComponentArchaeologist, AccessibilityOracle
Features:
- Performance monitoring
- Usage tracking
- Accessibility scanning
- Self-optimization
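The example only keeps the most recent render duration, but monitoring usually wants aggregates over a batch of measurements. The pure helper below sketches that summarization step for durations (in milliseconds) such as those a `PerformanceObserver` collects; the function name and return shape are illustrative, not part of the library.

```typescript
// Summarize a batch of render durations (ms), e.g. collected from
// PerformanceObserver "measure" entries as in the example below.
function summarizeDurations(durations: number[]): { mean: number; max: number } {
  if (durations.length === 0) return { mean: 0, max: 0 };
  const mean = durations.reduce((acc, d) => acc + d, 0) / durations.length;
  return { mean, max: Math.max(...durations) };
}
```

Keeping aggregation out of the observer callback also makes it trivial to unit-test, since no browser Performance API is involved.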
### Code Example

```tsx
import {
  UXOracle,
  ComponentArchaeologist,
  AccessibilityOracle
} from "@/components/ui"
import { useState, useEffect } from "react"

export function MetaComponentMonitor() {
  const [uxMetrics, setUxMetrics] = useState({})
  const [componentHistory, setComponentHistory] = useState([])
  const [a11yIssues, setA11yIssues] = useState([])

  useEffect(() => {
    // Monitor component render performance via the Performance API
    const observer = new PerformanceObserver((list) => {
      setUxMetrics(prev => ({
        ...prev,
        renderTime: list.getEntries()[0]?.duration || 0
      }))
    })
    observer.observe({ entryTypes: ["measure"] })
    return () => observer.disconnect()
  }, [])

  return (
    <div className="space-y-6">
      <UXOracle
        metrics={uxMetrics}
        onRecommendations={(recs) => console.log(recs)}
      />
      <ComponentArchaeologist
        onHistoryChange={setComponentHistory}
        maxHistory={50}
      />
      <AccessibilityOracle
        onIssuesFound={setA11yIssues}
        autoScan={true}
      />
    </div>
  )
}
```

## Ready to Build Your Own?
Use these examples as inspiration and starting points for your own consciousness interfaces.