# VoiceFirstInputBar Component
**Phase 3 - Voice Mode v4.1**
A unified voice-first input bar that prioritizes voice interaction while providing text fallback. This component is the primary input interface for voice mode.
## Overview
```
+------------------------------------------------------------------+
| [PHI] |    [Voice prompt / Text input area]    | [Mic] |  [Kbd]  |
+------------------------------------------------------------------+
                                                     |        |
                                                     v        v
                                              VAD-powered  Push-to-talk
                                            auto-detection  or always-on
```
## Features
- **Voice-First Design**: Prominent microphone button with visual feedback
- **VAD Preset Integration**: Respects sensitivity settings from voiceSettingsStore
- **RTL Support**: Full bidirectional layout with auto-detection
- **PHI Mode Indicator**: Visual indicator of current PHI routing status
- **Text Fallback**: Expandable text input for hybrid interaction
- **Keyboard Shortcuts**: Space to talk, Escape to cancel
## Usage
```tsx
import { VoiceFirstInputBar } from "@/components/voice/VoiceFirstInputBar";

function VoiceChat() {
  const handleSubmit = (input: string, isVoice: boolean) => {
    console.log(`Received ${isVoice ? "voice" : "text"} input:`, input);
  };

  return <VoiceFirstInputBar onSubmit={handleSubmit} />;
}
```
## Props
| Prop | Type | Default | Description |
| --------------------- | ------------------------------------------- | ------------------ | --------------------------------------- |
| `onSubmit` | `(input: string, isVoice: boolean) => void` | required | Callback when input is submitted |
| `onRecordingStart` | `() => void` | - | Called when recording begins |
| `onRecordingStop` | `() => void` | - | Called when recording ends |
| `phiMode` | `"local" \| "hybrid" \| "cloud"` | `"cloud"` | Current PHI routing mode |
| `phiScore` | `number` | `0` | PHI probability score (0-1) |
| `isAssistantSpeaking` | `boolean` | `false` | Whether assistant is currently speaking |
| `disabled` | `boolean` | `false` | Disable all input |
| `detectedLanguage` | `string` | - | Language code for RTL detection |
| `placeholder` | `string` | `"Press space..."` | Placeholder text |
| `className` | `string` | - | Additional CSS classes |
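
Reading the table as a type, the prop surface looks roughly like the following. This is a sketch reconstructed from the table above, not the component's published declaration:

```tsx
// Sketch of the props implied by the table; naming and optionality may differ
// from the actual exported interface.
export interface VoiceFirstInputBarProps {
  /** Callback when input is submitted (voice transcript or typed text). */
  onSubmit: (input: string, isVoice: boolean) => void;
  /** Called when recording begins. */
  onRecordingStart?: () => void;
  /** Called when recording ends. */
  onRecordingStop?: () => void;
  /** Current PHI routing mode. Defaults to "cloud". */
  phiMode?: "local" | "hybrid" | "cloud";
  /** PHI probability score in the range 0-1. Defaults to 0. */
  phiScore?: number;
  /** Whether the assistant is currently speaking. Defaults to false. */
  isAssistantSpeaking?: boolean;
  /** Disable all input. Defaults to false. */
  disabled?: boolean;
  /** Language code used for RTL detection (e.g. "ar", "he"). */
  detectedLanguage?: string;
  /** Placeholder text. Defaults to "Press space...". */
  placeholder?: string;
  /** Additional CSS classes for the container. */
  className?: string;
}
```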
## Keyboard Shortcuts
| Key | Action | Mode |
| ----------------- | -------------------- | ----------------------------- |
| `Space` | Start recording | Idle state, not in text input |
| `Space` (release) | Stop recording | Push-to-talk mode |
| `Escape` | Cancel recording | Recording state |
| `Tab` | Switch to text input | Focused on mic button |
| `Enter` | Submit text | Text input mode |
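
A minimal sketch of how these bindings could be wired up; `startRecording`, `stopRecording`, and `cancelRecording` are hypothetical internal handlers used only for illustration, and Tab/Enter are left to native focus and form handling:

```tsx
import { useEffect } from "react";

// Hypothetical internal handlers; names are illustrative only.
declare function startRecording(): void;
declare function stopRecording(): void;
declare function cancelRecording(): void;

function useVoiceShortcuts(
  state: "idle" | "listening",
  isTextInputFocused: boolean
) {
  useEffect(() => {
    const onKeyDown = (e: KeyboardEvent) => {
      // Space starts recording only when idle and not typing in the text field.
      if (e.code === "Space" && state === "idle" && !isTextInputFocused) {
        e.preventDefault();
        startRecording();
      }
      // Escape cancels an in-progress recording.
      if (e.code === "Escape" && state === "listening") {
        cancelRecording();
      }
    };
    const onKeyUp = (e: KeyboardEvent) => {
      // In push-to-talk mode, releasing Space stops the recording.
      if (e.code === "Space" && state === "listening") {
        stopRecording();
      }
    };
    window.addEventListener("keydown", onKeyDown);
    window.addEventListener("keyup", onKeyUp);
    return () => {
      window.removeEventListener("keydown", onKeyDown);
      window.removeEventListener("keyup", onKeyUp);
    };
  }, [state, isTextInputFocused]);
}
```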
## VAD Preset Integration
The component reads VAD settings from `voiceSettingsStore`:
```tsx
// Settings automatically applied from store
const {
vadPreset, // "sensitive" | "balanced" | "relaxed" | "accessibility" | "custom"
vadCustomEnergyThresholdDb,
vadCustomSilenceDurationMs,
voiceModeType, // "always-on" | "push-to-talk"
rtlEnabled,
rtlAutoDetect,
} = useVoiceSettingsStore();
```
### VAD Preset Behavior
| Preset | Energy Threshold | Silence Duration | Best For |
| ------------- | ---------------- | ---------------- | ------------------------ |
| Sensitive | -45 dB | 300 ms | Quiet rooms, soft speech |
| Balanced | -35 dB | 500 ms | General use (default) |
| Relaxed | -25 dB | 800 ms | Noisy environments |
| Accessibility | -42 dB | 1000 ms | Speech impairments |
| Custom | User-defined | User-defined | Advanced users |
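
One plausible way to resolve a preset to concrete values, with `custom` deferring to the store's user-defined fields. The numbers come from the table above; the resolution helper itself is a sketch, not the store's actual code:

```tsx
type VadPreset = "sensitive" | "balanced" | "relaxed" | "accessibility" | "custom";

interface VadSettings {
  energyThresholdDb: number;
  silenceDurationMs: number;
}

// Values taken directly from the preset table above.
const VAD_PRESETS: Record<Exclude<VadPreset, "custom">, VadSettings> = {
  sensitive: { energyThresholdDb: -45, silenceDurationMs: 300 },
  balanced: { energyThresholdDb: -35, silenceDurationMs: 500 },
  relaxed: { energyThresholdDb: -25, silenceDurationMs: 800 },
  accessibility: { energyThresholdDb: -42, silenceDurationMs: 1000 },
};

// "custom" falls back to the user-defined values from voiceSettingsStore.
function resolveVadSettings(
  preset: VadPreset,
  customEnergyThresholdDb: number,
  customSilenceDurationMs: number
): VadSettings {
  if (preset === "custom") {
    return {
      energyThresholdDb: customEnergyThresholdDb,
      silenceDurationMs: customSilenceDurationMs,
    };
  }
  return VAD_PRESETS[preset];
}
```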
## RTL Support
The component supports RTL languages with automatic layout mirroring:
```tsx
// RTL auto-detection for Arabic, Hebrew, Farsi, Urdu
const RTL_LANGUAGES = ["ar", "he", "fa", "ur", "yi", "ps", "sd"];

// Manual override via the detectedLanguage prop
<VoiceFirstInputBar detectedLanguage="ar" onSubmit={handleSubmit} />;
```
### RTL Layout Changes
- Mic button moves to left side
- Text flows right-to-left
- Energy visualizer bars reverse order
- PHI indicator repositions appropriately
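
The mirroring above can be driven by a single direction value derived from the store flags and the detected language. The helper below is a sketch under that assumption, not the component's actual implementation:

```tsx
const RTL_LANGUAGES = ["ar", "he", "fa", "ur", "yi", "ps", "sd"];

// Sketch: derive the layout direction from settings plus the detected language.
function resolveDirection(
  rtlEnabled: boolean,
  rtlAutoDetect: boolean,
  detectedLanguage?: string
): "ltr" | "rtl" {
  if (rtlEnabled) return "rtl"; // manual override from voiceSettingsStore
  if (rtlAutoDetect && detectedLanguage) {
    // Compare against the base language code ("ar-SA" -> "ar").
    const base = detectedLanguage.split("-")[0].toLowerCase();
    if (RTL_LANGUAGES.includes(base)) return "rtl";
  }
  return "ltr";
}

// The resulting value is applied as dir={direction} on the container, which
// flips the mic button, text flow, and energy bars described above.
```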
## PHI Mode Indicator
The colored dot indicates current PHI routing:
| Color | Mode | Description |
| ------ | ------ | --------------------------------- |
| Green | LOCAL | On-device processing, most secure |
| Yellow | HYBRID | Cloud with PHI redaction |
| Blue | CLOUD | Standard cloud processing |
```tsx
// Example: indicate local-only routing with a high PHI score
<VoiceFirstInputBar phiMode="local" phiScore={0.92} onSubmit={handleSubmit} />;
```
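
The dot color maps one-to-one from the routing mode; a sketch of that mapping follows (the Tailwind class names are illustrative, not the component's actual styles):

```tsx
// Hypothetical mapping from PHI routing mode to the indicator dot color.
const PHI_INDICATOR_COLOR: Record<"local" | "hybrid" | "cloud", string> = {
  local: "bg-green-500", // on-device processing, most secure
  hybrid: "bg-yellow-500", // cloud with PHI redaction
  cloud: "bg-blue-500", // standard cloud processing
};
```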
## States
```
idle → listening → processing → idle
  ↓                               ↑
text-input ───────────────────────┘
```
| State | Visual | Behavior |
| ------------ | -------------------------- | ------------------- |
| `idle` | Placeholder text, blue mic | Ready for input |
| `listening` | Energy bars, red mic | Recording audio |
| `processing` | Spinner | Transcribing speech |
| `text-input` | Text field visible | Text input mode |
| `error` | Red border, error message | Error occurred |
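
The diagram and table boil down to a small state value. Below is a sketch of the states and the transitions shown above; the `error` transitions are assumed, since that state appears in the table but not the diagram:

```tsx
type InputBarState = "idle" | "listening" | "processing" | "text-input" | "error";

// Sketch of the legal transitions implied by the diagram above.
const TRANSITIONS: Record<InputBarState, InputBarState[]> = {
  idle: ["listening", "text-input"],
  listening: ["processing", "idle"], // Escape cancels back to idle
  processing: ["idle", "error"], // error transitions assumed
  "text-input": ["idle"],
  error: ["idle"],
};

function canTransition(from: InputBarState, to: InputBarState): boolean {
  return TRANSITIONS[from].includes(to);
}
```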
## Styling
The component uses Tailwind CSS with dark mode support:
```tsx
// Custom className for the container (example value)
<VoiceFirstInputBar className="my-custom-input-bar" onSubmit={handleSubmit} />;

// Dark mode is applied automatically; the component uses dark: variants for all colors.
```
## Integration with Voice Pipeline
```tsx
import { VoiceFirstInputBar } from "@/components/voice/VoiceFirstInputBar";
import { useVoicePipeline } from "@/hooks/useVoicePipeline";

function IntegratedVoiceChat() {
  const { sendMessage, isAssistantSpeaking, phiState } = useVoicePipeline();

  return (
    <VoiceFirstInputBar
      onSubmit={(input, isVoice) => sendMessage(input, { isVoice })}
      isAssistantSpeaking={isAssistantSpeaking}
      phiMode={phiState.mode}
      phiScore={phiState.score}
    />
  );
}
```
## Accessibility
- ARIA labels for all interactive elements
- Keyboard navigation support
- Screen reader announcements for state changes
- Focus management during mode switches
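
Screen reader announcements for state changes can be implemented with a visually hidden live region. The component's actual markup may differ; this is a sketch of the pattern:

```tsx
// Sketch: announce state changes to screen readers via an aria-live region.
function StateAnnouncer({ state }: { state: string }) {
  const messages: Record<string, string> = {
    idle: "Ready for voice or text input",
    listening: "Recording. Release space or press escape to stop.",
    processing: "Transcribing speech",
    "text-input": "Text input mode",
    error: "An error occurred",
  };
  return (
    <div aria-live="polite" role="status" className="sr-only">
      {messages[state] ?? ""}
    </div>
  );
}
```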
## Related Documentation
- [Adaptive VAD Presets](./adaptive-vad-presets.md)
- [RTL Support Guide](./rtl-support-guide.md)
- [PHI-Aware STT Routing](./phi-aware-stt-routing.md)
- [Voice Mode v4 Overview](./voice-mode-v4-overview.md)
6:["slug","voice/voice-first-input-bar","c"]
0:["X7oMT3VrOffzp0qvbeOas",[[["",{"children":["docs",{"children":[["slug","voice/voice-first-input-bar","c"],{"children":["__PAGE__?{\"slug\":[\"voice\",\"voice-first-input-bar\"]}",{}]}]}]},"$undefined","$undefined",true],["",{"children":["docs",{"children":[["slug","voice/voice-first-input-bar","c"],{"children":["__PAGE__",{},[["$L1",["$","div",null,{"children":[["$","div",null,{"className":"mb-6 flex items-center justify-between gap-4","children":[["$","div",null,{"children":[["$","p",null,{"className":"text-sm text-gray-500 dark:text-gray-400","children":"Docs / Raw"}],["$","h1",null,{"className":"text-3xl font-bold text-gray-900 dark:text-white","children":"VoiceFirstInputBar Component"}],["$","p",null,{"className":"text-sm text-gray-600 dark:text-gray-400","children":["Sourced from"," ",["$","code",null,{"className":"font-mono text-xs","children":["docs/","voice/voice-first-input-bar.md"]}]]}]]}],["$","a",null,{"href":"https://github.com/mohammednazmy/VoiceAssist/edit/main/docs/voice/voice-first-input-bar.md","target":"_blank","rel":"noreferrer","className":"inline-flex items-center gap-2 rounded-md border border-gray-200 dark:border-gray-700 px-3 py-1.5 text-sm text-gray-700 dark:text-gray-200 hover:border-primary-500 dark:hover:border-primary-400 hover:text-primary-700 dark:hover:text-primary-300","children":"Edit on GitHub"}]]}],["$","div",null,{"className":"rounded-lg border border-gray-200 dark:border-gray-800 bg-white dark:bg-gray-900 p-6","children":["$","$L2",null,{"content":"$3"}]}],["$","div",null,{"className":"mt-6 flex flex-wrap gap-2 text-sm","children":[["$","$L4",null,{"href":"/reference/all-docs","className":"inline-flex items-center gap-1 rounded-md bg-gray-100 px-3 py-1 text-gray-700 hover:bg-gray-200 dark:bg-gray-800 dark:text-gray-200 dark:hover:bg-gray-700","children":"← All documentation"}],["$","$L4",null,{"href":"/","className":"inline-flex items-center gap-1 rounded-md bg-gray-100 px-3 py-1 text-gray-700 hover:bg-gray-200 dark:bg-gray-800 dark:text-gray-200 dark:hover:bg-gray-700","children":"Home"}]]}]]}],null],null],null]},[null,["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","docs","children","$6","children"],"error":"$undefined","errorStyles":"$undefined","errorScripts":"$undefined","template":["$","$L7",null,{}],"templateStyles":"$undefined","templateScripts":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined"}]],null]},[null,["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","docs","children"],"error":"$undefined","errorStyles":"$undefined","errorScripts":"$undefined","template":["$","$L7",null,{}],"templateStyles":"$undefined","templateScripts":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined"}]],null]},[[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/7f586cdbbaa33ff7.css","precedence":"next","crossOrigin":"$undefined"}]],["$","html",null,{"lang":"en","className":"h-full","children":["$","body",null,{"className":"__className_f367f3 h-full bg-white dark:bg-gray-900","children":[["$","a",null,{"href":"#main-content","className":"skip-to-content","children":"Skip to main content"}],["$","$L8",null,{"children":[["$","$L9",null,{}],["$","$La",null,{}],["$","main",null,{"id":"main-content","className":"lg:pl-64","role":"main","aria-label":"Documentation 
content","children":["$","$Lb",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","errorScripts":"$undefined","template":["$","$L7",null,{}],"templateStyles":"$undefined","templateScripts":"$undefined","notFound":[["$","title",null,{"children":"404: This page could not be found."}],["$","div",null,{"style":{"fontFamily":"system-ui,\"Segoe UI\",Roboto,Helvetica,Arial,sans-serif,\"Apple Color Emoji\",\"Segoe UI Emoji\"","height":"100vh","textAlign":"center","display":"flex","flexDirection":"column","alignItems":"center","justifyContent":"center"},"children":["$","div",null,{"children":[["$","style",null,{"dangerouslySetInnerHTML":{"__html":"body{color:#000;background:#fff;margin:0}.next-error-h1{border-right:1px solid rgba(0,0,0,.3)}@media (prefers-color-scheme:dark){body{color:#fff;background:#000}.next-error-h1{border-right:1px solid rgba(255,255,255,.3)}}"}}],["$","h1",null,{"className":"next-error-h1","style":{"display":"inline-block","margin":"0 20px 0 0","padding":"0 23px 0 0","fontSize":24,"fontWeight":500,"verticalAlign":"top","lineHeight":"49px"},"children":"404"}],["$","div",null,{"style":{"display":"inline-block"},"children":["$","h2",null,{"style":{"fontSize":14,"fontWeight":400,"lineHeight":"49px","margin":0},"children":"This page could not be found."}]}]]}]}]],"notFoundStyles":[]}]}]}]]}]]}]}]],null],null],["$Lc",null]]]]
c:[["$","meta","0",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","meta","1",{"charSet":"utf-8"}],["$","title","2",{"children":"VoiceFirstInputBar Component | Docs | VoiceAssist Docs"}],["$","meta","3",{"name":"description","content":"Primary voice-first input component with text fallback for Voice Mode v4.1."}],["$","meta","4",{"name":"keywords","content":"VoiceAssist,documentation,medical AI,voice assistant,healthcare,HIPAA,API"}],["$","meta","5",{"name":"robots","content":"index, follow"}],["$","meta","6",{"name":"googlebot","content":"index, follow"}],["$","link","7",{"rel":"canonical","href":"https://assistdocs.asimo.io"}],["$","meta","8",{"property":"og:title","content":"VoiceAssist Documentation"}],["$","meta","9",{"property":"og:description","content":"Comprehensive documentation for VoiceAssist - Enterprise Medical AI Assistant"}],["$","meta","10",{"property":"og:url","content":"https://assistdocs.asimo.io"}],["$","meta","11",{"property":"og:site_name","content":"VoiceAssist Docs"}],["$","meta","12",{"property":"og:type","content":"website"}],["$","meta","13",{"name":"twitter:card","content":"summary"}],["$","meta","14",{"name":"twitter:title","content":"VoiceAssist Documentation"}],["$","meta","15",{"name":"twitter:description","content":"Comprehensive documentation for VoiceAssist - Enterprise Medical AI Assistant"}],["$","meta","16",{"name":"next-size-adjust"}]]
1:null