feat: timer engine + full-screen timer UI
Implement the core timer feature following the src/features/ architecture:

- useTimerEngine hook: drift-free Date.now() delta countdown (100ms tick), explicit state machine (IDLE → GET_READY → WORK → REST → COMPLETE), event emitter for external consumers (PHASE_CHANGED, ROUND_COMPLETED, COUNTDOWN_TICK, SESSION_COMPLETE), auto-pause on AppState interruption (phone calls, backgrounding), expo-keep-awake during the session
- TimerDisplay component: full-screen animated UI with 600ms color transitions between phases, pulse animation on countdown, red flash on the last 3 seconds, round progress dots, IDLE/active/COMPLETE views
- TimerControls component: stop/pause-resume/skip buttons with Ionicons
- Timer route (app/timer.tsx): fullScreenModal wiring hook → display
- Home screen: dark theme with START button navigating to /timer
- Project docs: CLAUDE.md (constitution), PRD v1.1, skill files
- Shared constants: PHASE_COLORS, TIMER_DEFAULTS, formatTime utility
- Types: TimerPhase, TimerState, TimerConfig, TimerActions, TimerEvent

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
.claude/skills/audio/SKILL.md (new file, 369 lines)
# Skill — Audio Engine (useAudioEngine)

> Read this skill BEFORE implementing anything audio-related.

## This module's responsibility

The audio engine manages 4 independent sound layers that fire in
response to timer events. It does not know about the timer
directly — it subscribes to its events via useTimerSync.

**Primary goal: zero audio latency on phase transitions.**
Phase sounds must play within 50ms of the event.
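The 50ms budget can be verified during development with a small timing wrapper. This is a sketch only — `withLatencyBudget` is a hypothetical dev helper, not part of the engine's API:

```typescript
// Hypothetical dev-only helper: wraps an async "play" call, warns when it
// exceeds the latency budget, and returns the measured duration in ms.
export function withLatencyBudget<A extends unknown[]>(
  label: string,
  budgetMs: number,
  play: (...args: A) => Promise<void>,
): (...args: A) => Promise<number> {
  return async (...args: A) => {
    const start = Date.now()
    await play(...args)
    const elapsedMs = Date.now() - start
    if (elapsedMs > budgetMs) {
      console.warn(`[audio] ${label}: ${elapsedMs}ms exceeds ${budgetMs}ms budget`)
    }
    return elapsedMs
  }
}
```

Wrapping `playPhaseSound` with it in development builds surfaces slow asset loads before they reach users.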
---

## Audio module architecture

```
src/features/audio/
  types.ts
  hooks/
    useAudioEngine.ts        ← central expo-av engine
    useAudioSettings.ts      ← persisted user preferences
  data/
    tracks.ts                ← music catalog (offline, local paths)
    sounds.ts                ← phase signal catalog
  components/
    MusicPicker.tsx          ← music ambiance picker
    SoundPicker.tsx          ← phase signal picker
    AudioSettingsPanel.tsx   ← full settings panel
  index.ts
```

---
## Types

```typescript
// src/features/audio/types.ts

export type MusicAmbiance = 'ELECTRO' | 'HIP_HOP' | 'ROCK' | 'SILENCE'
export type MusicIntensity = 'LOW' | 'MEDIUM' | 'HIGH'
export type PhaseSound = 'beep' | 'whistle' | 'voice_go' | 'air_horn' | 'bell' | 'silence'

export interface AudioTrack {
  id: string
  ambiance: MusicAmbiance
  intensity: MusicIntensity
  asset: number // require('../assets/audio/...') — offline
  bpm: number
  durationMs: number
}

export interface AudioSettings {
  musicEnabled: boolean
  ambiance: MusicAmbiance
  musicVolume: number // 0.0 → 1.0
  soundsEnabled: boolean
  soundsVolume: number // 0.0 → 1.0
  hapticsEnabled: boolean
  voiceEnabled: boolean // voice announcements
  overrideMuteSwitch: boolean // iOS: play even when the mute switch is on
  workPhaseSound: PhaseSound
  restPhaseSound: PhaseSound
  countdownSound: PhaseSound
  completionSound: PhaseSound
}

export interface AudioState {
  isLoaded: boolean
  isMusicPlaying: boolean
  currentAmbiance: MusicAmbiance
  currentIntensity: MusicIntensity
  error: string | null
}

export interface AudioActions {
  preloadAll: () => Promise<void>
  startMusic: (ambiance: MusicAmbiance, intensity: MusicIntensity) => Promise<void>
  switchIntensity: (intensity: MusicIntensity, fadeMs?: number) => Promise<void>
  stopMusic: (fadeMs?: number) => Promise<void>
  playPhaseSound: (sound: PhaseSound) => Promise<void>
  playCountdown: (seconds: number) => Promise<void> // 3, 2, 1
  unloadAll: () => Promise<void>
}
```
---

## Implementation — useAudioEngine

### Configuring the iOS/Android audio session

**Critical: do this first, before loading any sound.**

```typescript
import { Audio, InterruptionModeIOS, InterruptionModeAndroid } from 'expo-av'

// Call exactly once at app startup (in _layout.tsx)
export async function configureAudioSession(): Promise<void> {
  await Audio.setAudioModeAsync({
    // iOS: play on top of the user's own music
    playsInSilentModeIOS: true, // ignores the mute switch (if overrideMuteSwitch)
    allowsRecordingIOS: false,
    staysActiveInBackground: true, // CRITICAL — the timer keeps running in the background

    // Android
    shouldDuckAndroid: true, // lower other apps' volume during our signals
    playThroughEarpieceAndroid: false,
    interruptionModeIOS: InterruptionModeIOS.DoNotMix, // our sounds take priority over signals
    interruptionModeAndroid: InterruptionModeAndroid.DoNotMix,
  })
}
```
### Preloading — load BEFORE the session

All sounds must be loaded into memory BEFORE the session starts.
Loading during the session means unacceptable latency.

```typescript
const soundRefs = useRef<Record<string, Audio.Sound>>({})

async function preloadAll(settings: AudioSettings): Promise<void> {
  // 1. Load the music track matching the ambiance, in both intensities
  const trackWork = getTrack(settings.ambiance, 'HIGH')
  const trackRest = getTrack(settings.ambiance, 'LOW')

  soundRefs.current['music_work'] = await loadSound(trackWork.asset)
  soundRefs.current['music_rest'] = await loadSound(trackRest.asset)

  // 2. Load every phase signal
  const soundAssets = getSoundAssets(settings)
  for (const [key, asset] of Object.entries(soundAssets)) {
    soundRefs.current[key] = await loadSound(asset)
  }

  // 3. Loop the music tracks
  await soundRefs.current['music_work'].setIsLoopingAsync(true)
  await soundRefs.current['music_rest'].setIsLoopingAsync(true)
}

async function loadSound(asset: number): Promise<Audio.Sound> {
  const { sound } = await Audio.Sound.createAsync(asset, {
    shouldPlay: false,
    volume: 1.0,
  })
  return sound
}
```
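`getSoundAssets` is referenced above but not shown. A plausible sketch follows — the key names (`work_start`, `rest_start`, `completion`) and the injected `catalog` parameter are assumptions for illustration, not the real contract:

```typescript
type PhaseSound = 'beep' | 'whistle' | 'voice_go' | 'air_horn' | 'bell' | 'silence'

interface PhaseSoundSettings {
  workPhaseSound: PhaseSound
  restPhaseSound: PhaseSound
  completionSound: PhaseSound
}

// Pure mapping from settings to the assets preloadAll should load.
// The catalog (e.g. PHASE_SOUNDS) is injected to keep the function testable.
export function getSoundAssets(
  settings: PhaseSoundSettings,
  catalog: Record<string, number>,
): Record<string, number> {
  const assets: Record<string, number> = {}
  const picks: Array<[string, PhaseSound]> = [
    ['work_start', settings.workPhaseSound],
    ['rest_start', settings.restPhaseSound],
    ['completion', settings.completionSound],
  ]
  for (const [key, sound] of picks) {
    // 'silence' has no asset — nothing to preload for it
    if (sound !== 'silence' && catalog[sound] !== undefined) {
      assets[key] = catalog[sound]
    }
  }
  // Countdown voices are always preloaded under fixed keys
  for (const key of ['countdown_3', 'countdown_2', 'countdown_1']) {
    if (catalog[key] !== undefined) assets[key] = catalog[key]
  }
  return assets
}
```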
### Crossfading between phases

```typescript
async function switchIntensity(
  to: MusicIntensity,
  fadeMs: number = 500
): Promise<void> {
  const from = currentIntensityRef.current
  if (from === to) return

  const outKey = from === 'HIGH' ? 'music_work' : 'music_rest'
  const inKey = to === 'HIGH' ? 'music_work' : 'music_rest'

  const outSound = soundRefs.current[outKey]
  const inSound = soundRefs.current[inKey]

  if (!outSound || !inSound) return

  // Start the incoming track from the beginning (or from the current position?)
  await inSound.setPositionAsync(0)
  await inSound.setVolumeAsync(0)
  await inSound.playAsync()

  // Simultaneous fade
  const steps = 10
  const stepMs = fadeMs / steps
  const volumeStep = settings.musicVolume / steps

  for (let i = 0; i <= steps; i++) {
    await Promise.all([
      outSound.setVolumeAsync(Math.max(0, settings.musicVolume - i * volumeStep)),
      inSound.setVolumeAsync(Math.min(settings.musicVolume, i * volumeStep)),
    ])
    await delay(stepMs)
  }

  await outSound.stopAsync()
  currentIntensityRef.current = to
}

const delay = (ms: number) => new Promise(resolve => setTimeout(resolve, ms))
```
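One caveat with the linear ramp above: the combined level can dip in perceived loudness halfway through the fade. An equal-power curve is a common alternative; this is a standalone sketch (multiply each gain by `settings.musicVolume` before passing it to `setVolumeAsync`):

```typescript
// Equal-power crossfade gains: at every progress point the two gains
// satisfy out² + in² = 1, so total perceived power stays roughly constant.
export function equalPowerGains(progress: number): { out: number; in: number } {
  const p = Math.min(1, Math.max(0, progress)) // clamp to [0, 1]
  return {
    out: Math.cos((p * Math.PI) / 2),
    in: Math.sin((p * Math.PI) / 2),
  }
}
```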
### Responding to timer events

```typescript
// In useTimerSync — the bridge between timer and audio

import { useEffect } from 'react'
import { useTimerEngine, type TimerPhase } from '../timer'
import { useAudioEngine, useAudioSettings } from '../audio'

export function useTimerSync() {
  const timer = useTimerEngine()
  const audio = useAudioEngine()
  const { settings } = useAudioSettings() // persisted AudioSettings (return shape assumed)

  useEffect(() => {
    const unsubscribe = timer.addEventListener((event) => {
      switch (event.type) {
        case 'PHASE_CHANGED':
          handlePhaseChange(event.from, event.to)
          break
        case 'COUNTDOWN_TICK':
          audio.playCountdown(event.secondsLeft)
          break
        case 'SESSION_COMPLETE':
          audio.playPhaseSound(settings.completionSound)
          audio.stopMusic(1000)
          break
      }
    })
    return unsubscribe
  }, [timer, audio, settings]) // resubscribe so the handler never sees stale settings

  async function handlePhaseChange(from: TimerPhase, to: TimerPhase) {
    switch (to) {
      case 'GET_READY':
        await audio.startMusic(settings.ambiance, 'MEDIUM')
        break
      case 'WORK':
        await audio.playPhaseSound(settings.workPhaseSound)
        await audio.switchIntensity('HIGH', 500)
        break
      case 'REST':
        await audio.playPhaseSound(settings.restPhaseSound)
        await audio.switchIntensity('LOW', 500)
        break
    }
  }
}
```
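The bridge above depends on the `TimerEvent` shape from the timer feature. A discriminated union consistent with the cases consumed here would look like this — the `ROUND_COMPLETED` payload field is an assumption, since this skill never reads it:

```typescript
type TimerPhase = 'IDLE' | 'GET_READY' | 'WORK' | 'REST' | 'COMPLETE'

type TimerEvent =
  | { type: 'PHASE_CHANGED'; from: TimerPhase; to: TimerPhase }
  | { type: 'ROUND_COMPLETED'; round: number } // payload field assumed
  | { type: 'COUNTDOWN_TICK'; secondsLeft: number }
  | { type: 'SESSION_COMPLETE' }

// An exhaustive switch keeps the audio bridge honest when new events appear:
// TypeScript flags any unhandled variant at compile time.
export function describeEvent(event: TimerEvent): string {
  switch (event.type) {
    case 'PHASE_CHANGED':
      return `phase ${event.from} -> ${event.to}`
    case 'ROUND_COMPLETED':
      return `round ${event.round} completed`
    case 'COUNTDOWN_TICK':
      return `countdown ${event.secondsLeft}`
    case 'SESSION_COMPLETE':
      return 'session complete'
  }
}
```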
---

## Track catalog (offline)

```typescript
// src/features/audio/data/tracks.ts
import { MusicAmbiance, MusicIntensity, AudioTrack } from '../types'

export const TRACKS: AudioTrack[] = [
  {
    id: 'electro_high',
    ambiance: 'ELECTRO',
    intensity: 'HIGH',
    asset: require('../../../assets/audio/music/electro_high.mp3'),
    bpm: 140,
    durationMs: 180000,
  },
  {
    id: 'electro_low',
    ambiance: 'ELECTRO',
    intensity: 'LOW',
    asset: require('../../../assets/audio/music/electro_low.mp3'),
    bpm: 90,
    durationMs: 180000,
  },
  // ... (9 tracks in total)
]

export function getTrack(ambiance: MusicAmbiance, intensity: MusicIntensity): AudioTrack {
  const track = TRACKS.find(t => t.ambiance === ambiance && t.intensity === intensity)
  if (!track) throw new Error(`Track not found: ${ambiance} ${intensity}`)
  return track
}
```
## Phase sound catalog

```typescript
// src/features/audio/data/sounds.ts
export const PHASE_SOUNDS = {
  beep: require('../../../assets/audio/sounds/beep_long.mp3'),
  whistle: require('../../../assets/audio/sounds/whistle.mp3'),
  voice_go: require('../../../assets/audio/sounds/voice_go.mp3'),
  air_horn: require('../../../assets/audio/sounds/air_horn.mp3'),
  bell: require('../../../assets/audio/sounds/bell.mp3'),
  double_beep: require('../../../assets/audio/sounds/beep_double.mp3'),
  countdown_3: require('../../../assets/audio/sounds/count_3.mp3'),
  countdown_2: require('../../../assets/audio/sounds/count_2.mp3'),
  countdown_1: require('../../../assets/audio/sounds/count_1.mp3'),
  fanfare: require('../../../assets/audio/sounds/fanfare.mp3'),
  clap: require('../../../assets/audio/sounds/applause.mp3'),
  // note: the 'silence' option has no asset entry
}
```
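`playCountdown` (declared in `AudioActions`) can map the remaining seconds to the `countdown_*` keys above. A sketch — the structural `replayAsync` typing is an illustration choice that keeps it testable without expo-av:

```typescript
// Maps remaining seconds to a preloaded sound key; only 3, 2, 1 have voice files.
export function countdownKey(
  seconds: number,
): 'countdown_3' | 'countdown_2' | 'countdown_1' | null {
  if (seconds === 3) return 'countdown_3'
  if (seconds === 2) return 'countdown_2'
  if (seconds === 1) return 'countdown_1'
  return null
}

// Replays the matching preloaded sound, silently ignoring other tick values.
export async function playCountdown(
  seconds: number,
  refs: Record<string, { replayAsync: () => Promise<unknown> }>,
): Promise<void> {
  const key = countdownKey(seconds)
  if (key && refs[key]) await refs[key].replayAsync()
}
```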
---

## Handling the iOS silent switch

```typescript
// If overrideMuteSwitch = true in the settings
await Audio.setAudioModeAsync({
  playsInSilentModeIOS: settings.overrideMuteSwitch,
})
```

**UX rule:** Phase signals have the override enabled by default
(the user needs to know when to switch exercises even in silent mode).
Ambient music respects the silent switch by default. Note that
expo-av's audio mode applies to the whole session, so this split is
enforced at the product level rather than per sound.
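Since `Audio.setAudioModeAsync` configures the whole session, deriving the full mode object from settings in one pure function keeps the decision testable. A sketch — `buildAudioMode` is a hypothetical helper mirroring `configureAudioSession`:

```typescript
interface MuteSwitchSettings {
  overrideMuteSwitch: boolean
}

// Pure mapping from user settings to the audio-mode object, so the
// logic can be unit-tested without touching native modules.
export function buildAudioMode(settings: MuteSwitchSettings) {
  return {
    playsInSilentModeIOS: settings.overrideMuteSwitch,
    allowsRecordingIOS: false,
    staysActiveInBackground: true,
    shouldDuckAndroid: true,
    playThroughEarpieceAndroid: false,
  }
}
```

Then `await Audio.setAudioModeAsync(buildAudioMode(settings))` whenever the setting changes.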
---

## Mandatory cleanup

```typescript
// In useAudioEngine — cleanup on unmount or at session end
async function unloadAll(): Promise<void> {
  await Promise.all(
    Object.values(soundRefs.current).map(sound => sound.unloadAsync())
  )
  soundRefs.current = {}
}

useEffect(() => {
  return () => {
    // Fire-and-forget cleanup on unmount (effect cleanups cannot await)
    Object.values(soundRefs.current).forEach(sound => {
      sound.unloadAsync().catch(console.error)
    })
  }
}, [])
```
---

## Minimal tests

```typescript
describe('useAudioEngine', () => {
  it('configures the audio session on mount', () => {})
  it('preloadAll loads every sound without errors', () => {})
  it('playPhaseSound plays the right sound within 50ms', () => {})
  it('switchIntensity performs a crossfade', () => {})
  it('stopMusic fades out', () => {})
  it('unloadAll releases every resource', () => {})
  it('does not crash when an asset is missing (fallback)', () => {})
})
```
---

## Classic mistakes to avoid

```typescript
// ❌ Loading sounds on demand → latency
case 'PHASE_CHANGED':
  const { sound } = await Audio.Sound.createAsync(asset) // ← too slow!
  await sound.playAsync()

// ✅ Use the preloaded sounds
case 'PHASE_CHANGED':
  await soundRefs.current['work_start'].replayAsync() // ← instant

// ❌ Forgetting to configure the audio session → mute on iOS in the background
await sound.playAsync() // without Audio.setAudioModeAsync() → silent in the background

// ❌ No Android ducking strategy → our sounds blend with other apps'
interruptionModeAndroid: 2 // DUCK_OTHERS — fine for ambient music
interruptionModeAndroid: 1 // DO_NOT_MIX — for phase signals
```