HarmonyOS 6 (API 23) in Practice: Building an "AR Remote Collaboration Whiteboard", an Immersive PC Meeting System with Face AR Expression Feedback and Body AR Gesture Annotation

1. Introduction: When Remote Meetings Meet Spatial Interaction
In the post-pandemic era, remote collaboration has become the norm, but traditional video conferencing suffers from a lack of interactivity: participants passively watch a screen and cannot, as in an in-person meeting, point at key content with a gesture or convey an attitude with an expression. The Face AR and Body AR capabilities introduced in HarmonyOS 6 (API 23) let a PC act as a "spatial collaboration terminal": the presenter raises a hand to circle key points on the whiteboard, a listener's frown automatically posts a "question" marker, nods are tallied in real time as "agreement" votes, and immersive lighting sets the meeting-room mood, bringing remote collaboration back to the natural feel of a face-to-face meeting.
This article builds an "AR remote collaboration whiteboard" application for HarmonyOS PCs. Its core innovations:
- Face AR expression feedback: captures micro-expressions such as nodding (agree), head-shaking (disagree), frowning (question), and raised brows (surprise) in real time and turns them into meeting-mood data
- Body AR gesture annotation: point at whiteboard coordinates with one hand, lasso-select a region with a two-hand pinch, and swipe to flip pages, all without touching anything
- Immersive ambient light: UI lighting follows the meeting's emotional temperature (the agree/disagree ratio): green for consensus, amber for dispute, red for deadlock
- Floating navigation control: a bottom floating panel shows participant emotion statistics and gesture-mode switching, with adjustable transparency so the whiteboard stays visible
2. System Architecture Design
2.1 Spatial Collaboration Architecture
┌─────────────────────────────────────────────────────────────┐
│           Spatial Perception Layer (AR Engine 6.1.0)        │
│ ┌─────────────────────┐  ┌─────────────────────────────┐    │
│ │ Face AR module      │  │ Body AR module              │    │
│ │ · 68-point face mesh│  │ · 20+ skeletal keypoints    │    │
│ │ · 64 BlendShapes    │  │ · 6 hand-gesture states     │    │
│ │ · head-pose tracking│  │ · 3D position tracking      │    │
│ └──────────┬──────────┘  └──────────────┬──────────────┘    │
└────────────┼────────────────────────────┼───────────────────┘
             │                            │
             ▼                            ▼
┌─────────────────────────────────────────────────────────────┐
│                Semantic Mapping Engine (ArkTS)              │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ Expression-to-command mapping:                          │ │
│ │ · nod (headPitch > 0.3)           → agree()             │ │
│ │ · head shake (headYaw > 0.4)      → disagree()          │ │
│ │ · frown (browDown > 0.5)          → question()          │ │
│ │ · raised brow (browInnerUp > 0.6) → surprise()          │ │
│ │ · smile (mouthSmile > 0.5)        → like()              │ │
│ └─────────────────────────────────────────────────────────┘ │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ Gesture-to-command mapping:                             │ │
│ │ · index-finger point (indexTip position) → cursorMove() │ │
│ │ · two-hand pinch (handDistance < 0.15)   → selectArea() │ │
│ │ · one-hand swipe (wristVelocity > 0.5)   → flipPage()   │ │
│ │ · two hands apart (handDistance > 0.4)   → zoomIn()     │ │
│ │ · fist (fingerCurl > 0.7)                → drawMode()   │ │
│ └─────────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────┘
                            │
                            ▼
┌─────────────────────────────────────────────────────────────┐
│       Collaboration Data Layer (distributed soft bus)       │
│ · emotion broadcast: local expressions → remote mood panels │
│ · gesture sync: local gestures → remote annotation overlay  │
│ · whiteboard sync: local strokes → remote real-time render  │
└─────────────────────────────────────────────────────────────┘
                            │
                            ▼
┌─────────────────────────────────────────────────────────────┐
│          Immersive Interaction Layer (ArkUI + HDS)          │
│ ┌─────────────────────────┐ ┌─────────────────────────────┐ │
│ │ Emotion-light title bar │ │ Floating collab panel       │ │
│ │ · mood heat color map   │ │ · participant mood stats    │ │
│ │ · consensus light cue   │ │ · gesture-mode switching    │ │
│ │ · speaker highlight glow│ │ · whiteboard toolbar        │ │
│ └─────────────────────────┘ └─────────────────────────────┘ │
└─────────────────────────────────────────────────────────────┘
2.2 Meeting-Emotion Light Mapping
| Meeting state | Emotion metric | Light color | Title-bar ambience | Trigger condition |
|---|---|---|---|---|
| Strong consensus | agree > 80% | emerald #00D4AA | steady breathing glow | unanimous approval |
| Majority agreement | agree > 60% | teal #4ECDC4 | soft pulse | proposal passes |
| Contested | disagree > 30% | amber #FFD700 | crescendo blink | opinions diverge |
| Serious split | disagree > 50% | red #FF6B6B | rapid warning flash | mediation needed |
| Concentrated questions | question > 40% | violet #9B59B6 | breathing gradient | explanation needed |
| Surprise feedback | surprise > 20% | bright yellow #FFE66D | pulse burst | highlight discovered |
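The state table above can be collapsed into a pure mapping function. A minimal sketch follows (the `mapMeetingMood` name and the priority order of the branches are assumptions made for illustration; the article's engine derives the same colors from `getConsensusLevel()` and the dominant emotion):

```typescript
// Hypothetical pure mapping from emotion ratios (0-1) to the light effect table above.
type MoodLight = { color: string; effect: string };

function mapMeetingMood(agree: number, disagree: number, question: number, surprise: number): MoodLight {
  // Order matters: strong dissent and concentrated questions take priority over mild agreement.
  if (disagree > 0.5) return { color: '#FF6B6B', effect: 'rapid-warning-flash' };
  if (question > 0.4) return { color: '#9B59B6', effect: 'breathing-gradient' };
  if (disagree > 0.3) return { color: '#FFD700', effect: 'crescendo-blink' };
  if (surprise > 0.2) return { color: '#FFE66D', effect: 'pulse-burst' };
  if (agree > 0.8) return { color: '#00D4AA', effect: 'steady-breathing' };
  if (agree > 0.6) return { color: '#4ECDC4', effect: 'soft-pulse' };
  return { color: '#808080', effect: 'idle' };
}
```

Checking dissent and questions before agreement keeps a loud minority visible even when the majority agrees, which matches the table's intent.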
3. Environment Setup and Permission Declarations
3.1 Module Dependencies
```json
{
  "dependencies": {
    "@hms.core.ar.arengine": "^6.1.0",
    "@kit.UIDesignKit": "^6.0.0",
    "@kit.DistributedServiceKit": "^6.0.0",
    "@kit.MultimediaKit": "^6.0.0",
    "@kit.Graphics2DKit": "^6.0.0"
  }
}
```
3.2 Permission Declarations
```json
{
  "module": {
    "requestPermissions": [
      { "name": "ohos.permission.CAMERA" },
      { "name": "ohos.permission.INTERNET" },
      { "name": "ohos.permission.DISTRIBUTED_DATASYNC" },
      { "name": "ohos.permission.DISTRIBUTED_SOFTBUS_CENTER" }
    ]
  }
}
```
4. Core Code Walkthrough
4.1 Face AR Emotion Recognition Engine (EmotionRecognitionEngine.ets)
Highlights: fuses Face AR BlendShape parameters with head pose to recognize five meeting-emotion states and compute a confidence score for each.
```typescript
// entry/src/main/ets/engine/EmotionRecognitionEngine.ets
import { arEngine } from '@hms.core.ar.arengine';

export enum MeetingEmotion {
  AGREE = 'AGREE',       // agreement
  DISAGREE = 'DISAGREE', // disagreement
  QUESTION = 'QUESTION', // doubt
  SURPRISE = 'SURPRISE', // surprise
  LIKE = 'LIKE',         // liking
  NEUTRAL = 'NEUTRAL'    // neutral
}

export interface EmotionResult {
  emotion: MeetingEmotion;
  confidence: number; // 0-1
  intensity: number;  // 0-1
  timestamp: number;
}

export class EmotionRecognitionEngine {
  private static instance: EmotionRecognitionEngine;
  // Per-emotion BlendShape thresholds, head-pose thresholds, weights, and cooldowns
  private readonly EMOTION_CONFIG: Map<MeetingEmotion, {
    blendShapes: { [key: string]: number };
    headPose?: { pitch?: number; yaw?: number; roll?: number };
    weight: number;
    cooldown: number;
  }> = new Map([
    [MeetingEmotion.AGREE, {
      blendShapes: { mouthSmileLeft: 0.3, mouthSmileRight: 0.3 },
      headPose: { pitch: 0.25 },
      weight: 1.0,
      cooldown: 1500
    }],
    [MeetingEmotion.DISAGREE, {
      blendShapes: { mouthFrownLeft: 0.4, mouthFrownRight: 0.4 },
      headPose: { yaw: 0.35 },
      weight: 1.0,
      cooldown: 1500
    }],
    [MeetingEmotion.QUESTION, {
      blendShapes: { browDownLeft: 0.4, browDownRight: 0.4, mouthPucker: 0.3 },
      weight: 0.8,
      cooldown: 2000
    }],
    [MeetingEmotion.SURPRISE, {
      blendShapes: { browInnerUp: 0.6, eyeWideLeft: 0.5, eyeWideRight: 0.5 },
      weight: 0.9,
      cooldown: 2000
    }],
    [MeetingEmotion.LIKE, {
      blendShapes: { mouthSmileLeft: 0.5, mouthSmileRight: 0.5, cheekSquintLeft: 0.4 },
      weight: 0.7,
      cooldown: 1000
    }]
  ]);
  private lastEmotionTime: Map<MeetingEmotion, number> = new Map();
  private emotionHistory: EmotionResult[] = [];

  static getInstance(): EmotionRecognitionEngine {
    if (!EmotionRecognitionEngine.instance) {
      EmotionRecognitionEngine.instance = new EmotionRecognitionEngine();
    }
    return EmotionRecognitionEngine.instance;
  }

  /**
   * Recognize the current facial emotion.
   */
  recognizeEmotion(face: arEngine.ARFace): EmotionResult {
    const blendShapes = face.getBlendShapes();
    const headPose = face.getPose();
    const now = Date.now();
    if (!blendShapes) {
      return { emotion: MeetingEmotion.NEUTRAL, confidence: 0, intensity: 0, timestamp: now };
    }
    let bestMatch: { emotion: MeetingEmotion; confidence: number; intensity: number } = {
      emotion: MeetingEmotion.NEUTRAL,
      confidence: 0,
      intensity: 0
    };
    // Score every configured emotion against the current frame
    this.EMOTION_CONFIG.forEach((config, emotion) => {
      // Skip emotions still inside their cooldown window
      const lastTime = this.lastEmotionTime.get(emotion) || 0;
      if (now - lastTime < config.cooldown) return;
      let matchScore = 0;
      let totalWeight = 0;
      // BlendShape match: each value is normalized against its threshold, capped at 1.0
      Object.entries(config.blendShapes).forEach(([key, threshold]) => {
        const value = (blendShapes as any)[key] || 0;
        const normalized = Math.min(value / threshold, 1.0);
        matchScore += normalized * config.weight;
        totalWeight += config.weight;
      });
      // Head-pose match (nod / head shake)
      if (config.headPose && headPose) {
        if (config.headPose.pitch && headPose.pitch > config.headPose.pitch) {
          matchScore += config.weight * 0.5;
          totalWeight += config.weight * 0.5;
        }
        if (config.headPose.yaw && Math.abs(headPose.yaw) > config.headPose.yaw) {
          matchScore += config.weight * 0.5;
          totalWeight += config.weight * 0.5;
        }
      }
      const confidence = totalWeight > 0 ? matchScore / totalWeight : 0;
      const intensity = matchScore / Object.keys(config.blendShapes).length;
      if (confidence > bestMatch.confidence && confidence > 0.5) {
        bestMatch = { emotion, confidence, intensity };
      }
    });
    // Start the cooldown for the winning emotion
    if (bestMatch.confidence > 0.5) {
      this.lastEmotionTime.set(bestMatch.emotion, now);
    }
    const result: EmotionResult = {
      emotion: bestMatch.confidence > 0.5 ? bestMatch.emotion : MeetingEmotion.NEUTRAL,
      confidence: bestMatch.confidence,
      intensity: bestMatch.intensity,
      timestamp: now
    };
    // Keep a bounded history for the statistics window
    this.emotionHistory.push(result);
    if (this.emotionHistory.length > 100) {
      this.emotionHistory.shift();
    }
    return result;
  }

  /**
   * Emotion statistics over the last 60 seconds.
   */
  getEmotionStatistics(): Map<MeetingEmotion, number> {
    const stats = new Map<MeetingEmotion, number>();
    const recentEmotions = this.emotionHistory.filter(
      e => Date.now() - e.timestamp < 60000
    );
    recentEmotions.forEach(e => {
      stats.set(e.emotion, (stats.get(e.emotion) || 0) + e.intensity);
    });
    return stats;
  }

  /**
   * Meeting consensus level: the share of agreement among agree + disagree signals.
   */
  getConsensusLevel(): number {
    const stats = this.getEmotionStatistics();
    const agree = stats.get(MeetingEmotion.AGREE) || 0;
    const disagree = stats.get(MeetingEmotion.DISAGREE) || 0;
    const total = agree + disagree;
    if (total === 0) return 0.5; // no opinion signal yet: assume neutral
    return agree / total;
  }

  reset(): void {
    this.emotionHistory = [];
    this.lastEmotionTime.clear();
  }
}
```
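The consensus metric drives the whole light mapping, so it is worth seeing in isolation. A standalone sketch mirroring `getConsensusLevel()` above (the free-standing `consensusLevel` function is introduced here purely for illustration):

```typescript
// Consensus = agreement intensity over total agree + disagree intensity,
// defaulting to a neutral 0.5 when no opinion signal has been observed yet.
function consensusLevel(agree: number, disagree: number): number {
  const total = agree + disagree;
  return total === 0 ? 0.5 : agree / total;
}
```

Defaulting to 0.5 rather than 0 avoids painting the title bar red at the start of a meeting before anyone has reacted.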
4.2 Body AR Gesture Annotation Engine (GestureAnnotationEngine.ets)
Highlights: maps Body AR skeletal keypoints to whiteboard coordinates, supporting air-pointing, lasso selection, page flipping, and other collaboration gestures.
```typescript
// entry/src/main/ets/engine/GestureAnnotationEngine.ets
import { arEngine } from '@hms.core.ar.arengine';

export interface WhiteboardPoint {
  x: number; // normalized 0-1
  y: number;
  type: 'move' | 'down' | 'up' | 'select';
  pressure: number;
}

export interface GestureCommand {
  type: 'cursor' | 'select' | 'draw' | 'flip' | 'zoom';
  points: WhiteboardPoint[];
  metadata: { [key: string]: any };
}

export class GestureAnnotationEngine {
  private static instance: GestureAnnotationEngine;
  private lastRightWrist: { x: number; y: number; z: number } | null = null;
  private lastLeftWrist: { x: number; y: number; z: number } | null = null;
  private lastGestureTime: number = 0;
  private readonly GESTURE_COOLDOWN = 500;
  // Gesture state machine
  private gestureState: 'idle' | 'pointing' | 'selecting' | 'drawing' = 'idle';

  static getInstance(): GestureAnnotationEngine {
    if (!GestureAnnotationEngine.instance) {
      GestureAnnotationEngine.instance = new GestureAnnotationEngine();
    }
    return GestureAnnotationEngine.instance;
  }

  /**
   * Process one Body AR frame and produce a whiteboard gesture command.
   */
  processBodyFrame(body: arEngine.ARBody): GestureCommand | null {
    const landmarks = body.getLandmarks3D();
    if (!landmarks) return null;
    const floatView = new Float32Array(landmarks);
    const now = Date.now();
    // Hand keypoints
    const rightWrist = this.getLandmark3D(floatView, arEngine.ARBodyLandmarkType.RIGHT_WRIST);
    const rightIndex = this.getLandmark3D(floatView, arEngine.ARBodyLandmarkType.RIGHT_INDEX_FINGER_TIP);
    const leftWrist = this.getLandmark3D(floatView, arEngine.ARBodyLandmarkType.LEFT_WRIST);
    const leftIndex = this.getLandmark3D(floatView, arEngine.ARBodyLandmarkType.LEFT_INDEX_FINGER_TIP);
    if (!rightWrist || !rightIndex) return null;
    // Whiteboard coordinate the right index finger points at
    const cursorPoint = this.mapToWhiteboard(rightIndex);
    // Distance between the two wrists drives the two-hand gestures
    const handDistance = leftWrist ?
      Math.sqrt(Math.pow(rightWrist.x - leftWrist.x, 2) + Math.pow(rightWrist.y - leftWrist.y, 2)) : 0;
    // Check the swipe against the previous frame, then record this frame's wrist
    // positions so the next frame's velocity check sees them
    const swiped = this.detectSwipe(rightWrist);
    this.lastRightWrist = rightWrist;
    this.lastLeftWrist = leftWrist;
    // State-machine transitions
    if (handDistance < 0.12 && leftWrist && leftIndex) {
      // Two-hand pinch → selection mode
      this.gestureState = 'selecting';
      return this.createSelectCommand(rightIndex, leftIndex);
    } else if (this.isFist(rightWrist, rightIndex, floatView)) {
      // Fist → drawing mode
      this.gestureState = 'drawing';
      return this.createDrawCommand(cursorPoint);
    } else if (handDistance > 0.45 && leftWrist) {
      // Two hands apart → zoom
      return this.createZoomCommand(handDistance);
    } else if (swiped) {
      // Swipe → flip page, rate-limited by a cooldown
      if (now - this.lastGestureTime > this.GESTURE_COOLDOWN) {
        this.lastGestureTime = now;
        return { type: 'flip', points: [], metadata: { direction: 'next' } };
      }
    }
    // Default: pointing mode
    this.gestureState = 'pointing';
    return this.createCursorCommand(cursorPoint);
  }

  private mapToWhiteboard(fingerTip: { x: number; y: number; z: number }): WhiteboardPoint {
    // Map a 3D camera-space position to 2D whiteboard coordinates,
    // assuming an effective interaction volume 0.5-1.5 m in front of the camera
    const normalizedX = Math.max(0, Math.min(1, (fingerTip.x + 0.5) / 1.0));
    const normalizedY = Math.max(0, Math.min(1, (0.5 - fingerTip.y) / 1.0));
    return {
      x: normalizedX,
      y: normalizedY,
      type: 'move',
      pressure: Math.max(0, Math.min(1, 1.5 - fingerTip.z)) // closer to the camera = more pressure
    };
  }

  private isFist(
    wrist: { x: number; y: number; z: number },
    indexTip: { x: number; y: number; z: number },
    floatView: Float32Array
  ): boolean {
    // Fist heuristic: the index fingertip sits close to the wrist
    const distance = Math.sqrt(
      Math.pow(indexTip.x - wrist.x, 2) +
      Math.pow(indexTip.y - wrist.y, 2)
    );
    return distance < 0.08;
  }

  private detectSwipe(currentWrist: { x: number; y: number; z: number }): boolean {
    if (!this.lastRightWrist) return false;
    const velocity = Math.sqrt(
      Math.pow(currentWrist.x - this.lastRightWrist.x, 2) +
      Math.pow(currentWrist.y - this.lastRightWrist.y, 2)
    );
    // Per-frame displacement threshold for a swipe
    return velocity > 0.3;
  }

  private createCursorCommand(point: WhiteboardPoint): GestureCommand {
    return {
      type: 'cursor',
      points: [point],
      metadata: { state: this.gestureState }
    };
  }

  private createDrawCommand(point: WhiteboardPoint): GestureCommand {
    return {
      type: 'draw',
      points: [{ ...point, type: 'down' }, point, { ...point, type: 'up' }],
      metadata: { strokeWidth: 2 + point.pressure * 4 }
    };
  }

  private createSelectCommand(p1: { x: number; y: number }, p2: { x: number; y: number }): GestureCommand {
    const rect = {
      x: Math.min(p1.x, p2.x),
      y: Math.min(p1.y, p2.y),
      width: Math.abs(p1.x - p2.x),
      height: Math.abs(p1.y - p2.y)
    };
    return {
      type: 'select',
      points: [
        { x: rect.x, y: rect.y, type: 'down', pressure: 1 },
        { x: rect.x + rect.width, y: rect.y + rect.height, type: 'up', pressure: 1 }
      ],
      metadata: { selection: rect }
    };
  }

  private createZoomCommand(handDistance: number): GestureCommand {
    return {
      type: 'zoom',
      points: [],
      metadata: { scale: 1 + (handDistance - 0.45) * 2 }
    };
  }

  private getLandmark3D(floatView: Float32Array, type: arEngine.ARBodyLandmarkType): { x: number; y: number; z: number } | null {
    const index = Object.values(arEngine.ARBodyLandmarkType).indexOf(type);
    if (index < 0) return null;
    const offset = index * 3; // landmarks are packed as [x, y, z] triples
    if (offset + 2 >= floatView.length) return null;
    return {
      x: floatView[offset],
      y: floatView[offset + 1],
      z: floatView[offset + 2]
    };
  }

  reset(): void {
    this.gestureState = 'idle';
    this.lastRightWrist = null;
    this.lastLeftWrist = null;
  }
}
```
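The 3D-to-2D projection in `mapToWhiteboard()` is pure math and can be verified in isolation. A standalone sketch under the same assumptions (a ±0.5 m horizontal and vertical window, 0.5-1.5 m of depth for the pressure estimate; `fingerTipToBoard` is a hypothetical name, not part of the engine):

```typescript
// Clamp a 3D fingertip position (metres, camera space) into normalized
// 0-1 whiteboard coordinates plus a depth-derived pressure value.
interface BoardPoint { x: number; y: number; pressure: number }

function fingerTipToBoard(tip: { x: number; y: number; z: number }): BoardPoint {
  const clamp01 = (v: number) => Math.max(0, Math.min(1, v));
  return {
    x: clamp01((tip.x + 0.5) / 1.0),
    y: clamp01((0.5 - tip.y) / 1.0),  // screen Y grows downward
    pressure: clamp01(1.5 - tip.z)    // closer to the camera = harder press
  };
}
```

A fingertip on the camera axis one metre out lands at the board center; anything outside the window clamps to the edge instead of jumping off the board.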
4.3 Immersive Emotion-Light Title Bar (EmotionLightTitleBar.ets)
Highlights: adjusts glow color and pulse frequency from live meeting-emotion statistics, and highlights the current speaker.
```typescript
// entry/src/main/ets/components/EmotionLightTitleBar.ets
import { HdsNavigation, SystemMaterialEffect } from '@kit.UIDesignKit';
import { EmotionRecognitionEngine, MeetingEmotion } from '../engine/EmotionRecognitionEngine';

@Component
export struct EmotionLightTitleBar {
  @Prop meetingTitle: string = 'Product Review Meeting';
  @Prop participantCount: number = 5;
  @Prop speakerName: string = '';
  @State consensusLevel: number = 0.5;
  @State dominantEmotion: MeetingEmotion = MeetingEmotion.NEUTRAL;
  @State emotionStats: Map<MeetingEmotion, number> = new Map();
  @State pulsePhase: number = 0;
  private emotionEngine: EmotionRecognitionEngine = EmotionRecognitionEngine.getInstance();
  private statsTimer: number = -1;
  private pulseActive: boolean = false;

  aboutToAppear(): void {
    this.startEmotionMonitoring();
    this.startPulseAnimation();
  }

  aboutToDisappear(): void {
    // Release the polling timer and stop the animation loop
    clearInterval(this.statsTimer);
    this.pulseActive = false;
  }

  private startEmotionMonitoring(): void {
    this.statsTimer = setInterval(() => {
      this.emotionStats = this.emotionEngine.getEmotionStatistics();
      this.consensusLevel = this.emotionEngine.getConsensusLevel();
      this.dominantEmotion = this.getDominantEmotion();
    }, 1000);
  }

  private startPulseAnimation(): void {
    this.pulseActive = true;
    const animate = () => {
      if (!this.pulseActive) return;
      this.pulsePhase = (this.pulsePhase + 0.03) % (Math.PI * 2);
      requestAnimationFrame(animate);
    };
    requestAnimationFrame(animate);
  }

  private getDominantEmotion(): MeetingEmotion {
    let maxIntensity = 0;
    let dominant = MeetingEmotion.NEUTRAL;
    this.emotionStats.forEach((intensity, emotion) => {
      if (intensity > maxIntensity && emotion !== MeetingEmotion.NEUTRAL) {
        maxIntensity = intensity;
        dominant = emotion;
      }
    });
    return dominant;
  }

  private getEmotionColor(): string {
    const colors: Map<MeetingEmotion, string> = new Map([
      [MeetingEmotion.AGREE, '#00D4AA'],
      [MeetingEmotion.DISAGREE, '#FF6B6B'],
      [MeetingEmotion.QUESTION, '#9B59B6'],
      [MeetingEmotion.SURPRISE, '#FFE66D'],
      [MeetingEmotion.LIKE, '#4ECDC4'],
      [MeetingEmotion.NEUTRAL, '#808080']
    ]);
    return colors.get(this.dominantEmotion) || '#808080';
  }

  private getMeetingStatus(): string {
    if (this.consensusLevel > 0.8) return 'Strong consensus';
    if (this.consensusLevel > 0.6) return 'Majority agreement';
    if (this.consensusLevel < 0.4) return 'Divided';
    if ((this.emotionStats.get(MeetingEmotion.QUESTION) ?? 0) > 0.3) return 'Questions raised';
    return 'In discussion';
  }

  build() {
    HdsNavigation({
      title: this.meetingTitle,
      subtitle: `${this.participantCount} participants · ${this.getMeetingStatus()}`,
      systemMaterialEffect: SystemMaterialEffect.IMMERSIVE,
      backgroundOpacity: 0.85,
      height: 56,
      leading: this.buildLeadingActions(),
      trailing: this.buildTrailingActions()
    })
      .width('100%')
      .backgroundColor(`rgba(${this.hexToRgb(this.getEmotionColor())}, 0.15)`)
      .border({
        width: { bottom: 2 },
        color: this.getEmotionColor()
      })
      .shadow({
        radius: 10 + Math.sin(this.pulsePhase) * 6,
        color: this.getEmotionColor(),
        offsetX: 0,
        offsetY: 2
      })
      .animation({
        duration: 300,
        curve: Curve.EaseInOut
      })
  }

  @Builder
  buildLeadingActions(): void {
    Row({ space: 12 }) {
      // Emotion status indicator light
      Stack() {
        Circle()
          .width(14)
          .height(14)
          .fill(this.getEmotionColor())
          .opacity(0.3 + Math.sin(this.pulsePhase) * 0.2)
        Circle()
          .width(8)
          .height(8)
          .fill(this.getEmotionColor())
      }
      // Consensus indicator
      Column({ space: 2 }) {
        Text(`${Math.round(this.consensusLevel * 100)}% consensus`)
          .fontSize(13)
          .fontColor(this.consensusLevel > 0.6 ? '#00D4AA' : '#FFD700')
          .fontWeight(FontWeight.Bold)
        // Consensus progress bar
        Row() {
          Row()
            .width(`${this.consensusLevel * 100}%`)
            .height(3)
            .backgroundColor(this.consensusLevel > 0.6 ? '#00D4AA' : '#FFD700')
            .borderRadius(1.5)
        }
        .width(60)
        .height(3)
        .backgroundColor('rgba(255,255,255,0.2)')
        .borderRadius(1.5)
      }
    }
    .padding({ left: 16 })
  }

  @Builder
  buildTrailingActions(): void {
    Row({ space: 10 }) {
      // Current speaker badge
      if (this.speakerName) {
        Text(`🎤 ${this.speakerName}`)
          .fontSize(12)
          .fontColor('#FFFFFF')
          .padding({ left: 10, right: 10, top: 4, bottom: 4 })
          .backgroundColor('rgba(255,255,255,0.15)')
          .borderRadius(12)
      }
      // Mini emotion-statistics strip
      Row({ space: 4 }) {
        ForEach([
          { emotion: MeetingEmotion.AGREE, icon: '✓', count: this.emotionStats.get(MeetingEmotion.AGREE) || 0 },
          { emotion: MeetingEmotion.DISAGREE, icon: '✗', count: this.emotionStats.get(MeetingEmotion.DISAGREE) || 0 },
          { emotion: MeetingEmotion.QUESTION, icon: '?', count: this.emotionStats.get(MeetingEmotion.QUESTION) || 0 }
        ], (item: { emotion: MeetingEmotion; icon: string; count: number }) => {
          if (item.count > 0) {
            Text(`${item.icon} ${Math.round(item.count)}`)
              .fontSize(11)
              .fontColor('rgba(255,255,255,0.7)')
              .padding({ left: 6, right: 6, top: 2, bottom: 2 })
              .backgroundColor('rgba(255,255,255,0.1)')
              .borderRadius(6)
          }
        })
      }
    }
    .padding({ right: 16 })
  }

  private hexToRgb(hex: string): string {
    const r = parseInt(hex.slice(1, 3), 16);
    const g = parseInt(hex.slice(3, 5), 16);
    const b = parseInt(hex.slice(5, 7), 16);
    return `${r},${g},${b}`;
  }
}
```
4.4 Floating Collaboration Panel (FloatCollaborationPanel.ets)
Highlights: a bottom floating panel showing per-participant emotion statistics, the gesture mode, and whiteboard tools, with adjustable transparency.
```typescript
// entry/src/main/ets/components/FloatCollaborationPanel.ets
import { HdsTabs, HdsTabsController, hdsMaterial } from '@kit.UIDesignKit';
import { EmotionRecognitionEngine, MeetingEmotion } from '../engine/EmotionRecognitionEngine';

interface TabItem {
  label: string;
  icon: Resource;
}

@Component
export struct FloatCollaborationPanel {
  @State currentTab: number = 0;
  @State transparencyLevel: number = 0.75;
  @State gestureMode: string = 'cursor';
  @Prop emotionStats: Map<MeetingEmotion, number> = new Map();
  private controller: HdsTabsController = new HdsTabsController();
  private readonly TAB_CONFIG: TabItem[] = [
    { label: 'Emotion', icon: $r('sys.symbol.face_smiling') },
    { label: 'Gesture', icon: $r('sys.symbol.hand_raised') },
    { label: 'Board', icon: $r('sys.symbol.pencil') },
    { label: 'Settings', icon: $r('sys.symbol.gear') }
  ];

  build() {
    HdsTabs({ controller: this.controller }) {
      ForEach(this.TAB_CONFIG, (item: TabItem, index: number) => {
        TabContent() {
          this.buildTabContent(index)
        }
        .tabBar(new BottomTabBarStyle({
          normal: new SymbolGlyphModifier(item.icon).fontColor(['rgba(255,255,255,0.5)']),
          selected: new SymbolGlyphModifier(item.icon).fontColor(['#00D4AA'])
        }, item.label))
      })
    }
    .barOverlap(true)
    .vertical(false)
    .barPosition(BarPosition.End)
    .barFloatingStyle({
      barBottomMargin: 20,
      barSideMargin: 40,
      systemMaterialEffect: {
        materialType: hdsMaterial.MaterialType.IMMERSIVE,
        materialLevel: hdsMaterial.MaterialLevel.EXQUISITE
      }
    })
    .backgroundColor(`rgba(12,12,22,${this.transparencyLevel})`)
    .backdropFilter($r('sys.blur.40'))
    .borderRadius(24)
    .margin({ left: '4%', right: '4%', bottom: 14 })
    .shadow({
      radius: 24,
      color: 'rgba(0,0,0,0.35)',
      offsetX: 0,
      offsetY: -4
    })
  }

  @Builder
  buildTabContent(index: number): void {
    Column({ space: 12 }) {
      if (index === 0) {
        this.buildEmotionPanel()
      } else if (index === 1) {
        this.buildGesturePanel()
      } else if (index === 2) {
        this.buildWhiteboardPanel()
      } else {
        this.buildSettingsPanel()
      }
    }
    .width('100%')
    .height('100%')
    .padding(16)
  }

  private emotionCount(emotion: MeetingEmotion): number {
    return this.emotionStats.get(emotion) || 0;
  }

  private emotionPercent(emotion: MeetingEmotion): number {
    // Normalize against the strongest emotion so the loudest bar fills its track
    const maxCount = Math.max(...Array.from(this.emotionStats.values()), 1);
    return (this.emotionCount(emotion) / maxCount) * 100;
  }

  @Builder
  buildEmotionPanel(): void {
    Column({ space: 10 }) {
      Text('Live Emotion Stats')
        .fontSize(16)
        .fontColor('#FFFFFF')
        .fontWeight(FontWeight.Bold)
      // Emotion distribution bars
      Column({ space: 8 }) {
        ForEach([
          { emotion: MeetingEmotion.AGREE, label: 'Agree', color: '#00D4AA', icon: '✓' },
          { emotion: MeetingEmotion.DISAGREE, label: 'Disagree', color: '#FF6B6B', icon: '✗' },
          { emotion: MeetingEmotion.QUESTION, label: 'Question', color: '#9B59B6', icon: '?' },
          { emotion: MeetingEmotion.SURPRISE, label: 'Surprise', color: '#FFE66D', icon: '!' },
          { emotion: MeetingEmotion.LIKE, label: 'Like', color: '#4ECDC4', icon: '♥' }
        ], (item: { emotion: MeetingEmotion; label: string; color: string; icon: string }) => {
          Row({ space: 8 }) {
            Text(`${item.icon} ${item.label}`)
              .fontSize(13)
              .fontColor('rgba(255,255,255,0.8)')
              .width(60)
            Stack({ alignContent: Alignment.Start }) {
              // Track first, filled bar second: later Stack children render on top
              Row()
                .width('100%')
                .height(8)
                .backgroundColor('rgba(255,255,255,0.1)')
                .borderRadius(4)
              Row()
                .width(`${this.emotionPercent(item.emotion)}%`)
                .height(8)
                .backgroundColor(item.color)
                .borderRadius(4)
                .animation({ duration: 500 })
            }
            .width(120)
            .height(8)
            Text(`${Math.round(this.emotionCount(item.emotion))}`)
              .fontSize(12)
              .fontColor(item.color)
              .width(30)
          }
          .width('100%')
        })
      }
      // Trend hint
      Text('Emotion trend over the last minute')
        .fontSize(12)
        .fontColor('rgba(255,255,255,0.5)')
        .margin({ top: 4 })
    }
  }

  @Builder
  buildGesturePanel(): void {
    Column({ space: 10 }) {
      Text('Gesture Control Mode')
        .fontSize(16)
        .fontColor('#FFFFFF')
        .fontWeight(FontWeight.Bold)
      // Mode selector
      Row({ space: 8 }) {
        ForEach([
          { mode: 'cursor', label: 'Pointer', icon: '👆' },
          { mode: 'draw', label: 'Draw', icon: '✏️' },
          { mode: 'select', label: 'Select', icon: '⬚' }
        ], (item: { mode: string; label: string; icon: string }) => {
          Column({ space: 4 }) {
            Text(item.icon)
              .fontSize(24)
            Text(item.label)
              .fontSize(12)
              .fontColor(this.gestureMode === item.mode ? '#00D4AA' : 'rgba(255,255,255,0.5)')
          }
          .width(70)
          .padding(10)
          .backgroundColor(this.gestureMode === item.mode ? 'rgba(0,212,170,0.15)' : 'rgba(255,255,255,0.05)')
          .borderRadius(12)
          .border({
            width: 1,
            color: this.gestureMode === item.mode ? '#00D4AA' : 'transparent'
          })
          .onClick(() => {
            this.gestureMode = item.mode;
            AppStorage.setOrCreate('gesture_mode', item.mode);
          })
        })
      }
      // Gesture cheat sheet
      Column({ space: 6 }) {
        ForEach([
          { gesture: '👆 Point with index finger', desc: 'Move the cursor' },
          { gesture: '✊ Make a fist', desc: 'Enter draw mode' },
          { gesture: '🤏 Pinch both hands', desc: 'Select a region' },
          { gesture: '🙌 Spread both hands', desc: 'Zoom the board' },
          { gesture: '👋 Swipe', desc: 'Flip to next page' }
        ], (item: { gesture: string; desc: string }) => {
          Row({ space: 10 }) {
            Text(item.gesture)
              .fontSize(16)
            Text(item.desc)
              .fontSize(12)
              .fontColor('rgba(255,255,255,0.6)')
              .layoutWeight(1)
          }
          .width('100%')
          .padding(6)
        })
      }
      .padding(10)
      .backgroundColor('rgba(255,255,255,0.03)')
      .borderRadius(8)
    }
  }

  @Builder
  buildWhiteboardPanel(): void {
    Column({ space: 10 }) {
      Text('Whiteboard Tools')
        .fontSize(16)
        .fontColor('#FFFFFF')
        .fontWeight(FontWeight.Bold)
      Grid() {
        ForEach([
          { tool: 'pen', icon: '✏️', color: '#FFFFFF' },
          { tool: 'marker', icon: '🖍️', color: '#FFE66D' },
          { tool: 'eraser', icon: '🧼', color: '#FF6B6B' },
          { tool: 'shape', icon: '⬜', color: '#4ECDC4' },
          { tool: 'text', icon: 'T', color: '#9B59B6' },
          { tool: 'undo', icon: '↩️', color: '#808080' }
        ], (item: { tool: string; icon: string; color: string }) => {
          GridItem() {
            Column({ space: 6 }) {
              Text(item.icon)
                .fontSize(22)
              Text(item.tool)
                .fontSize(11)
                .fontColor('rgba(255,255,255,0.7)')
            }
            .width(60)
            .height(60)
            .backgroundColor('rgba(255,255,255,0.05)')
            .borderRadius(12)
          }
          .onClick(() => {
            AppStorage.setOrCreate('whiteboard_tool', item.tool);
          })
        })
      }
      .columnsTemplate('1fr 1fr 1fr')
      .rowsGap(10)
      .columnsGap(10)
    }
  }

  @Builder
  buildSettingsPanel(): void {
    Column({ space: 14 }) {
      Text('Panel Transparency')
        .fontSize(16)
        .fontColor('#FFFFFF')
        .fontWeight(FontWeight.Bold)
      Row({ space: 10 }) {
        ForEach([
          { label: 'Low', value: 0.55 },
          { label: 'Balanced', value: 0.75 },
          { label: 'High', value: 0.90 }
        ], (item: { label: string; value: number }) => {
          Button(item.label)
            .fontSize(13)
            .fontColor('#FFFFFF')
            .backgroundColor(this.transparencyLevel === item.value ? '#00D4AA' : 'rgba(255,255,255,0.1)')
            .padding({ left: 20, right: 20, top: 6, bottom: 6 })
            .borderRadius(16)
            .onClick(() => {
              this.transparencyLevel = item.value;
            })
        })
      }
      Text('Emotion Feedback Sensitivity')
        .fontSize(16)
        .fontColor('#FFFFFF')
        .fontWeight(FontWeight.Bold)
        .margin({ top: 8 })
      Slider({
        value: 0.7,
        min: 0.3,
        max: 1.0,
        step: 0.1
      })
        .width('100%')
        .selectedColor('#00D4AA')
    }
  }
}
```
4.5 Main Collaboration Page (CollaborationPage.ets)
Highlights: integrates Face AR emotion recognition, Body AR gesture annotation, the immersive emotion title bar, and the floating panel into a complete "AR remote collaboration" experience.
```typescript
// entry/src/main/ets/pages/CollaborationPage.ets
import { EmotionLightTitleBar } from '../components/EmotionLightTitleBar';
import { FloatCollaborationPanel } from '../components/FloatCollaborationPanel';
import { EmotionRecognitionEngine, EmotionResult, MeetingEmotion } from '../engine/EmotionRecognitionEngine';
import { GestureAnnotationEngine, GestureCommand } from '../engine/GestureAnnotationEngine';

@Entry
@Component
struct CollaborationPage {
  @State meetingTitle: string = 'Q3 Product Planning Review';
  @State participantCount: number = 8;
  @State speakerName: string = 'PM Zhang San';
  @State emotionStats: Map<MeetingEmotion, number> = new Map();
  @State whiteboardContent: string = 'Whiteboard content area';
  @State cursorPosition: { x: number; y: number } = { x: 0.5, y: 0.5 };
  @State isDrawing: boolean = false;
  @State trackingQuality: number = 1.0;
  private emotionEngine: EmotionRecognitionEngine = EmotionRecognitionEngine.getInstance();
  private gestureEngine: GestureAnnotationEngine = GestureAnnotationEngine.getInstance();
  private arLoopId: number = 0;

  aboutToAppear(): void {
    this.initializeARSession();
  }

  aboutToDisappear(): void {
    cancelAnimationFrame(this.arLoopId);
  }

  private initializeARSession(): void {
    this.startARLoop();
  }

  private startARLoop(): void {
    const loop = () => {
      this.processARFrame();
      this.arLoopId = requestAnimationFrame(loop);
    };
    this.arLoopId = requestAnimationFrame(loop);
  }

  private processARFrame(): void {
    // Per-frame AR processing; the real engine calls are stubbed out below
    let quality = 0;
    // Face AR emotion processing (enable once an AR session supplies face frames):
    // const emotionResult = this.emotionEngine.recognizeEmotion(face);
    // if (emotionResult.confidence > 0.5) {
    //   this.handleEmotion(emotionResult);
    // }
    // quality += 0.5;
    // Body AR gesture processing:
    // const gestureCommand = this.gestureEngine.processBodyFrame(body);
    // if (gestureCommand) {
    //   this.handleGesture(gestureCommand);
    // }
    // quality += 0.5;
    // Simulated data while no AR session is attached
    this.simulateCollaborationData();
    quality = 1.0; // treat simulation as full tracking quality
    this.trackingQuality = quality;
    this.emotionStats = this.emotionEngine.getEmotionStatistics();
  }

  private simulateCollaborationData(): void {
    // Simulated cursor motion
    this.cursorPosition = {
      x: 0.3 + Math.sin(Date.now() / 2000) * 0.2,
      y: 0.3 + Math.cos(Date.now() / 3000) * 0.2
    };
  }

  private handleEmotion(result: EmotionResult): void {
    // Broadcast the emotion to remote participants
    console.info(`[Emotion] ${result.emotion} (confidence: ${result.confidence})`);
    // Local feedback
    if (result.emotion === MeetingEmotion.QUESTION) {
      this.showQuestionIndicator();
    }
  }

  private handleGesture(command: GestureCommand): void {
    switch (command.type) {
      case 'cursor':
        if (command.points.length > 0) {
          this.cursorPosition = { x: command.points[0].x, y: command.points[0].y };
        }
        break;
      case 'draw':
        this.isDrawing = true;
        setTimeout(() => { this.isDrawing = false; }, 500);
        break;
      case 'select':
        console.info('[Gesture] Selection:', command.metadata.selection);
        break;
      case 'flip':
        // Page-flip logic
        break;
      case 'zoom':
        // Zoom logic
        break;
    }
  }

  private showQuestionIndicator(): void {
    // Show a "question" marker animation
  }

  build() {
    Stack({ alignContent: Alignment.Center }) {
      // Layer 1: dynamic ambient-light background
      this.buildAmbientLightLayer()
      // Layer 2: collaboration content
      Column({ space: 0 }) {
        // Immersive emotion-light title bar
        EmotionLightTitleBar({
          meetingTitle: this.meetingTitle,
          participantCount: this.participantCount,
          speakerName: this.speakerName
        })
        // Whiteboard area
        Stack({ alignContent: Alignment.Center }) {
          // Whiteboard background
          Column() {
            Text(this.whiteboardContent)
              .fontSize(18)
              .fontColor('rgba(255,255,255,0.5)')
            // Gesture cursor
            Column()
              .width(20)
              .height(20)
              .backgroundColor(this.isDrawing ? '#FFE66D' : '#00D4AA')
              .borderRadius(10)
              .opacity(0.8)
              .position({
                x: `${this.cursorPosition.x * 100}%`,
                y: `${this.cursorPosition.y * 100}%`
              })
              .markAnchor({ x: 0.5, y: 0.5 })
              .shadow({
                radius: 10,
                color: this.isDrawing ? '#FFE66D' : '#00D4AA'
              })
              .animation({
                duration: 100,
                curve: Curve.Linear
              })
            // AR tracking status
            if (this.trackingQuality > 0.5) {
              Text('AR collaboration active...')
                .fontSize(12)
                .fontColor('#00D4AA')
                .position({ x: '50%', y: '95%' })
                .markAnchor({ x: 0.5, y: 1 })
            }
          }
          .width('100%')
          .layoutWeight(1)
          .justifyContent(FlexAlign.Center)
          .backgroundColor('rgba(255,255,255,0.02)')
          .borderRadius(16)
          .margin(16)
          .border({
            width: 1,
            color: 'rgba(255,255,255,0.1)'
          })
          // Floating emotion markers
          this.buildFloatingEmotionMarks()
        }
        .layoutWeight(1)
      }
      .width('100%')
      .height('100%')
      // Layer 3: floating collaboration panel
      FloatCollaborationPanel({
        emotionStats: this.emotionStats
      })
        .height(300)
        .position({ x: 0, y: '100%' })
        .markAnchor({ x: 0, y: 1 })
    }
    .width('100%')
    .height('100%')
    .backgroundColor('#080810')
    .expandSafeArea(
      [SafeAreaType.SYSTEM],
      [SafeAreaEdge.TOP, SafeAreaEdge.BOTTOM, SafeAreaEdge.START, SafeAreaEdge.END]
    )
  }

  @Builder
  buildAmbientLightLayer(): void {
    Column() {
      // Top emotion halo
      Column()
        .width(700)
        .height(350)
        .backgroundColor(this.getDominantColor())
        .blur(200)
        .opacity(0.1)
        .position({ x: '50%', y: '0%' })
        .markAnchor({ x: '50%', y: 0 })
        .animation({
          duration: 6000,
          curve: Curve.EaseInOut,
          iterations: -1,
          playMode: PlayMode.Alternate
        })
        .scale({ x: 1.3, y: 1.0 })
      // Bottom ambience glow
      Column()
        .width('100%')
        .height(250)
        .backgroundColor(this.getDominantColor())
        .opacity(0.05)
        .blur(120)
        .position({ x: 0, y: '75%' })
        .linearGradient({
          direction: GradientDirection.Top,
          colors: [
            [this.getDominantColor(), 0.0],
            ['transparent', 1.0]
          ]
        })
    }
    .width('100%')
    .height('100%')
    .backgroundColor('#050508')
  }

  @Builder
  buildFloatingEmotionMarks(): void {
    // Floating emotion markers (simulating other participants' feedback)
    ForEach([
      { x: 0.2, y: 0.3, emotion: MeetingEmotion.AGREE, user: 'Li Si' },
      { x: 0.7, y: 0.5, emotion: MeetingEmotion.QUESTION, user: 'Wang Wu' },
      { x: 0.5, y: 0.8, emotion: MeetingEmotion.SURPRISE, user: 'Zhao Liu' }
    ], (mark: { x: number; y: number; emotion: MeetingEmotion; user: string }) => {
      Column({ space: 2 }) {
        Text(this.getEmotionIcon(mark.emotion))
          .fontSize(20)
        Text(mark.user)
          .fontSize(10)
          .fontColor('rgba(255,255,255,0.6)')
      }
      .position({ x: `${mark.x * 100}%`, y: `${mark.y * 100}%` })
      .markAnchor({ x: 0.5, y: 0.5 })
      .padding(6)
      .backgroundColor('rgba(0,0,0,0.4)')
      .borderRadius(8)
      .backdropFilter($r('sys.blur.10'))
    })
  }

  private getDominantColor(): string {
    const colors: Map<MeetingEmotion, string> = new Map([
      [MeetingEmotion.AGREE, '#00D4AA'],
      [MeetingEmotion.DISAGREE, '#FF6B6B'],
      [MeetingEmotion.QUESTION, '#9B59B6'],
      [MeetingEmotion.SURPRISE, '#FFE66D'],
      [MeetingEmotion.LIKE, '#4ECDC4'],
      [MeetingEmotion.NEUTRAL, '#808080']
    ]);
    let dominant = MeetingEmotion.NEUTRAL;
    let maxCount = 0;
    this.emotionStats.forEach((count, emotion) => {
      if (count > maxCount) {
        maxCount = count;
        dominant = emotion;
      }
    });
    return colors.get(dominant) || '#808080';
  }

  private getEmotionIcon(emotion: MeetingEmotion): string {
    const icons: Map<MeetingEmotion, string> = new Map([
      [MeetingEmotion.AGREE, '✓'],
      [MeetingEmotion.DISAGREE, '✗'],
      [MeetingEmotion.QUESTION, '?'],
      [MeetingEmotion.SURPRISE, '!'],
      [MeetingEmotion.LIKE, '♥'],
      [MeetingEmotion.NEUTRAL, '○']
    ]);
    return icons.get(emotion) || '○';
  }
}
```
5. Key Techniques Summary
5.1 Face AR Emotion Recognition
| Emotion | BlendShape parameters | Head pose | Confidence threshold | Cooldown |
|---|---|---|---|---|
| Agree | mouthSmile > 0.3 | pitch > 0.25 | 0.5 | 1500 ms |
| Disagree | mouthFrown > 0.4 | yaw > 0.35 | 0.5 | 1500 ms |
| Question | browDown > 0.4, mouthPucker > 0.3 | - | 0.5 | 2000 ms |
| Surprise | browInnerUp > 0.6, eyeWide > 0.5 | - | 0.5 | 2000 ms |
| Like | mouthSmile > 0.5, cheekSquint > 0.4 | - | 0.5 | 1000 ms |
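Each table row reduces to a normalized, capped match score over its BlendShape parameters. A minimal sketch (`blendShapeMatch` is a hypothetical helper; the full engine additionally folds in head pose and per-emotion weights):

```typescript
// Score one emotion row: normalize each BlendShape value against its threshold,
// cap at 1.0, and average. A score of 1.0 means every parameter is at or past
// its threshold; partial activations yield proportionally lower scores.
function blendShapeMatch(values: Record<string, number>, thresholds: Record<string, number>): number {
  const keys = Object.keys(thresholds);
  if (keys.length === 0) return 0;
  const sum = keys.reduce((acc, k) => acc + Math.min((values[k] ?? 0) / thresholds[k], 1.0), 0);
  return sum / keys.length;
}
```

Capping at 1.0 keeps one wildly overshooting parameter from masking the absence of the others.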
5.2 Body AR Gesture Annotation
| Gesture | Keypoint computation | Mapped action | Interaction area |
|---|---|---|---|
| Index-finger point | indexTip 3D position | cursor move | entire whiteboard |
| Fist | indexTip-wrist distance < 0.08 m | draw mode | current cursor position |
| Two-hand pinch | wrist distance < 0.12 m | lasso selection | region between hands |
| Two hands apart | wrist distance > 0.45 m | zoom view | whole whiteboard |
| Swipe | wrist velocity > 0.3 | flip page | edge trigger zone |
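One caveat on the swipe row: the engine compares raw per-frame wrist displacement against the threshold, which ties the gesture's sensitivity to the frame rate. A frame-rate-independent sketch divides displacement by elapsed time and thresholds in metres per second (the `isSwipe` helper and the 1.5 m/s default are illustrative assumptions, not values from the article):

```typescript
// Frame-rate-independent swipe check: displacement between two wrist samples,
// divided by elapsed time, compared against a speed threshold in m/s.
function isSwipe(
  prev: { x: number; y: number },
  curr: { x: number; y: number },
  dtMs: number,
  thresholdMps: number = 1.5
): boolean {
  if (dtMs <= 0) return false; // no elapsed time, no velocity
  const dist = Math.hypot(curr.x - prev.x, curr.y - prev.y); // metres
  return (dist / (dtMs / 1000)) > thresholdMps;
}
```

With this form the same hand motion triggers the same result at 30 fps and 60 fps, which the raw per-frame check cannot guarantee.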
5.3 Immersive Light and Meeting-Emotion Linkage
| Meeting state | Consensus | Dominant emotion | Title-bar light | Ambient color | Suggested action |
|---|---|---|---|---|---|
| Strong consensus | >80% | agree | steady emerald | teal-green | pass the resolution |
| Majority agreement | >60% | agree | soft teal | blue-green | keep discussing |
| Contested | 40-60% | mixed | crescendo amber | warm yellow | dig deeper |
| Serious split | <40% | disagree | rapid red | dark red | pause and mediate |
| Concentrated questions | - | question | breathing violet | purple | add explanation |
| Surprise feedback | - | surprise | pulsing yellow | golden | record the highlight |
6. Distributed Collaboration Design
6.1 Emotion Data Sync
```typescript
// Broadcast emotion data over the HarmonyOS distributed soft bus
import { distributedDataObject } from '@kit.DistributedServiceKit';
import { EmotionResult } from '../engine/EmotionRecognitionEngine';

export class EmotionSyncManager {
  private syncObj: distributedDataObject.DataObject;

  constructor(context: Context) {
    // Distributed objects only sync plain serializable fields, so emotion
    // totals are kept in a plain record rather than a Map. Note that on
    // current SDKs create() takes the ability Context as its first argument.
    this.syncObj = distributedDataObject.create(context, {
      emotions: {} as Record<string, number>,
      consensusLevel: 0.5,
      timestamp: 0
    });
    this.syncObj.setSessionId('meeting_emotion_sync');
  }

  broadcastEmotion(emotion: EmotionResult): void {
    const emotions = (this.syncObj as any).emotions as Record<string, number>;
    emotions[emotion.emotion] = (emotions[emotion.emotion] || 0) + emotion.intensity;
    (this.syncObj as any).timestamp = Date.now();
  }

  subscribeRemoteEmotions(callback: (emotions: Record<string, number>) => void): void {
    this.syncObj.on('change', () => {
      callback((this.syncObj as any).emotions as Record<string, number>);
    });
  }
}
```
6.2 Gesture Coordinate Sync
```typescript
// Sync gesture coordinates to remote whiteboards in real time
import { GestureCommand } from '../engine/GestureAnnotationEngine';

export class GestureSyncManager {
  broadcastGesture(command: GestureCommand): void {
    // Serialize the gesture command for the soft bus
    const message = JSON.stringify({
      type: command.type,
      points: command.points,
      metadata: command.metadata,
      timestamp: Date.now()
    });
    // Fan out to every participating device
    this.sendToAllParticipants(message);
  }

  private sendToAllParticipants(message: string): void {
    // Transport is app-specific (for example an RPC channel or a distributed
    // data object shared by the meeting session) and is omitted here.
  }
}
```
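Cursor commands arrive every frame, so broadcasting each one would flood the soft bus. One common design is to coalesce continuous gestures and flush them at a fixed rate while letting discrete gestures (flip, select) through immediately. A minimal sketch (the `GestureThrottle` class is an assumption for illustration, not a Kit API):

```typescript
// Coalescing throttle: keep only the latest continuous (cursor) command and
// flush at a fixed interval; discrete commands bypass the throttle entirely.
type SyncCommand = { type: string; payload: number | null };

class GestureThrottle {
  private pending: SyncCommand | null = null;
  private lastFlush = 0;

  constructor(
    private send: (cmd: SyncCommand) => void,
    private intervalMs: number = 33 // ~30 Hz broadcast rate
  ) {}

  push(cmd: SyncCommand, now: number): void {
    if (cmd.type !== 'cursor') {
      this.send(cmd); // discrete gestures are never dropped
      return;
    }
    this.pending = cmd; // continuous gestures: keep only the latest
    if (now - this.lastFlush >= this.intervalMs) {
      this.send(this.pending);
      this.pending = null;
      this.lastFlush = now;
    }
  }
}
```

Dropping intermediate cursor positions is safe because only the latest one matters to the remote overlay, whereas losing a flip or select would change meeting state.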
7. Conclusion and Outlook
Building on the Face AR and Body AR capabilities of HarmonyOS 6 (API 23), combined with immersive lighting and floating navigation, this article walked through a complete PC-side "AR remote collaboration whiteboard" application. The core innovations:
- Expression-driven meeting feedback: Face AR recognizes five meeting emotions and tallies the agree/disagree/question distribution in real time, giving the "silent majority" a visible voice
- Touch-free gesture annotation: Body AR enables contactless whiteboard operation, so a presenter can circle key points without ever reaching for a mouse
- Emotion-driven ambient light: UI lighting adapts to the meeting's consensus level, lending remote meetings a sense of shared presence
- Floating collaboration panel: the bottom panel shows live emotion statistics and the active gesture mode, with adjustable transparency so it never covers the whiteboard
Future directions:
- AI meeting minutes: combine emotion data with speech content to auto-generate minutes tagged with emotional context
- 3D spatial whiteboard: use AR spatial anchors for a 3D whiteboard shared across devices
- Virtual avatars: map Face AR expressions onto avatars for an even stronger face-to-face feel
- Brain-computer interfaces: in the long run, EEG signals could enable "annotation by intent"
Reprinted from: https://blog.csdn.net/u014727709/article/details/134040742