# MultiDeviceInteraction
**Repository Path**: llince/multi-device-interaction
## Basic Information
- **Project Name**: MultiDeviceInteraction
- **Description**: This sample leverages the system-level interaction normalization capability to deliver a unified interaction experience across multiple device types. It supports input methods such as touchscreen, touchpad, mouse, and keyboard, and covers core interaction features including basic input events, gesture recognition, drag-and-drop, and focus management, providing a complete example of multi-device interaction.
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 4
- **Created**: 2026-01-14
- **Last Updated**: 2026-01-14
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# Multi-Device Interaction
## Overview
This sample demonstrates how to achieve a consistent cross-device interaction experience using the system-level interaction normalization capability. Multiple input devices are supported, such as touchscreen, touchpad, mouse, and keyboard, encompassing basic input events, gesture events, drag-and-drop events, and focus management. Together, these form a complete example of multi-device interaction.
## Effect
*(The original table of demonstration recordings, with columns Input Device, Interaction Event, and Interaction Experience, is not reproduced here.)*
## How to Use
1. Run the default module on Bar phone, Bi-fold phone (Mate X series), Widescreen foldable phone, Tri-fold phone, Tablet, PC/2-in-1 device, or Vision, and the wearable module on Wearable.
2. On the home page, select an input device to interact with. A page will be displayed showing supported interaction events.
3. Choose an interaction event, perform the interaction, and view its effect.
## Project Directory
```
├───default
│ ├───src/main/ets
│ │ ├───common
│ │ │ ├───constants
│ │ │ │ └───Constants.ets // Constants
│ │ │ └───utils
│ │ │ ├───ResourceConversion.ets // Resource conversion utility
│ │ │ ├───Utils.ets // Common utility
│ │ │ ├───WidthBreakpointType.ets // Breakpoint utility
│ │ │ └───WindowUtil.ets // Window utility
│ │ ├───entryability
│ │ │ └───EntryAbility.ets // Ability lifecycle callbacks
│ │ ├───entrybackupability
│ │ │ └───EntryBackupAbility.ets // EntryBackupAbility
│ │ ├───pages
│ │ │ └───Index.ets // Home page
│ │ └───view
│ │ ├───base
│ │ │ ├───AxisEvent.ets // Axis event
│ │ │ ├───FocusAxisEvent.ets // Focus axis event
│ │ │ ├───HandwriteEvent.ets // Stylus event
│ │ │ ├───HoverEvent.ets // Hover event
│ │ │ └───KeyEvent.ets // Key event
│ │ ├───drag
│ │ │ └───DragEvent.ets // Drag-and-drop event
│ │ ├───focus
│ │ │ └───FocusEvent.ets // Focus event
│ │ ├───gesture
│ │ │ ├───ClickEvent.ets // Click event
│ │ │ ├───LongPressGestureEvent.ets // Long-press gesture
│ │ │ ├───PanGestureEvent.ets // Pan gesture
│ │ │ ├───PinchGestureEvent.ets // Pinch gesture
│ │ │ ├───RotationGestureEvent.ets // Rotation gesture
│ │ │ ├───SwipeGestureEvent.ets // Swipe gesture
│ │ │ └───TapGestureEvent.ets // Tap gesture
│ │ ├───InteractionEvents.ets // Interaction event page
│ │ ├───UserActionTitle.ets // Interaction action title
│ │ ├───UserButton.ets // Interaction list button
│ │ ├───UserDesc.ets // Interaction device response
│ │ ├───UserText.ets // Interaction data display
│ │ └───UserTitle.ets // Interaction event title
│ └───src/main/resources
└───wearable
├───src/main/ets
│ ├───pages
│ │ └───Index.ets // Home page
│ ├───view
│ │ ├───DigitalCrownEvent.ets // Digital crown event
│ │ ├───PressureSensitive.ets // Pressure-sensitive
│ │ └───SmartGestureEvent.ets // Smart gesture event
│ ├───wearableability
│ │ └───WearableAbility.ets // WearableAbility lifecycle
│ └───wearablebackupability
│ └───WearableBackupAbility.ets // WearableBackupAbility
└───src/main/resources
```
## How to Implement
1. For basic input events like stylus events, leverage HandwriteComponent to enable handwriting with a variety of strokes. Listen for component events (such as onAxisEvent and onKeyEvent) and handle service logic within callbacks to respond to input device interactions.
2. For gesture events, add gesture objects to the component to facilitate system collection and processing. Bind gestures via the gesture API and configure gesture types such as LongPressGesture, PanGesture, and PinchGesture. onAction acts as the callback for successful gesture recognition, where relevant service logic is implemented.
3. To listen for focus state changes (focused/unfocused), leverage the onFocus and onBlur events. The system's focus traversal algorithm is applied by default. To accommodate complex UI navigation requirements, use tabIndex to adjust the focus order and nextFocus to customize focus navigation directions (up/down/left/right).
4. The drag-and-drop functionality is implemented through sequential handling of the onDragStart, onDragEnter, onDragMove, onDragLeave, and onDrop events, building a complete drag-and-drop interaction flow.
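Step 1 above can be sketched in ArkTS as follows. This is a minimal illustration, not code from this project: it assumes the ArkUI `onKeyEvent` and `onAxisEvent` component events and a focusable container, as described in HarmonyOS documentation.

```typescript
// Minimal ArkTS sketch (step 1): listening for key and axis events.
// Assumes ArkUI's onKeyEvent/onAxisEvent APIs; not taken from this sample's code.
@Entry
@Component
struct BasicInputSample {
  @State info: string = 'Press a key or scroll';

  build() {
    Column() {
      Text(this.info).fontSize(20)
    }
    .width('100%')
    .height('100%')
    .focusable(true)       // the component must be focusable to receive key events
    .defaultFocus(true)
    .onKeyEvent((event: KeyEvent) => {
      // keyText is the readable name of the pressed key; type distinguishes down/up.
      this.info = `Key: ${event.keyText}, type: ${event.type}`;
    })
    .onAxisEvent((event: AxisEvent) => {
      // Mouse-wheel and touchpad scrolling report axis values.
      this.info = `Vertical axis: ${event.getVerticalAxisValue()}`;
    })
  }
}
```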
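Step 2 above, binding gestures via the `gesture` API, can be sketched like this. The sketch assumes ArkUI's `LongPressGesture` and `PinchGesture` types and is illustrative only, not this project's implementation:

```typescript
// Minimal ArkTS sketch (step 2): binding gesture objects to a component.
// Assumes ArkUI's gesture API; not taken from this sample's code.
@Entry
@Component
struct GestureSample {
  @State message: string = 'Try a long press or a pinch';

  build() {
    Column() {
      Text(this.message).fontSize(20)
    }
    .width('100%')
    .height('100%')
    .gesture(
      GestureGroup(GestureMode.Parallel,
        LongPressGesture({ repeat: false })
          .onAction((event: GestureEvent) => {
            // Called once the long press is successfully recognized.
            this.message = 'Long press recognized';
          }),
        PinchGesture({ fingers: 2 })
          .onAction((event: GestureEvent) => {
            // event.scale reports the current pinch scale factor.
            this.message = `Pinch scale: ${event.scale.toFixed(2)}`;
          })
      )
    )
  }
}
```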
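Step 3 above, focus listening and custom traversal order, can be sketched as follows, assuming ArkUI's `onFocus`/`onBlur` events and the `tabIndex` attribute (the button labels are hypothetical):

```typescript
// Minimal ArkTS sketch (step 3): focus callbacks and a custom Tab order.
// Assumes ArkUI's focus APIs; not taken from this sample's code.
@Entry
@Component
struct FocusSample {
  @State tip: string = 'Press Tab to move focus';

  build() {
    Column({ space: 12 }) {
      Button('First')
        .tabIndex(1)    // visited first during Tab traversal
        .onFocus(() => { this.tip = 'First gained focus'; })
        .onBlur(() => { this.tip = 'First lost focus'; })
      Button('Second')
        .tabIndex(2)    // visited second, regardless of layout position
        .onFocus(() => { this.tip = 'Second gained focus'; })
      Text(this.tip)
    }
    .width('100%')
  }
}
```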
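The drag-and-drop flow in step 4 can be sketched like this. It assumes ArkUI's drag events and the `@kit.ArkData` unified data channel for the drag payload; the texts and payload are illustrative, not this project's code:

```typescript
// Minimal ArkTS sketch (step 4): a drag source and a drop target.
// Assumes ArkUI drag events and @kit.ArkData; not taken from this sample's code.
import { unifiedDataChannel } from '@kit.ArkData';

@Entry
@Component
struct DragSample {
  @State result: string = 'Drop target';

  build() {
    Column({ space: 24 }) {
      Text('Drag me')
        .draggable(true)
        .onDragStart((event: DragEvent) => {
          // Attach a plain-text payload when the drag begins.
          const text = new unifiedDataChannel.PlainText();
          text.textContent = 'hello';
          event.setData(new unifiedDataChannel.UnifiedData(text));
        })
      Text(this.result)
        .width(200)
        .height(100)
        .onDragEnter(() => { this.result = 'Release to drop'; })
        .onDragLeave(() => { this.result = 'Drop target'; })
        .onDrop((event: DragEvent) => {
          // Read the payload back out of the drop event.
          const record = event.getData().getRecords()[0] as unifiedDataChannel.PlainText;
          this.result = `Dropped: ${record.textContent}`;
        })
    }
    .width('100%')
  }
}
```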
## Required Permissions
N/A
## Constraints
1. Supported devices
   Local devices: Bar phone, Bi-fold phone (Mate X series), Widescreen foldable phone, Tri-fold phone, Tablet, PC/2-in-1 device, Wearable, or Vision.
2. The HarmonyOS version must be HarmonyOS 6.0.0 Release or later.
3. The DevEco Studio version must be DevEco Studio 6.0.0 Release or later.
4. The HarmonyOS SDK version must be HarmonyOS 6.0.0 Release SDK or later.