Background
This article on HarmonyOS wearable app development is meant to offer a little help to front-line developers.
In some companies, business needs may require developers to build specific features on a Huawei watch, and it is taken for granted that the development cost is trivial.
The assumption that building the app is "trivial" usually comes down to a few points:
- Marketing: HarmonyOS already looks very mature.
- Hands-on UI results: developers can already build dazzling pages and features on phones.
- Status reporting: "anyone who can write H5 front-end code can immediately develop HarmonyOS apps."
Core points for wearable development
- On developer.harmonyos.com, only the 3.0 documentation is available to read.
- Both smart wearable (智能穿戴) and lite wearable (轻穿戴) apps are developed in JS.
- Smart wearable corresponds to the HUAWEI WATCH 3; lite wearable corresponds to the HUAWEI WATCH GT 2 Pro and HUAWEI WATCH GT 3.
- For any API tied to your product's features, be sure to verify it on a real device.
- Read the documentation on JS calling Java carefully, because the JS API surface on wearable devices is comparatively small.
- For a JS app, the first page shown at launch is determined by the first entry under module -> js -> pages in config.json, e.g. the file highlighted with a red box in the screenshot.
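The launch-page rule above can be seen in a minimal config.json sketch; the page paths here are illustrative, not from the original project:

```json
{
  "module": {
    "js": [
      {
        "name": "default",
        "pages": [
          "pages/testrecordaudiopage/testrecordaudiopage",
          "pages/otherpage/otherpage"
        ]
      }
    ]
  }
}
```

With this configuration, pages/testrecordaudiopage/testrecordaudiopage is the first page shown at launch; reordering the array changes the entry page.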
Key point: JS calling Java
Scenario
Audio recording; see docs.qq.com/doc/DUmN4VVhBd3NxdExK for the full notes.
Screenshots
For a quick demonstration, the screenshots come from the IDE.
From here on, it is assumed that you have read the 3.0 developer documentation and already know the directory structure and development languages.
The JS code consists of three parts: 1. the hml page, 2. the JS logic, 3. the CSS page styles.
Layout
testrecordaudiopage.js
```javascript
import router from '@system.router';
import featureAbility from '@ohos.ability.featureAbility';
import serviceBridge from '../../generated/ServiceBridge.js';

export default {
    data: {
        // Button labels (the Chinese originals were 录音 "Record" and 播放 "Play").
        button_record: 'Record',
        button_play: 'Play',
        show_button_record: true,
        show_button_play: true,
    },
    swipeEvent(e) {
        if (e.direction == "right") {
            router.back(); // assumed: swipe right leaves the page
        }
    },
    async audioRecorderDemo(type) {
        const vm = this;
        this.record = new serviceBridge();
        if (type === 'recordaudio') {
            if (this.button_record === 'Record') {
                this.record.startRecord().then(value => {
                    if (value.abilityResult == 3) { // 3: recording started
                        vm.show_button_play = false;
                        vm.button_record = 'Stop';
                    }
                });
            } else {
                this.record.stopRecord().then(value => {
                    if (value.abilityResult == 1) { // 1: recording stopped
                        vm.show_button_play = true;
                        vm.button_record = 'Record';
                    }
                });
            }
        } else if (type === 'playaudio') {
            if (this.button_play === 'Play') {
                this.record.play().then(value => {
                    if (value.abilityResult == 3) { // 3: playback started
                        vm.show_button_record = false;
                        vm.button_play = 'Stop';
                        // Poll until playback finishes, then restore the UI.
                        var playTimeStatus = setInterval(() => {
                            this.record.isPlaying().then(value => {
                                if (!value.abilityResult) {
                                    vm.show_button_record = true;
                                    vm.button_play = 'Play';
                                    clearInterval(playTimeStatus);
                                }
                            });
                        }, 500); // polling interval assumed
                    }
                });
            } else {
                this.record.stopPlay().then(value => {
                    if (value.abilityResult == 1) { // 1: playback stopped
                        vm.show_button_record = true;
                        vm.button_play = 'Play';
                    }
                });
            }
        }
    }
}
```
testrecordaudiopage.hml
```html
<div class="container" onswipe="swipeEvent">
    <div class="audiobutton">
        <button class="buttons_record" show="{{show_button_record}}" onclick="audioRecorderDemo('recordaudio')">{{button_record}}</button>
        <button class="buttons_play" show="{{show_button_play}}" onclick="audioRecorderDemo('playaudio')">{{button_play}}</button>
    </div>
</div>
```
testrecordaudiopage.css
```css
/* Only the two button colors were given in the original; the selectors are shown for context. */
.buttons_record {
    background-color: #1F71FF;
}
.buttons_play {
    background-color: #1F71FF;
}
```
Features implemented with the Java API
js/generated/ServiceBridge.js — note that this file is auto-generated:
```javascript
// This file is automatically generated. Do not modify it!
const ABILITY_TYPE_EXTERNAL = 0;
const ABILITY_TYPE_INTERNAL = 1;
const ACTION_SYNC = 0;
const BUNDLE_NAME = "com.harvey.hw.wear";
const ABILITY_NAME = "com.harvey.hw.wear.ServiceBridgeStub";
const OPCODE_startRecord = 11;
const OPCODE_stopRecord = 12;
const OPCODE_stopPlay = 13;
const OPCODE_isPlaying = 14;
const OPCODE_play = 15;

const sendRequest = async (opcode, data) => {
    const action = {};
    action.bundleName = BUNDLE_NAME;
    action.abilityName = ABILITY_NAME;
    action.messageCode = opcode;
    action.data = data;
    action.abilityType = ABILITY_TYPE_INTERNAL;
    action.syncOption = ACTION_SYNC;
    // FeatureAbility is a global provided by the JS UI framework.
    return FeatureAbility.callAbility(action);
};

class ServiceBridge {
    async startRecord() {
        if (arguments.length != 0) {
            throw new Error("Method expected 0 arguments, got " + arguments.length);
        }
        const result = await sendRequest(OPCODE_startRecord, {});
        return JSON.parse(result);
    }
    async stopRecord() {
        if (arguments.length != 0) {
            throw new Error("Method expected 0 arguments, got " + arguments.length);
        }
        const result = await sendRequest(OPCODE_stopRecord, {});
        return JSON.parse(result);
    }
    async stopPlay() {
        if (arguments.length != 0) {
            throw new Error("Method expected 0 arguments, got " + arguments.length);
        }
        const result = await sendRequest(OPCODE_stopPlay, {});
        return JSON.parse(result);
    }
    async isPlaying() {
        if (arguments.length != 0) {
            throw new Error("Method expected 0 arguments, got " + arguments.length);
        }
        const result = await sendRequest(OPCODE_isPlaying, {});
        return JSON.parse(result);
    }
    async play() {
        if (arguments.length != 0) {
            throw new Error("Method expected 0 arguments, got " + arguments.length);
        }
        const result = await sendRequest(OPCODE_play, {});
        return JSON.parse(result);
    }
}

export default ServiceBridge;
```
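Off-device you can still sanity-check the round trip the generated bridge performs. The sketch below mocks FeatureAbility.callAbility (which only exists on a device) and shows the action object the bridge builds and the JSON string the stub answers with; the mocked reply values are assumptions for illustration:

```javascript
const ABILITY_TYPE_INTERNAL = 1;
const OPCODE_startRecord = 11;

// Mock of the device-side dispatch: the stub replies with a JSON string,
// just as ServiceBridgeStub writes ZSONObject.toZSONString(result).
const FeatureAbility = {
    async callAbility(action) {
        if (action.messageCode === OPCODE_startRecord) {
            return JSON.stringify({ code: 0, abilityResult: 3 }); // assumed reply
        }
        return JSON.stringify({ code: -1 });
    }
};

// Same shape of request the generated sendRequest builds.
const sendRequest = async (opcode, data) => {
    const action = {
        bundleName: "com.harvey.hw.wear",                     // BUNDLE_NAME
        abilityName: "com.harvey.hw.wear.ServiceBridgeStub",  // ABILITY_NAME
        messageCode: opcode,
        data: data,
        abilityType: ABILITY_TYPE_INTERNAL,
        syncOption: 0                                          // ACTION_SYNC
    };
    return FeatureAbility.callAbility(action);
};

async function demo() {
    const result = JSON.parse(await sendRequest(OPCODE_startRecord, {}));
    return result.abilityResult; // 3 means "recording started" in this sketch
}
```

On a real watch, the only difference is that FeatureAbility.callAbility crosses into the Java side instead of returning a canned string.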
Since that file is auto-generated, let's look at the project configuration.
First, under <project root>/entry/src/main/java, in your package (this article uses com.harvey.hw.wear), create a file named ServiceBridge.java.
Next, configure initialization and the annotation that declares the generated ServiceBridge.js. The registration target is MainAbility because the initialization code lives in MainAbility.java:
```java
package com.harvey.hw.wear;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Arrays;

import com.harvey.hw.wear.bluetooth.BLEMain;
import ohos.annotation.f2pautogen.ContextInject;
import ohos.annotation.f2pautogen.InternalAbility;
import ohos.app.AbilityContext;
import ohos.bundle.IBundleManager;
import ohos.dcall.DistributedCallManager;
import ohos.hiviewdfx.HiLog;
import ohos.hiviewdfx.HiLogLabel;
import ohos.media.audio.*;

// Reconstructed from the original fragments. The integer return codes
// (3 = started, 1 = stopped) match the checks in testrecordaudiopage.js.
@InternalAbility(registerTo = "com.harvey.hw.wear.MainAbility")
public class ServiceBridge {
    private static final HiLogLabel LABEL_LOG = new HiLogLabel(3, 0xD001100, "ServiceBridge");

    private static final AudioStreamInfo.EncodingFormat ENCODING_FORMAT = AudioStreamInfo.EncodingFormat.ENCODING_PCM_16BIT;
    private static final AudioStreamInfo.ChannelMask CHANNEL_IN_MASK = AudioStreamInfo.ChannelMask.CHANNEL_IN_STEREO;
    private static final AudioStreamInfo.ChannelMask CHANNEL_OUT_MASK = AudioStreamInfo.ChannelMask.CHANNEL_OUT_STEREO;
    private static final int SAMPLE_RATE = 16000;
    private static final int BUFFER_SIZE = 1024;
    private static boolean isRecording = false;
    private static boolean isPlaying = false;

    private AudioCapturer audioCapturer;
    private AudioRenderer renderer;

    // Filled in by ServiceBridgeStub via reflection (see onRegister there).
    @ContextInject
    AbilityContext abilityContext;

    public int startRecord() {
        if (abilityContext.verifySelfPermission("ohos.permission.MICROPHONE") == IBundleManager.PERMISSION_DENIED) {
            HiLog.error(LABEL_LOG, "RecordServiceAbility::onStart");
            requestPermissions();
            return 0;
        }
        initCapturer();
        if (audioCapturer.start()) {
            isRecording = true;
            runRecord();
            return 3; // recording started
        }
        return 0;
    }

    public int stopRecord() {
        if (isRecording && audioCapturer.stop()) {
            isRecording = false;
            return 1; // recording stopped
        }
        return 0;
    }

    public int stopPlay() {
        if (isPlaying && renderer.stop()) {
            isPlaying = false;
            return 1; // playback stopped
        }
        return 0;
    }

    public boolean isPlaying() {
        return isPlaying;
    }

    public int play() {
        String path = "/data/data/" + abilityContext.getBundleName() + "/files/record.pcm";
        File pcmFilePath = new File(path);
        if (!pcmFilePath.isFile() || !pcmFilePath.exists()) {
            return 0;
        }
        isPlaying = true;
        new Thread(new Runnable() {
            @Override
            public void run() {
                AudioStreamInfo audioStreamInfo = new AudioStreamInfo.Builder().sampleRate(SAMPLE_RATE)
                        .encodingFormat(ENCODING_FORMAT)
                        .channelMask(CHANNEL_OUT_MASK)
                        .streamUsage(AudioStreamInfo.StreamUsage.STREAM_USAGE_MEDIA)
                        .build();
                AudioRendererInfo audioRendererInfo = new AudioRendererInfo.Builder().audioStreamInfo(audioStreamInfo)
                        .audioStreamOutputFlag(AudioRendererInfo.AudioStreamOutputFlag.AUDIO_STREAM_OUTPUT_FLAG_DIRECT_PCM)
                        .sessionID(AudioRendererInfo.SESSION_ID_UNSPECIFIED)
                        .bufferSizeInBytes(BUFFER_SIZE)
                        .build();
                renderer = new AudioRenderer(audioRendererInfo, AudioRenderer.PlayMode.MODE_STREAM);

                // Register an interrupt listener so a higher-priority stream can pause us.
                AudioInterrupt audioInterrupt = new AudioInterrupt();
                AudioManager audioManager = new AudioManager();
                audioInterrupt.setStreamInfo(audioStreamInfo);
                audioInterrupt.setInterruptListener(new AudioInterrupt.InterruptListener() {
                    @Override
                    public void onInterrupt(int type, int hint) {
                        if (type == AudioInterrupt.INTERRUPT_TYPE_BEGIN
                                && hint == AudioInterrupt.INTERRUPT_HINT_PAUSE) {
                            // interrupted by another stream: pause if needed
                        } else if (type == AudioInterrupt.INTERRUPT_TYPE_BEGIN
                                && hint == AudioInterrupt.INTERRUPT_HINT_NONE) {
                            // interrupt began without a concrete hint
                        } else if (type == AudioInterrupt.INTERRUPT_TYPE_END && (
                                hint == AudioInterrupt.INTERRUPT_HINT_NONE
                                        || hint == AudioInterrupt.INTERRUPT_HINT_RESUME)) {
                            // safe to resume
                        } else {
                            HiLog.error(LABEL_LOG, "unexpected type or hint");
                        }
                    }
                });
                audioManager.activateAudioInterrupt(audioInterrupt);

                // Route output to the speaker (an output device).
                AudioDeviceDescriptor[] devices = AudioManager.getDevices(AudioDeviceDescriptor.DeviceFlag.OUTPUT_DEVICES_FLAG);
                for (AudioDeviceDescriptor des : devices) {
                    if (des.getType() == AudioDeviceDescriptor.DeviceType.SPEAKER) {
                        renderer.setOutputDevice(des);
                    }
                }
                renderer.setVolume(1.0f);
                renderer.start();

                try (BufferedInputStream bis = new BufferedInputStream(new FileInputStream(pcmFilePath))) {
                    int minBufferSize = renderer.getMinBufferSize(SAMPLE_RATE, ENCODING_FORMAT, CHANNEL_OUT_MASK);
                    byte[] buffers = new byte[minBufferSize];
                    while ((bis.read(buffers)) != -1) {
                        renderer.write(buffers, 0, buffers.length);
                    }
                } catch (IOException e) {
                    HiLog.error(LABEL_LOG, "play exception," + e.getMessage());
                } finally {
                    isPlaying = false;
                }
            }
        }).start();
        return 3; // playback started
    }

    private void initCapturer() {
        // Pick the on-device microphone as the input device.
        AudioDeviceDescriptor[] devices = AudioManager.getDevices(AudioDeviceDescriptor.DeviceFlag.INPUT_DEVICES_FLAG);
        AudioDeviceDescriptor currentAudioType = null;
        for (AudioDeviceDescriptor des : devices) {
            if (des.getType() == AudioDeviceDescriptor.DeviceType.MIC) {
                currentAudioType = des;
            }
        }
        AudioCapturerInfo.AudioInputSource source = AudioCapturerInfo.AudioInputSource.AUDIO_INPUT_SOURCE_MIC;
        AudioStreamInfo audioStreamInfo = new AudioStreamInfo.Builder().audioStreamFlag(
                        AudioStreamInfo.AudioStreamFlag.AUDIO_STREAM_FLAG_AUDIBILITY_ENFORCED)
                .sampleRate(SAMPLE_RATE)
                .encodingFormat(ENCODING_FORMAT)
                .channelMask(CHANNEL_IN_MASK)
                .streamUsage(AudioStreamInfo.StreamUsage.STREAM_USAGE_MEDIA)
                .build();
        AudioCapturerInfo audioCapturerInfo = new AudioCapturerInfo.Builder().audioStreamInfo(audioStreamInfo)
                .audioInputSource(source)
                .build();
        audioCapturer = new AudioCapturer(audioCapturerInfo, currentAudioType);
    }

    private void runRecord() {
        new Thread(new Runnable() {
            @Override
            public void run() {
                File file = new File("/data/data/" + abilityContext.getBundleName() + "/files/record.pcm");
                try (FileOutputStream outputStream = new FileOutputStream(file)) {
                    byte[] bytes = new byte[BUFFER_SIZE];
                    while (audioCapturer.read(bytes, 0, bytes.length) != -1) {
                        outputStream.write(bytes);
                        bytes = new byte[BUFFER_SIZE];
                    }
                } catch (IOException exception) {
                    HiLog.error(LABEL_LOG, "record exception," + exception.getMessage());
                }
            }
        }).start();
    }

    private void requestPermissions() {
        String[] permissions = {
                "ohos.permission.MICROPHONE"
        };
        abilityContext.requestPermissionsFromUser(Arrays.stream(permissions)
                .filter(permission -> abilityContext.verifySelfPermission(permission) != IBundleManager.PERMISSION_GRANTED)
                .toArray(String[]::new), 0);
    }
}
```
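A quick back-of-the-envelope check on the capture settings above (16 kHz, 16-bit, stereo) shows what the 1024-byte buffer means in wall-clock terms:

```javascript
// Data rate of the raw PCM stream produced by the recorder settings above.
const SAMPLE_RATE = 16000;   // Hz, as in ServiceBridge.java
const BYTES_PER_SAMPLE = 2;  // ENCODING_PCM_16BIT
const CHANNELS = 2;          // CHANNEL_IN_STEREO

const bytesPerSecond = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS; // 64000 B/s
const msPerBuffer = 1024 / bytesPerSecond * 1000;                 // each 1024-byte read covers 16 ms of audio
const mibPerMinute = bytesPerSecond * 60 / (1024 * 1024);         // ~3.66 MiB of PCM per minute
```

So a one-minute recording occupies roughly 3.7 MiB of uncompressed PCM on the watch, which is worth keeping in mind given wearable storage limits.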
Third, how does ServiceBridge.java get initialized? Create an intermediate file named ServiceBridgeStub.java and register it from MainAbility:
```java
public class MainAbility extends AceAbility {
    @Override
    public void onStart(Intent intent) {
        super.onStart(intent);
        ServiceBridgeStub.register(this);
    }
}
```
ServiceBridgeStub.java
```java
package com.harvey.hw.wear;

import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

import ohos.ace.ability.AceInternalAbility;
import ohos.app.AbilityContext;
import ohos.rpc.IRemoteObject;
import ohos.rpc.MessageOption;
import ohos.rpc.MessageParcel;
import ohos.rpc.RemoteException;
import ohos.utils.zson.ZSONObject;

public class ServiceBridgeStub extends AceInternalAbility {
    public static final String BUNDLE_NAME = "com.harvey.hw.wear";
    public static final String ABILITY_NAME = "com.harvey.hw.wear.ServiceBridgeStub";
    public static final int ERROR = -1;
    public static final int SUCCESS = 0;
    public static final int OPCODE_startRecord = 11;
    public static final int OPCODE_stopRecord = 12;
    public static final int OPCODE_stopPlay = 13;
    public static final int OPCODE_isPlaying = 14;
    public static final int OPCODE_play = 15;

    private static ServiceBridgeStub instance;
    private ServiceBridge service;
    private AbilityContext abilityContext;

    public ServiceBridgeStub() {
        super(BUNDLE_NAME, ABILITY_NAME);
    }

    public boolean onRemoteRequest(int code, MessageParcel data, MessageParcel reply, MessageOption option) {
        Map<String, Object> result = new HashMap<String, Object>();
        switch (code) {
            case OPCODE_startRecord: {
                java.lang.String zsonStr = data.readString();
                ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr); // request payload (unused here)
                result.put("code", SUCCESS);
                result.put("abilityResult", service.startRecord());
                break;
            }
            case OPCODE_stopRecord: {
                java.lang.String zsonStr = data.readString();
                ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
                result.put("code", SUCCESS);
                result.put("abilityResult", service.stopRecord());
                break;
            }
            case OPCODE_stopPlay: {
                java.lang.String zsonStr = data.readString();
                ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
                result.put("code", SUCCESS);
                result.put("abilityResult", service.stopPlay());
                break;
            }
            case OPCODE_isPlaying: {
                java.lang.String zsonStr = data.readString();
                ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
                result.put("code", SUCCESS);
                result.put("abilityResult", service.isPlaying());
                break;
            }
            case OPCODE_play: {
                java.lang.String zsonStr = data.readString();
                ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
                result.put("code", SUCCESS);
                result.put("abilityResult", service.play());
                break;
            }
            default: {
                reply.writeString("Opcode is not defined!");
                return false;
            }
        }
        return sendResult(reply, result, option.getFlags() == MessageOption.TF_SYNC);
    }

    private boolean sendResult(MessageParcel reply, Map<String, Object> result, boolean isSync) {
        if (isSync) {
            reply.writeString(ZSONObject.toZSONString(result));
        } else {
            // Async call: the reply parcel carries the remote object the result is sent back to.
            MessageParcel response = MessageParcel.obtain();
            response.writeString(ZSONObject.toZSONString(result));
            try {
                IRemoteObject remoteReply = reply.readRemoteObject();
                remoteReply.sendRequest(0, response, MessageParcel.obtain(), new MessageOption());
            } catch (RemoteException exception) {
                return false;
            }
        }
        return true;
    }

    public static void register(AbilityContext abilityContext) {
        instance = new ServiceBridgeStub();
        instance.onRegister(abilityContext);
    }

    private void onRegister(AbilityContext abilityContext) {
        this.abilityContext = abilityContext;
        this.service = new ServiceBridge();
        this.setInternalAbilityHandler(this::onRemoteRequest);
        try {
            // Inject the AbilityContext into the package-private field of ServiceBridge.
            Field field = ServiceBridge.class.getDeclaredField("abilityContext");
            field.setAccessible(true);
            field.set(this.service, abilityContext);
            field.setAccessible(false);
        } catch (NoSuchFieldException | IllegalAccessException e) {
            ohos.hiviewdfx.HiLog.error(new ohos.hiviewdfx.HiLogLabel(0, 0, null), "context injection fail.");
        }
    }

    public static void deregister() {
        instance.onDeregister();
    }

    private void onDeregister() {
        this.setInternalAbilityHandler(null);
    }
}
```
Finally, configure the build so that the JS bridge code is generated automatically.
Add the following to the build.gradle file of the entry module:
```groovy
apply plugin: 'com.huawei.ohos.hap'
apply plugin: 'com.huawei.ohos.decctest'

// Directory the generated JS bridge (ServiceBridge.js) is written to.
def jsOutputDir = project.file("src/main/js/default/generated").toString()

// The original showed only these fragments; they belong inside the enclosing
// blocks that DevEco Studio generates for the entry module.
annotationProcessorOptions {
    arguments = [
            "jsOutputDir": jsOutputDir]
}

// JS template code generation settings.
f2pautogenEnabled true // switch that enables the js2java-codegen tool
```
That covers the key points of wearable app development.
If you are still interested, try BLE data transfer as a follow-up exercise.
Wrap-up
This is the project structure of a wearable demo app built last year.
Note
After reading a lot of HarmonyOS documentation, it is easy to mix up the development languages and terminology used across docs; attached is a simple chart of the changes from HarmonyOS 3.0 to 3.1.