
HarmonyOS Development in Practice: Wearable Apps


Background

This write-up on HarmonyOS wearable app development is simply meant to give front-line developers a little help.

In some companies, business needs require developers to build specific features on Huawei wearables, and management takes it for granted that the development cost is trivial, just something done in passing.

The assumption that building the app is easy usually comes down to a few points:

Marketing - HarmonyOS already looks very mature
Hands-on UI results - developers can already build flashy pages and features on the phone
Reporting upward - anyone who can write H5 front-end code can immediately develop HarmonyOS apps

Core points for wearable development

Of the documentation on developer.harmonyos.com, only the 3.0 version applies here
Smart wearable and lite wearable apps are both developed in JS
Smart wearable corresponds to the HUAWEI WATCH 3
Lite wearable corresponds to the HUAWEI WATCH GT 2 Pro and HUAWEI WATCH GT 3
Any API your product features depend on must be verified on a real device
Be sure to read the documentation on calling Java from JS, because the JS APIs available on wearable devices are relatively limited
For an app developed in JS, the first page shown at launch is determined by the first entry under module -> js -> pages in config.json, i.e. the file marked with a red box in the original screenshot (not reproduced here; see the sketch after this list)
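A minimal config.json sketch of that pages entry, assuming the page name used later in this article for the first slot and a purely illustrative second entry:

{
  "module": {
    "js": [
      {
        "name": "default",
        "pages": [
          "pages/testrecordaudiopage/testrecordaudiopage",
          "pages/index/index"
        ]
      }
    ]
  }
}

Whichever path sits first in the pages array, pages/testrecordaudiopage/testrecordaudiopage here, becomes the launch page.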

Key point: calling Java from JS

Scenario

Audio recording. Reference document: docs.qq.com/doc/DUmN4VVhBd3NxdExK

Screenshots

To demonstrate the result quickly, the screenshots were taken from the IDE (not reproduced here).

Code walkthrough

From here on, it is assumed that you have read the 3.0 development documentation and are already familiar with the directory structure and development languages.

The JS code consists of three parts: 1. the HML page, 2. the JS logic, 3. the CSS page styles.

Layout

testrecordaudiopage.js

import router from '@system.router';
import featureAbility from '@ohos.ability.featureAbility';
import ServiceBridge from '../../generated/ServiceBridge.js';
var vm = null
export default {
data: {
title: "",
button_record: "录音",
show_button_record: true,
button_play: "播放",
show_button_play: true,
record: null
},
onInit() {
this.title = "JS 录音";
vm = this
},
onHide() {
if(this.record){
this.record.stopPlay()
this.record.stopRecord()
}
},
swipeEvent(e) {
if (e.direction == "right") {
router.back()
}
},
async audioRecorderDemo(type) {
this.record = new ServiceBridge()
if (type === 'recordaudio') {
if (this.button_record === '录音') {
this.record.startRecord().then(value => {
if(value.abilityResult == 3){
vm.button_record = '停止录音'
vm.show_button_play = false
}
});
} else {
this.record.stopRecord().then(value => {
if(value.abilityResult == 1){
vm.button_record = '录音'
vm.show_button_play = true
}
});
}
} else if (type === 'playaudio') {
if (this.button_play === '播放') {
this.record.play().then(value => {
if(value.abilityResult == 3){
vm.button_play = '停止播放'
vm.show_button_record = false
var playTimeStatus = setInterval(()=>{
this.record.isPlaying().then(value => {
if(!value.abilityResult){
vm.button_play = '播放'
vm.show_button_record = true
clearInterval(playTimeStatus)
}
})
}, 1000)
}
})
} else {
this.record.stopPlay().then(value => {
if(value.abilityResult == 1){
vm.button_play = '播放'
vm.show_button_record = true
}
})
}
}
}
}

testrecordaudiopage.hml

<div class="container" onswipe="swipeEvent">
<text class="title">
{{ title }}
</text>
<div class="audiobutton">
<button class="buttons_record" show="{{show_button_record}}" onclick="audioRecorderDemo('recordaudio')">{{button_record}}</button>
<button class="buttons_play" show="{{show_button_play}}" onclick="audioRecorderDemo('playaudio')">{{button_play}}</button>
</div>
</div>

testrecordaudiopage.css

.container {
width: 100%;
flex-direction: column;
background-color: black;
}
.title {
font-size: 25fp;
text-align: center;
width: 100%;
margin: 20px;
}
.audiobutton {
width: 100%;
display: flex;
flex-direction: column;
align-items: center;
}
.buttons_record {
width: 45%;
height: 15%;
font-size: 20fp;
text-color: white;
background-color: #1F71FF;
}
.buttons_play {
width: 45%;
height: 15%;
font-size: 20fp;
margin-top: 10vp;
text-color: white;
background-color: #1F71FF;
}
Functionality implemented by the Java API

js/default/generated/ServiceBridge.js. Note that this file is generated automatically.

// This file is automatically generated. Do not modify it!
const ABILITY_TYPE_EXTERNAL = 0;
const ABILITY_TYPE_INTERNAL = 1;
const ACTION_SYNC = 0;
const ACTION_ASYNC = 1;
const BUNDLE_NAME = "com.harvey.hw.wear";
const ABILITY_NAME = "com.harvey.hw.wear.ServiceBridgeStub";
......
const OPCODE_startRecord = 11;
const OPCODE_stopRecord = 12;
const OPCODE_stopPlay = 13;
const OPCODE_isPlaying = 14;
const OPCODE_play = 15;
const sendRequest = async (opcode, data) => {
var action = {};
action.bundleName = BUNDLE_NAME;
action.abilityName = ABILITY_NAME;
action.messageCode = opcode;
action.data = data;
action.abilityType = ABILITY_TYPE_INTERNAL;
action.syncOption = ACTION_SYNC;
return FeatureAbility.callAbility(action);
}
class ServiceBridge {
......
async startRecord() {
if (arguments.length != 0) {
throw new Error("Method expected 0 arguments, got " + arguments.length);
}
let data = {};
const result = await sendRequest(OPCODE_startRecord, data);
return JSON.parse(result);
}
async stopRecord() {
if (arguments.length != 0) {
throw new Error("Method expected 0 arguments, got " + arguments.length);
}
let data = {};
const result = await sendRequest(OPCODE_stopRecord, data);
return JSON.parse(result);
}
async stopPlay() {
if (arguments.length != 0) {
throw new Error("Method expected 0 arguments, got " + arguments.length);
}
let data = {};
const result = await sendRequest(OPCODE_stopPlay, data);
return JSON.parse(result);
}
async isPlaying() {
if (arguments.length != 0) {
throw new Error("Method expected 0 arguments, got " + arguments.length);
}
let data = {};
const result = await sendRequest(OPCODE_isPlaying, data);
return JSON.parse(result);
}
async play() {
if (arguments.length != 0) {
throw new Error("Method expected 0 arguments, got " + arguments.length);
}
let data = {};
const result = await sendRequest(OPCODE_play, data);
return JSON.parse(result);
}
}
export default ServiceBridge;

Since this file is generated automatically, let's look at the project configuration that produces it.

First, create a file named ServiceBridge.java inside your package (this article uses com.harvey.hw.wear as the example) under 工程根目录/entry/src/main/java, i.e. the project root's entry/src/main/java folder.

Second, configure the annotation that declares the ServiceBridge.js file and its initialization. The registration target is MainAbility because the initialization code lives in MainAbility.java:

package com.harvey.hw.wear;
import com.harvey.hw.wear.bluetooth.BLEMain;
import ohos.annotation.f2pautogen.ContextInject;
import ohos.annotation.f2pautogen.InternalAbility;
import ohos.app.AbilityContext;
import ohos.bundle.IBundleManager;
import ohos.dcall.DistributedCallManager;
import ohos.hiviewdfx.HiLog;
import ohos.hiviewdfx.HiLogLabel;
import ohos.media.audio.*;
import java.io.*;
import java.util.Arrays;
@InternalAbility(registerTo = "com.harvey.hw.wear.MainAbility")
public class ServiceBridge {
private static final HiLogLabel LABEL_LOG = new HiLogLabel(3, 0xD001100, "ServiceBridge");
@ContextInject
AbilityContext abilityContext;
......
// Example: audio recording
/**
* Recording - start
* @return 1 if already recording, 2 if the microphone permission is missing, 3 if recording has started
*/
public int startRecord() {
if(isRecording){
return 1;
}
if(abilityContext.verifySelfPermission("ohos.permission.MICROPHONE") == IBundleManager.PERMISSION_DENIED){
requestPermissions();
return 2;
}
HiLog.info(LABEL_LOG, "ServiceBridge::startRecord");
init();
runRecord();
return 3;
}
/**
* Recording - stop
* @return 1 once recording has been stopped
*/
public int stopRecord() {
if (isRecording && audioCapturer.stop()) {
audioCapturer.release();
}
isRecording = false;
return 1;
}
private AudioRenderer renderer;
private static boolean isPlaying = false;
/**
* Playback - stop
* @return 1 once playback has been stopped
*/
public int stopPlay() {
if(isPlaying && renderer.stop()){
renderer.release();
}
isPlaying = false;
return 1;
}
/**
* Get the audio playback status
* @return whether audio is currently playing
*/
public boolean isPlaying(){
return isPlaying;
}
/**
* Playback - start
* @return 1 if already playing, 2 if the recorded file does not exist, 3 if playback has started
*/
public int play() {
if(isPlaying){
return 1;
}
isPlaying = true;
String path = "/data/data/" + abilityContext.getBundleName() + "/files/record.pcm";
File pcmFilePath = new File(path);
if(!pcmFilePath.isFile() || !pcmFilePath.exists()){
isPlaying = false;
return 2;
}
new Thread(new Runnable() {
@Override
public void run() {
AudioStreamInfo audioStreamInfo = new AudioStreamInfo.Builder().sampleRate(SAMPLE_RATE)
.encodingFormat(ENCODING_FORMAT)
.channelMask(CHANNEL_OUT_MASK)
.streamUsage(AudioStreamInfo.StreamUsage.STREAM_USAGE_MEDIA)
.build();
AudioRendererInfo audioRendererInfo = new AudioRendererInfo.Builder().audioStreamInfo(audioStreamInfo)
.audioStreamOutputFlag(AudioRendererInfo.AudioStreamOutputFlag.AUDIO_STREAM_OUTPUT_FLAG_DIRECT_PCM)
.sessionID(AudioRendererInfo.SESSION_ID_UNSPECIFIED)
.bufferSizeInBytes(BUFFER_SIZE)
.isOffload(false)
.build();
renderer = new AudioRenderer(audioRendererInfo, AudioRenderer.PlayMode.MODE_STREAM);
AudioInterrupt audioInterrupt = new AudioInterrupt();
AudioManager audioManager = new AudioManager();
audioInterrupt.setStreamInfo(audioStreamInfo);
audioInterrupt.setInterruptListener(new AudioInterrupt.InterruptListener() {
@Override
public void onInterrupt(int type, int hint) {
if (type == AudioInterrupt.INTERRUPT_TYPE_BEGIN
&& hint == AudioInterrupt.INTERRUPT_HINT_PAUSE) {
renderer.pause();
} else if (type == AudioInterrupt.INTERRUPT_TYPE_BEGIN
&& hint == AudioInterrupt.INTERRUPT_HINT_NONE) {
} else if (type == AudioInterrupt.INTERRUPT_TYPE_END && (
hint == AudioInterrupt.INTERRUPT_HINT_NONE
|| hint == AudioInterrupt.INTERRUPT_HINT_RESUME)) {
renderer.start();
} else {
HiLog.error(LABEL_LOG, "unexpected type or hint");
}
}
});
audioManager.activateAudioInterrupt(audioInterrupt);
// Query output devices when looking for the speaker
AudioDeviceDescriptor[] devices = AudioManager.getDevices(AudioDeviceDescriptor.DeviceFlag.OUTPUT_DEVICES_FLAG);
for (AudioDeviceDescriptor des : devices) {
if (des.getType() == AudioDeviceDescriptor.DeviceType.SPEAKER) {
renderer.setOutputDevice(des);
break;
}
}
renderer.setVolume(1.0f);
renderer.start();
BufferedInputStream bis1 = null;
try {
bis1 = new BufferedInputStream(new FileInputStream(pcmFilePath));
int minBufferSize = renderer.getMinBufferSize(SAMPLE_RATE, ENCODING_FORMAT,
CHANNEL_OUT_MASK);
byte[] buffers = new byte[minBufferSize];
while ((bis1.read(buffers)) != -1) {
if(isPlaying){
renderer.write(buffers, 0, buffers.length);
renderer.flush();
}
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (bis1 != null) {
try {
bis1.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
stopPlay();
}
}).start();
return 3;
}
private AudioCapturer audioCapturer;
private static final AudioStreamInfo.EncodingFormat ENCODING_FORMAT = AudioStreamInfo.EncodingFormat.ENCODING_PCM_16BIT;
private static final AudioStreamInfo.ChannelMask CHANNEL_IN_MASK = AudioStreamInfo.ChannelMask.CHANNEL_IN_STEREO;
private static final AudioStreamInfo.ChannelMask CHANNEL_OUT_MASK = AudioStreamInfo.ChannelMask.CHANNEL_OUT_STEREO;
private static final int SAMPLE_RATE = 16000;
private static final int BUFFER_SIZE = 1024;
private static boolean isRecording = false;
private void init() {
AudioDeviceDescriptor[] devices = AudioManager.getDevices(AudioDeviceDescriptor.DeviceFlag.INPUT_DEVICES_FLAG);
AudioDeviceDescriptor currentAudioType = null;
for (AudioDeviceDescriptor des : devices) {
if (des.getType() == AudioDeviceDescriptor.DeviceType.MIC) {
currentAudioType = des;
break;
}
}
AudioCapturerInfo.AudioInputSource source = AudioCapturerInfo.AudioInputSource.AUDIO_INPUT_SOURCE_MIC;
AudioStreamInfo audioStreamInfo = new AudioStreamInfo.Builder().audioStreamFlag(
AudioStreamInfo.AudioStreamFlag.AUDIO_STREAM_FLAG_AUDIBILITY_ENFORCED)
.encodingFormat(ENCODING_FORMAT)
.channelMask(CHANNEL_IN_MASK)
.streamUsage(AudioStreamInfo.StreamUsage.STREAM_USAGE_MEDIA)
.sampleRate(SAMPLE_RATE)
.build();
AudioCapturerInfo audioCapturerInfo = new AudioCapturerInfo.Builder().audioStreamInfo(audioStreamInfo)
.audioInputSource(source)
.build();
audioCapturer = new AudioCapturer(audioCapturerInfo, currentAudioType);
}
private void runRecord() {
isRecording = true;
new Thread(new Runnable() {
@Override
public void run() {
// Start capturing audio
audioCapturer.start();
File file = new File("/data/data/"+abilityContext.getBundleName()+"/files/record.pcm");
if (file.isFile()) {
file.delete();
}
try (FileOutputStream outputStream = new FileOutputStream(file)) {
byte[] bytes = new byte[BUFFER_SIZE];
while (audioCapturer.read(bytes, 0, bytes.length) != -1) {
outputStream.write(bytes);
bytes = new byte[BUFFER_SIZE];
outputStream.flush();
if(!isRecording){
outputStream.close();
break;
}
}
} catch (IOException exception) {
HiLog.error(LABEL_LOG, "record exception," + exception.getMessage());
}
}
}).start();
}
private void requestPermissions() {
String[] permissions = {
"ohos.permission.MICROPHONE"
};
abilityContext.requestPermissionsFromUser(Arrays.stream(permissions)
.filter(permission -> abilityContext.verifySelfPermission(permission) != IBundleManager.PERMISSION_GRANTED).toArray(String[]::new), 0);
}
}
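A step the code above relies on but does not show: verifySelfPermission and requestPermissionsFromUser only succeed if the microphone permission is also declared in the entry module's config.json. A minimal sketch of that declaration (everything except the permission name is an assumption about your project):

{
  "module": {
    "reqPermissions": [
      {
        "name": "ohos.permission.MICROPHONE"
      }
    ]
  }
}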

Third, how is ServiceBridge.java initialized? Create an intermediate file named ServiceBridgeStub.java and register it from MainAbility:

public class MainAbility extends AceAbility {
@Override
public void onStart(Intent intent) {
ServiceBridgeStub.register(this);
......
}
......
}
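ServiceBridgeStub (listed below) also exposes a deregister() method. A minimal sketch of what could be added to MainAbility to release the bridge when the ability stops, assuming the standard AceAbility lifecycle and that nothing else uses the bridge afterwards:

@Override
public void onStop() {
    // Sketch only: undo the registration performed in onStart().
    ServiceBridgeStub.deregister();
    super.onStop();
}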

ServiceBridgeStub.java

package com.harvey.hw.wear;
import java.lang.Object;
import java.lang.String;
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;
import ohos.ace.ability.AceInternalAbility;
import ohos.app.AbilityContext;
import ohos.rpc.IRemoteObject;
import ohos.rpc.MessageOption;
import ohos.rpc.MessageParcel;
import ohos.rpc.RemoteException;
import ohos.utils.zson.ZSONObject;
public class ServiceBridgeStub extends AceInternalAbility {
public static final String BUNDLE_NAME = "com.harvey.hw.wear";
public static final String ABILITY_NAME = "com.harvey.hw.wear.ServiceBridgeStub";
public static final int ERROR = -1;
public static final int SUCCESS = 0;
......
public static final int OPCODE_startRecord = 11;
public static final int OPCODE_stopRecord = 12;
public static final int OPCODE_stopPlay = 13;
public static final int OPCODE_isPlaying = 14;
public static final int OPCODE_play = 15;
private static ServiceBridgeStub instance;
private ServiceBridge service;
private AbilityContext abilityContext;
public ServiceBridgeStub() {
super(BUNDLE_NAME, ABILITY_NAME);
}
public boolean onRemoteRequest(int code, MessageParcel data, MessageParcel reply,
MessageOption option) {
Map<String, Object> result = new HashMap<String, Object>();
switch(code) {
......
case OPCODE_startRecord: {
java.lang.String zsonStr = data.readString();
ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
result.put("code", SUCCESS);
result.put("abilityResult", service.startRecord());
break;}
case OPCODE_stopRecord: {
java.lang.String zsonStr = data.readString();
ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
result.put("code", SUCCESS);
result.put("abilityResult", service.stopRecord());
break;}
case OPCODE_stopPlay: {
java.lang.String zsonStr = data.readString();
ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
result.put("code", SUCCESS);
result.put("abilityResult", service.stopPlay());
break;}
case OPCODE_isPlaying: {
java.lang.String zsonStr = data.readString();
ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
result.put("code", SUCCESS);
result.put("abilityResult", service.isPlaying());
break;}
case OPCODE_play: {
java.lang.String zsonStr = data.readString();
ZSONObject zsonObject = ZSONObject.stringToZSON(zsonStr);
result.put("code", SUCCESS);
result.put("abilityResult", service.play());
break;}
default: reply.writeString("Opcode is not defined!");
return false;
}
return sendResult(reply, result, option.getFlags() == MessageOption.TF_SYNC);
}
private boolean sendResult(MessageParcel reply, Map<String, Object> result, boolean isSync) {
if (isSync) {
reply.writeString(ZSONObject.toZSONString(result));
} else {
MessageParcel response = MessageParcel.obtain();
response.writeString(ZSONObject.toZSONString(result));
IRemoteObject remoteReply = reply.readRemoteObject();
try {
remoteReply.sendRequest(0, response, MessageParcel.obtain(), new MessageOption());
response.reclaim();
} catch (RemoteException exception) {
return false;
}
}
return true;
}
public static void register(AbilityContext abilityContext) {
instance = new ServiceBridgeStub();
instance.onRegister(abilityContext);
}
private void onRegister(AbilityContext abilityContext) {
this.abilityContext = abilityContext;
this.service = new ServiceBridge();
this.setInternalAbilityHandler(this::onRemoteRequest);
try {
Field field = ServiceBridge.class.getDeclaredField("abilityContext");
field.setAccessible(true);
field.set(this.service, abilityContext);
field.setAccessible(false);
} catch (NoSuchFieldException | IllegalAccessException e) {
ohos.hiviewdfx.HiLog.error(new ohos.hiviewdfx.HiLogLabel(0, 0, null), "context injection fail.");
}
}
public static void deregister() {
instance.onDeregister();
}
private void onDeregister() {
abilityContext = null;
this.setInternalAbilityHandler(null);
}
}

Finally, set up the build so that the JS bridge code is generated automatically.

Add the following to the build.gradle file of the entry module:

apply plugin: 'com.huawei.ohos.hap'
apply plugin: 'com.huawei.ohos.decctest'
ohos {
compileSdkVersion 6
defaultConfig {
compatibleSdkVersion 6
// Define the output path for the generated JS template code
def jsOutputDir = project.file("src/main/js/default/generated").toString()
// Set the JS template code generation path under ohos -> defaultConfig
javaCompileOptions {
annotationProcessorOptions {
arguments = ["jsOutputDir": jsOutputDir] // pass the output path to the code generator
}
}
}
......
compileOptions {
f2pautogenEnabled true // switch that enables the js2java-codegen tool
}
}
......
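Judging from the pattern in the generated ServiceBridge.js above, each public method of the annotated ServiceBridge class gets a matching async wrapper once the build runs with f2pautogenEnabled. As a purely hypothetical sketch (this method is not part of the article's project), adding the following to ServiceBridge.java should yield a corresponding async getSampleRate() wrapper in js/default/generated/ServiceBridge.js after a rebuild:

/**
 * Hypothetical example method, not part of the original project.
 * After a rebuild, the generator is expected to emit a matching
 * "async getSampleRate()" wrapper whose result.abilityResult holds this value.
 */
public int getSampleRate() {
    return SAMPLE_RATE;
}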

That completes this overview of the key points of wearable app development.

If you are still interested, you can try implementing BLE data transfer as a next step.

Wrap-up

Here is the project structure of a demo wearable app developed last year.
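The original screenshot is not reproduced here; below is a rough sketch of the layout reconstructed from the paths used in this article (file names outside those paths, such as app.js, are assumptions):

entry/src/main/
├── config.json
├── java/com/harvey/hw/wear/
│   ├── MainAbility.java
│   ├── ServiceBridge.java
│   └── ServiceBridgeStub.java
└── js/default/
    ├── app.js
    ├── generated/
    │   └── ServiceBridge.js
    └── pages/testrecordaudiopage/
        ├── testrecordaudiopage.hml
        ├── testrecordaudiopage.js
        └── testrecordaudiopage.css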

Notes

After reading a lot of HarmonyOS documentation, it is easy to get the descriptions of the development languages mixed up. The original article attached a simple chart covering HarmonyOS 3.0 through 3.1 (not reproduced here).

What core skills does a HarmonyOS development job require?

Many developers still do not know which HarmonyOS technologies to learn or which ones to focus on, and end up stepping into pitfalls and wasting a lot of time.

Having a practical set of HarmonyOS (Harmony NEXT) study materials is essential. Here I recommend a set of development notes compiled from the official HarmonyOS developer site and from material shared by Huawei engineers. It covers ArkTS, ArkUI, the Stage model, multi-device deployment, distributed application development, audio and video, WebGL, OpenHarmony multimedia, NAPI components, the OpenHarmony kernel, Harmony southbound development, hands-on HarmonyOS projects, and more.

Without further ado, let's take a look at this material.

If you are an Android, Java, or front-end developer who wants to move into HarmonyOS, these OpenHarmony study materials can support your learning. Below is the HarmonyOS learning roadmap.

Built around the HarmonyOS growth path, the materials include a HarmonyOS (OpenHarmony) study handbook (1,236 pages in total) and introductory OpenHarmony development video tutorials to help you take the next step.

The contents include:

HarmonyOS Development Basics

The ArkTS language, installing DevEco Studio, building your first ArkTS application, ArkUI declarative UI development, ...

HarmonyOS Development Advanced

Getting started with the Stage model, network management, data management, telephony services, distributed application development, notifications and window management, multimedia, security, task management, WebGL, internationalization, application testing, DFX, future-oriented design, HarmonyOS porting and tailoring, ...

HarmonyOS Development in Practice

ArkTS practice, UIAbility applications, networking cases, ...

Finally

HarmonyOS has unmatched opportunity and potential. Roughly 5,000 apps are expected to complete native HarmonyOS development by the end of the year, and all of those apps need developers, so demand for HarmonyOS engineers will grow explosively. Learning HarmonyOS is the way to go!
