It is best to download everything on a Linux system. Do not clone the repository with git under Windows: scripts checked out there get CRLF line endings, and running them fails with /bin/bash: $'\r': command not found.
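The `$'\r'` error means the script carries Windows CRLF line endings. A minimal sketch of the fix (doing the same job as `dos2unix` or vim's `:set ff=unix`):

```python
# Strip Windows CRLF line endings so /bin/bash stops choking on '\r'.
# Equivalent to `dos2unix script.sh` or vim's `:set ff=unix`.
def to_unix_line_endings(path):
    with open(path, "rb") as f:
        data = f.read()
    fixed = data.replace(b"\r\n", b"\n")
    with open(path, "wb") as f:
        f.write(fixed)
    return data != fixed  # True if anything was converted

# Usage: to_unix_line_endings("setup_android_sdk_and_ndk.sh")
```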
Clone the source from GitHub (https://github.com/google/mediapipe.git). I took the latest code, which should be v0.7.11:
git clone https://github.com/google/mediapipe.git
# For Bazel 3.4.1
mkdir $HOME/bazel-3.4.1
cd $HOME/bazel-3.4.1
wget https://github.com/bazelbuild/bazel/releases/download/3.4.1/bazel-3.4.1-dist.zip
sudo apt-get install build-essential openjdk-8-jdk python zip unzip
unzip bazel-3.4.1-dist.zip
env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh
sudo cp output/bazel /usr/local/bin/
There is also another way to install Bazel; I am not sure whether it makes any difference:
username@DESKTOP-TMVLBJ1:~$ sudo apt-get update && sudo apt-get install -y build-essential git python zip adb openjdk-8-jdk
username@DESKTOP-TMVLBJ1:~$ curl -sLO --retry 5 --retry-max-time 10 \
https://storage.googleapis.com/bazel/3.4.1/release/bazel-3.4.1-installer-linux-x86_64.sh && \
sudo mkdir -p /usr/local/bazel/3.4.1 && \
chmod 755 bazel-3.4.1-installer-linux-x86_64.sh && \
sudo ./bazel-3.4.1-installer-linux-x86_64.sh --prefix=/usr/local/bazel/3.4.1 && \
source /usr/local/bazel/3.4.1/lib/bazel/bin/bazel-complete.bash
username@DESKTOP-TMVLBJ1:~$ /usr/local/bazel/3.4.1/lib/bazel/bin/bazel version && \
alias bazel='/usr/local/bazel/3.4.1/lib/bazel/bin/bazel'
$ sudo apt-get install libopencv-core-dev libopencv-highgui-dev \
libopencv-calib3d-dev libopencv-features2d-dev \
libopencv-imgproc-dev libopencv-video-dev
About installing adb: the tutorials all mention that the Windows 10 adb and the WSL adb must be the same version. adb is the Android Debug Bridge, which lets developers install and debug apps. Looking at the official tutorial, though, it is only needed for installing the demo APK onto a phone after building the Android examples. Since we only build the library and then develop in a fresh Android Studio project, it can be ignored.
Switch to the mediapipe source directory and test that the environment works:
$ export GLOG_logtostderr=1
# if you are running on Linux desktop with CPU only
$ bazel run --define MEDIAPIPE_DISABLE_GPU=1 \
mediapipe/examples/desktop/hello_world:hello_world
# Should print:
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
Instead of the Hello World output, I first hit a download timeout:
ERROR: An error occurred during the fetch of repository 'org_tensorflow':
java.io.IOException: Error downloading [https://github.com/tensorflow/tensorflow/archive/0eadbb13cef1226b1bae17c941f7870734d97f8a.tar.gz] to /home/leeson/.cache/bazel/_bazel_leeson/fe152e59f3fccdc4f5b8d52e171385b5/external/org_tensorflow/0eadbb13cef1226b1bae17c941f7870734d97f8a.tar.gz: Read timed out
If downloads time out, configure a proxy for git (adjust host and port to your own proxy software):
git config --global http.proxy http://127.0.0.1:10809
git config --global https.proxy http://127.0.0.1:10809
# or use SOCKS5:
git config --global http.proxy 'socks5://127.0.0.1:10808'
git config --global https.proxy 'socks5://127.0.0.1:10808'
git config --global --list  # check the proxy settings
# Windows PowerShell:
$env:HTTP_PROXY="127.0.0.1:10809"
$env:HTTPS_PROXY="127.0.0.1:10809"
# Windows cmd:
set HTTP_PROXY=127.0.0.1:10809
set HTTPS_PROXY=127.0.0.1:10809
# Linux / WSL:
export http_proxy=127.0.0.1:10809
export https_proxy=127.0.0.1:10809
bazel --host_jvm_args "-DsocksProxyHost=127.0.0.1 -DsocksProxyPort=10809" run --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/hello_world:hello_world
bazel run \
--define MEDIAPIPE_DISABLE_GPU=1 \
--action_env PYTHON_BIN_PATH=$(which python) \
mediapipe/examples/desktop/hello_world
#Or set an environment variable instead: locate the interpreter with which, then set PYTHON_BIN_PATH
which python
export PYTHON_BIN_PATH=/usr/bin/python
sudo apt install python-pip
pip install numpy
The official tutorial is at https://google.github.io/mediapipe/getting_started/android.html; if it will not open, change your DNS to 114.114.114.114 and try again.
The official troubleshooting guide is at https://google.github.io/mediapipe/getting_started/troubleshooting.html.
Other third-party tutorials: "Mediapipe框架学习之一——Win10安装Mediapipe环境" (https://blog.csdn.net/qq_36818449/article/details/103879399) and "Mediapipe框架在Android上的使用" (https://blog.csdn.net/qq_33200967/article/details/107174221).
There are two ways to build MediaPipe for Android: on the command line, or through Android Studio. I used the command line.
Run the setup_android_sdk_and_ndk.sh script to download and configure the Android SDK and NDK automatically. The default install paths are ~/Android/Sdk for the SDK and ~/Android/Ndk for the NDK:
cd mediapipe
chmod 755 setup_android_sdk_and_ndk.sh
sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
If the script fails with the $'\r' error, convert it to Unix line endings and rerun:
vim setup_android_sdk_and_ndk.sh
:set ff=unix
:wq
rm -rf ~/Android/Sdk
sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
If it then fails because JAVA_HOME is not set, add it to /etc/environment and rerun:
sudo vim /etc/environment
JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
rm -rf ~/Android/Sdk
sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
#Set these in ~/.bashrc or /etc/profile to make them persistent, or run them directly
#in the shell (effective for the current session only). Note that they must be plain
#directory paths, not $PATH-style lists:
export ANDROID_HOME=/home/leeson/Android/Sdk
export ANDROID_NDK_HOME=/home/leeson/Android/Ndk/android-ndk-r18b
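ANDROID_HOME and ANDROID_NDK_HOME must each be a single directory path; a common mistake is prepending `$PATH:` to them, which silently breaks the bazel Android toolchain. A quick sanity check before building, as a sketch using only the standard library:

```python
# Sanity-check that the Android env vars are single, existing directories.
# ANDROID_HOME / ANDROID_NDK_HOME must NOT be PATH-style colon-separated lists.
import os

def check_android_env(env=os.environ):
    problems = []
    for var in ("ANDROID_HOME", "ANDROID_NDK_HOME"):
        value = env.get(var, "")
        if not value:
            problems.append(f"{var} is unset")
        elif ":" in value:  # a stray $PATH: prefix sneaks colons in
            problems.append(f"{var} looks like a PATH-style list: {value!r}")
        elif not os.path.isdir(value):
            problems.append(f"{var} does not exist: {value!r}")
    return problems

# Usage: print warnings, if any, before running bazel build
# for p in check_android_env(): print("WARNING:", p)
```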
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu:handtrackinggpu
#Install with adb, or copy the APK somewhere convenient (e.g. drive D) and send it to the phone over USB, WeChat, QQ, etc. to install
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/handtrackinggpu.apk
#During the build I also had to edit the WORKSPACE and install the Python future module:
vim mediapipe/WORKSPACE
pip install future
Build error: An error occurred during the fetch of repository 'remotejdk11_linux' — a file download failed; add https://mirror.bazel.build to the proxy software's rule list.
Build error: /mediapipe/apps/basic:basic_lib depends on @maven//:androidx_concurrent_concurrent_futures in repository @maven which failed to fetch. no such package '@maven//': Error while fetching artifact with coursier: Timed out. The Maven repositories cannot be reached; add their addresses to the proxy software. Open the WORKSPACE file in the mediapipe folder and search for maven_install; the URLs listed under repositories are the repository addresses to add.
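To collect those repository URLs, search the WORKSPACE for the `maven_install` block and pull out the `repositories` entries. A small sketch (the WORKSPACE text below is a trimmed sample for illustration, not the real file):

```python
# Extract repository URLs from a maven_install(...) block so they can be
# added to the proxy tool's rules. SAMPLE_WORKSPACE mimics mediapipe's WORKSPACE.
import re

SAMPLE_WORKSPACE = '''
maven_install(
    artifacts = ["androidx.concurrent:concurrent-futures:1.0.0-alpha03"],
    repositories = [
        "https://dl.google.com/dl/android/maven2",
        "https://repo1.maven.org/maven2",
    ],
)
'''

def maven_repositories(workspace_text):
    # Grab the bracketed list after "repositories =", then every quoted string in it.
    block = re.search(r"repositories\s*=\s*\[(.*?)\]", workspace_text, re.S)
    if not block:
        return []
    return re.findall(r'"([^"]+)"', block.group(1))

for url in maven_repositories(SAMPLE_WORKSPACE):
    print(url)
```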
Build error: url.split("://", 1)[1] index out of range (index is 1, but sequence has 1 elements). Change the proxy variables to include a scheme; whether the scheme in https_proxy is http or https does not seem to matter:
export http_proxy=http://127.0.0.1:10809
export https_proxy=http://127.0.0.1:10809
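The `index out of range` comes from Bazel splitting the proxy value on `://`: a bare `host:port` has nothing after the split. A minimal reproduction of that logic:

```python
# Why Bazel rejects a proxy without a scheme: it does roughly
# url.split("://", 1)[1], which only works when "://" is present.
def parse_proxy(proxy):
    parts = proxy.split("://", 1)
    if len(parts) == 1:
        raise ValueError(f"proxy {proxy!r} has no scheme; prefix it with http://")
    scheme, hostport = parts
    return scheme, hostport

# parse_proxy("127.0.0.1:10809")        -> ValueError (the cause of the Bazel error)
# parse_proxy("http://127.0.0.1:10809") -> ("http", "127.0.0.1:10809")
```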
cd mediapipe/examples/android/src/java/com/google/mediapipe/apps
mkdir aar_example
cd aar_example
vim BUILD
#Contents of BUILD:
load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")

mediapipe_aar(
    name = "mp_face_detection_aar",
    calculators = ["//mediapipe/graphs/face_detection:mobile_calculators"],
)
#Note: run this from the root of the mediapipe source tree
bazel build -c opt --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --fat_apk_cpu=arm64-v8a,armeabi-v7a \
//mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:mp_face_detection_aar
#After the build succeeds, copy the generated AAR to a directory of your choosing
mkdir ./aar_build
mkdir ./aar_build/face_detection/
cp bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example/mp_face_detection_aar.aar ./aar_build/face_detection/mp_face_detection_aar.aar
The AAR's AndroidManifest.xml, generated by mediapipe_aar.bzl, looks like this:
```
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.google.mediapipe">
  <uses-sdk
      android:minSdkVersion="21"
      android:targetSdkVersion="27" />
  <application />
</manifest>
```
Open mediapipe_aar.bzl and convert it to Unix line endings as well:
```
vi mediapipe/java/com/google/mediapipe/mediapipe_aar.bzl
:set ff=unix
:wq
```
bazel build -c opt mediapipe/graphs/face_detection:face_detection_mobile_gpu_binary_graph
#After the build succeeds, copy the binary graph out for later use
cp bazel-bin/mediapipe/graphs/face_detection/face_detection_mobile_gpu.binarypb ./aar_build/face_detection/face_detection_mobile_gpu.binarypb
#Other resources are needed during development as well; copy them out too. See https://google.github.io/mediapipe/solutions/models for what each model needs.
cp mediapipe/models/face_detection_front.tflite ./aar_build/face_detection/
cp mediapipe/models/face_detection_front_labelmap.txt ./aar_build/face_detection/
The tutorial for this part is at https://google.github.io/mediapipe/getting_started/android_archive_library.html.
Open Android Studio and open your Android project (create one if you do not have one).
Copy the AAR into the project's app/libs directory.
Copy the binarypb, tflite, and labelmap.txt files into app/src/main/assets; create the assets directory if it does not exist.
Copy the OpenCV .so libraries into app/src/main/jniLibs, creating the directory if needed. If you do not have the OpenCV libraries, download https://github.com/opencv/opencv/releases/download/3.4.3/opencv-3.4.3-android-sdk.zip and unpack it; sdk/native/libs contains the .so files for each architecture.
Edit app/build.gradle to add the MediaPipe dependencies and the AAR. In the android block, set the Java compile level to 1.8; minSdkVersion is best set to 21 and targetSdkVersion to 27 or higher. Set the CameraX core library dependency according to the official tutorial, because the methods the AAR code calls require a matching camerax_version.
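Before building, it is worth confirming that every file copied above actually landed in the assets directory. A small helper sketch, using the face-detection file names from this tutorial (adjust the list for other solutions):

```python
# Verify the MediaPipe face-detection assets are in place before a build.
# File names follow this tutorial; other solutions need different files.
import os

REQUIRED_ASSETS = [
    "face_detection_mobile_gpu.binarypb",
    "face_detection_front.tflite",
    "face_detection_front_labelmap.txt",
]

def missing_assets(assets_dir, required=REQUIRED_ASSETS):
    # Return the required files that are not present in assets_dir.
    return [name for name in required
            if not os.path.isfile(os.path.join(assets_dir, name))]

# Usage: missing_assets("app/src/main/assets") -> [] when everything is in place
```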
```
android {
    // add inside the android block
    compileOptions {
        targetCompatibility = 1.8
        sourceCompatibility = 1.8
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
    // MediaPipe deps
    implementation 'com.google.flogger:flogger:0.3.1'
    implementation 'com.google.flogger:flogger-system-backend:0.3.1'
    implementation 'com.google.code.findbugs:jsr305:3.0.2'
    implementation 'com.google.guava:guava:27.0.1-android'
    implementation 'com.google.protobuf:protobuf-java:3.11.4'
    // CameraX core library
    def camerax_version = "1.0.0-beta10"
    implementation "androidx.camera:camera-core:$camerax_version"
    implementation "androidx.camera:camera-camera2:$camerax_version"
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
}
```
When syncing the build.gradle changes, I got ERROR: Unable to resolve dependency for ':app@debug/compileClasspath': Failed to transform file 'mp_face_detection_aar.aar' to match attributes {artifactType=jar}. The AAR itself was broken (the official AAR worked fine), and clearing the bazel cache and rebuilding did not help. Re-downloading the mediapipe source with git inside WSL and rebuilding the AAR fixed it; I also prefixed the final target path with a double slash, though I do not know whether that mattered.
Add the permissions and feature declarations to AndroidManifest.xml:
```
<!-- For using the camera -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<!-- For MediaPipe -->
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
```
The layout, activity_main.xml:
```
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <FrameLayout
        android:id="@+id/preview_display_layout"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_weight="1"
        tools:ignore="MissingConstraints">

        <TextView
            android:id="@+id/no_camera_access_view"
            android:layout_height="fill_parent"
            android:layout_width="fill_parent"
            android:gravity="center"
            android:text="no camera access" />
    </FrameLayout>
</androidx.constraintlayout.widget.ConstraintLayout>
```
MainActivity.java:
```
import android.graphics.SurfaceTexture;
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import android.util.Size;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.CameraXPreviewHelper;
import com.google.mediapipe.components.ExternalTextureConverter;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.components.PermissionHelper;
import com.google.mediapipe.framework.AndroidAssetUtil;
import com.google.mediapipe.glutil.EglManager;

/** Main activity of MediaPipe example apps. */
public class MainActivity extends AppCompatActivity {
  private static final String TAG = "MainActivity";
  private static final String BINARY_GRAPH_NAME = "mobile_gpu.binarypb";
  private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
  private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
  private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;

  // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
  // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
  // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
  // corner, whereas MediaPipe in general assumes the image origin is at top-left.
  private static final boolean FLIP_FRAMES_VERTICALLY = true;

  static {
    // Load all native libraries needed by the app.
    System.loadLibrary("mediapipe_jni");
    System.loadLibrary("opencv_java3");
  }

  // {@link SurfaceTexture} where the camera-preview frames can be accessed.
  private SurfaceTexture previewFrameTexture;
  // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
  private SurfaceView previewDisplayView;
  // Creates and manages an {@link EGLContext}.
  private EglManager eglManager;
  // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
  // frames onto a {@link Surface}.
  private FrameProcessor processor;
  // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
  // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
  private ExternalTextureConverter converter;
  // Handles camera access via the {@link CameraX} Jetpack support library.
  private CameraXPreviewHelper cameraHelper;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    previewDisplayView = new SurfaceView(this);
    setupPreviewDisplayView();

    // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,
    // binary graphs.
    AndroidAssetUtil.initializeNativeAssetManager(this);

    eglManager = new EglManager(null);
    processor =
        new FrameProcessor(
            this,
            eglManager.getNativeContext(),
            BINARY_GRAPH_NAME,
            INPUT_VIDEO_STREAM_NAME,
            OUTPUT_VIDEO_STREAM_NAME);
    processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);

    PermissionHelper.checkAndRequestCameraPermissions(this);
  }

  @Override
  protected void onResume() {
    super.onResume();
    converter = new ExternalTextureConverter(eglManager.getContext());
    converter.setFlipY(FLIP_FRAMES_VERTICALLY);
    converter.setConsumer(processor);
    if (PermissionHelper.cameraPermissionsGranted(this)) {
      startCamera();
    }
  }

  @Override
  protected void onPause() {
    super.onPause();
    converter.close();
  }

  @Override
  public void onRequestPermissionsResult(
      int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
  }

  private void setupPreviewDisplayView() {
    previewDisplayView.setVisibility(View.GONE);
    ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
    viewGroup.addView(previewDisplayView);

    previewDisplayView
        .getHolder()
        .addCallback(
            new SurfaceHolder.Callback() {
              @Override
              public void surfaceCreated(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
              }

              @Override
              public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                // (Re-)Compute the ideal size of the camera-preview display (the area that the
                // camera-preview frames get rendered onto, potentially with scaling and rotation)
                // based on the size of the SurfaceView that contains the display.
                Size viewSize = new Size(width, height);
                Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);

                // Connect the converter to the camera-preview frames as its input (via
                // previewFrameTexture), and configure the output width and height as the computed
                // display size.
                converter.setSurfaceTextureAndAttachToGLContext(
                    previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
              }

              @Override
              public void surfaceDestroyed(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(null);
              }
            });
  }

  private void startCamera() {
    cameraHelper = new CameraXPreviewHelper();
    cameraHelper.setOnCameraStartedListener(
        surfaceTexture -> {
          previewFrameTexture = surfaceTexture;
          // Make the display view visible to start showing the preview. This triggers the
          // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
          previewDisplayView.setVisibility(View.VISIBLE);
        });
    cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
  }
}
```
def camerax_version = "1.0.0-alpha06"
The calculator could not be found, so I swapped the binarypb for the one from the official example; since the AAR in use was also from the official example, the two presumably need to match.
Later, the AAR built from the code downloaded inside WSL did work, but after installing it on the phone the app crashed with java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "aligned_alloc" referenced by "/data/app/com.octant.mediapipedemo-2/lib/arm64/libmediapipe_jni.so". Find android_ndk_repository in the WORKSPACE file at the root of the mediapipe source, add api_level = 21, and rebuild:
vi WORKSPACE
android_ndk_repository(
    name = "androidndk",
    api_level = 21,  # add this
)
def camerax_version = "1.0.0-beta10"
implementation "androidx.camera:camera-lifecycle:$camerax_version"
Using the AAR I built myself failed with java.lang.NoSuchMethodError: No virtual method provideSurface(Landroid/view/Surface;Ljava/util/concurrent/Executor;Landroidx/core/util/Consumer;)V in class Landroidx/camera/core/SurfaceRequest. The cause was that camerax_version had not been changed: it was still 1.0.0-beta10, and it needed to be 1.0.0-alpha10.
With my own AAR and binary graph, a Xiaomi MIX2 showed a black screen while a Xiaomi 10 worked fine. Oddly, after adding face_detection_back.tflite and face_detection_back_labelmap.txt to assets, the MIX2 worked too, even though the camera was still the front one. I do not know why those files matter, but deleting them made the MIX2 go black again.
The demo project is uploaded to Gitee, but the AAR file is too large, so .gitignore excludes it. Download the AAR from http://www.mediafire.com/file/sd8zoizhqwahsr7/mp_face_detection_aar.aar/file and put it in MediaPipeDemo/app/libs.