Download MediaPipe and the build tools

  git clone https://github.com/google/mediapipe.git
  • Install Bazel
  # For Bazel 3.4.1
  mkdir $HOME/bazel-3.4.1
  cd $HOME/bazel-3.4.1
  wget https://github.com/bazelbuild/bazel/releases/download/3.4.1/bazel-3.4.1-dist.zip
  sudo apt-get install build-essential openjdk-8-jdk python zip unzip
  unzip bazel-3.4.1-dist.zip
  env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh
  sudo cp output/bazel /usr/local/bin/
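  # If the build succeeded, the new binary should answer a version query (quick sanity check):
  bazel version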

I also saw another way to install Bazel (using the release installer script); I'm not sure whether there is any practical difference.

  username@DESKTOP-TMVLBJ1:~$ sudo apt-get update && sudo apt-get install -y build-essential git python zip adb openjdk-8-jdk
  username@DESKTOP-TMVLBJ1:~$ curl -sLO --retry 5 --retry-max-time 10 \
  https://storage.googleapis.com/bazel/3.4.1/release/bazel-3.4.1-installer-linux-x86_64.sh && \
  sudo mkdir -p /usr/local/bazel/3.4.1 && \
  chmod 755 bazel-3.4.1-installer-linux-x86_64.sh && \
  sudo ./bazel-3.4.1-installer-linux-x86_64.sh --prefix=/usr/local/bazel/3.4.1 && \
  source /usr/local/bazel/3.4.1/lib/bazel/bin/bazel-complete.bash
  
  username@DESKTOP-TMVLBJ1:~$ /usr/local/bazel/3.4.1/lib/bazel/bin/bazel version && \
  alias bazel='/usr/local/bazel/3.4.1/lib/bazel/bin/bazel'
  • Install OpenCV and FFmpeg
  $ sudo apt-get install libopencv-core-dev libopencv-highgui-dev \
                         libopencv-calib3d-dev libopencv-features2d-dev \
                         libopencv-imgproc-dev libopencv-video-dev
  • About installing adb: the tutorials all mention that the adb on Windows 10 and the one inside WSL must be the same version. adb is the Android Debug Bridge, which developers use to install and debug apps. Looking at the official tutorial, it is only needed to install the demo APK onto a phone after building the Android examples; since we only want to build the library and then develop in a new Android Studio project, it can be skipped.
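  If you do install it anyway, a quick way to confirm the Windows and WSL copies match is to compare the reported versions (the numbers should be identical):

  # in Windows PowerShell / CMD
  adb version
  # in WSL
  adb version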

  • Switch into the mediapipe source directory and test the installation

  $ export GLOG_logtostderr=1
  
  # if you are running on Linux desktop with CPU only
  $ bazel run --define MEDIAPIPE_DISABLE_GPU=1 \
      mediapipe/examples/desktop/hello_world:hello_world
  
  # Should print:
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  # Hello World!
  • The first run of the installation test downloads a number of things from GitHub, and a poor connection can make the downloads fail. Use a proxy/VPN tool and look up a tutorial for pointing git/cmd/PowerShell at its proxy. I run Linux on Windows via WSL, so I first configured cmd; the port has to match your proxy tool, and with v2ray you also need to enable LAN connections. cmd itself worked, but after entering bash a curl test showed the proxy had no effect; changing set to export and the upper-case HTTP_PROXY/HTTPS_PROXY to lower case fixed it. (Oddly, setting the proxy in cmd was enough for the installation test, but the later Android AAR build still hit network errors; a curl test showed the proxy was not actually applied, so in the end I set it inside Linux — see the curl check after the proxy settings below.) For setting proxies (and adding pip mirror sources for Python) this page is a useful reference: https://www.jb51.net/article/180438.html
  ERROR: An error occurred during the fetch of repository 'org_tensorflow':
       java.io.IOException: Error downloading [https://github.com/tensorflow/tensorflow/archive/0eadbb13cef1226b1bae17c941f7870734d97f8a.tar.gz] to /home/leeson/.cache/bazel/_bazel_leeson/fe152e59f3fccdc4f5b8d52e171385b5/external/org_tensorflow/0eadbb13cef1226b1bae17c941f7870734d97f8a.tar.gz: Read timed out
  • Setting a proxy for git (persists until you unset it):
  # HTTP proxy
  git config --global http.proxy http://127.0.0.1:10809
  git config --global https.proxy https://127.0.0.1:10809
  # or a SOCKS5 proxy
  git config --global http.proxy 'socks5://127.0.0.1:10808'
  git config --global https.proxy 'socks5://127.0.0.1:10808'
  git config --global --list    # list the current proxy settings
  • Setting a proxy in PowerShell (only for the current session):
  $env:HTTP_PROXY="127.0.0.1:10809"
    $env:HTTPS_PROXY="127.0.0.1:10809"
  • Setting a proxy in CMD (only for the current session):
  set HTTP_PROXY=127.0.0.1:10809
    set HTTPS_PROXY=127.0.0.1:10809
  • Setting a proxy on Linux:
    export http_proxy=127.0.0.1:10809
    export https_proxy=127.0.0.1:10809
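  To check whether the proxy variables actually take effect inside WSL, a quick curl test helps (illustrative; any HTTP status line in the response means the request went through):

    curl -I https://github.com
    curl -I https://storage.googleapis.com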
  • Bazel itself can be started with proxy settings via --host_jvm_args "-DsocksProxyHost= -DsocksProxyPort=", although it did not seem to have any effect for me:
    bazel --host_jvm_args "-DsocksProxyHost=127.0.0.1 -DsocksProxyPort=10809" run --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/hello_world:hello_world
  • The first installation test may also fail with An error occurred during the fetch of repository 'local_execution_config_python' ...(omitted)... Is the Python binary path set up right?. Find the Python binary and add --action_env PYTHON_BIN_PATH=... to the bazel command. Online tutorials use which python3, but that did not work for me; which python did. Alternatively, set the path through an environment variable with export:
  bazel run \
      --define MEDIAPIPE_DISABLE_GPU=1 \
      --action_env PYTHON_BIN_PATH=$(which python) \
      mediapipe/examples/desktop/hello_world:hello_world
  # or set an environment variable: find the path with which, then set PYTHON_BIN_PATH
  which python
  export PYTHON_BIN_PATH=/usr/bin/python
  • The first installation test may also fail with An error occurred during the fetch of repository 'local_execution_config_python' ...(omitted)... Is numpy installed?. Some Python packages are missing; install them with pip install (or pip3 install for Python 3). If that gives Command 'pip' not found, install pip first:
    sudo apt install python-pip
    pip install numpy
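  A quick import check confirms numpy is now visible to the Python that Bazel will use:

    python -c "import numpy; print(numpy.__version__)"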

Build MediaPipe

  cd mediapipe
  chmod 755 setup_android_sdk_and_ndk.sh
  sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
  • If running the script fails with $'\r': command not found, the script was checked out under Windows with CRLF line endings; fix it with vim:
    vim setup_android_sdk_and_ndk.sh
    :set ff=unix
    :wq
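  Alternatively, the carriage returns can be stripped without vim, e.g. with sed (dos2unix also works if installed):

    sed -i 's/\r$//' setup_android_sdk_and_ndk.sh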
  • If the script fails while downloading with An error occurred while preparing SDK package Android Emulator: archive is not a ZIP archive, it may be a permission problem; remove the partial SDK and rerun with sudo:
    rm -rf ~/Android/Sdk
    sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
  • If you get ERROR: JAVA_HOME is set to an invalid directory: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java, open /etc/environment and change it to JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64, i.e. drop the trailing /jre/bin/java.
    sudo vim /etc/environment
    JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    rm -rf ~/Android/Sdk
    sudo ./setup_android_sdk_and_ndk.sh ~/Android/Sdk ~/Android/Ndk r18b
  • Set the ANDROID_HOME and ANDROID_NDK_HOME environment variables. Note that the Android SDK must be version 28.0.3 or higher and the Android NDK r18b or higher. Either add the exports to ~/.bashrc and run source ~/.bashrc to apply them, or export them directly on the command line. (Note that ANDROID_NDK_HOME should point at the concrete android-ndk-r18b directory: with the code I downloaded on Windows, /home/leeson/Android/Ndk was enough, but with the copy I cloned inside WSL the second time I needed /home/leeson/Android/Ndk/android-ndk-r18b.)
  # set in ~/.bashrc or /etc/profile
  export ANDROID_HOME=/home/leeson/Android/Sdk
  export ANDROID_NDK_HOME=/home/leeson/Android/Ndk/android-ndk-r18b
  # or export directly on the command line (only for the current session)
  export ANDROID_HOME=/home/leeson/Android/Sdk
  export ANDROID_NDK_HOME=/home/leeson/Android/Ndk/android-ndk-r18b
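  After sourcing ~/.bashrc (or exporting directly), a quick echo confirms the variables point where you expect:

  echo $ANDROID_HOME
  echo $ANDROID_NDK_HOME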
  • Try building one of the official example apps. The build seems to download (or check against) many more libraries, so it is best to have the proxy set. The result is an APK, which can be installed with adb or copied to the phone and installed manually. (If you only want to build the library for your own development, you can skip this step; it is mainly to see MediaPipe in action. On a Xiaomi Max 2 with Android 7.1 the app crashed on launch; on a Xiaomi 10 with Android 11 it ran fine.)
  bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu:handtrackinggpu
  
  # install with adb, or copy the APK somewhere (e.g. the D: drive) and send it to the phone via USB, WeChat, QQ, etc. to install manually
  adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/handtrackinggpu.apk
  • Build error WORKSPACE:245:5: duplicate keyword argument: path. Looking at the file, path is set four times in that rule; delete the three duplicates:
    vim mediapipe/WORKSPACE
  • Build error ImportError: No module named builtins; install future:
    pip install future
  • Build error An error occurred during the fetch of repository 'remotejdk11_linux': the download failed; add https://mirror.bazel.build to your proxy tool's rules.

  • Build error /mediapipe/apps/basic:basic_lib depends on @maven//:androidx_concurrent_concurrent_futures in repository @maven which failed to fetch. no such package '@maven//': Error while fetching artifact with coursier: Timed out. The Maven repositories cannot be reached; add them to your proxy tool's rules. Open the WORKSPACE file in the mediapipe folder and search for maven_install; the repositories list inside it contains the repository URLs to add.
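  To see which repository URLs need to be added, you can print the maven_install block straight from WORKSPACE (the number of context lines is just a guess; adjust as needed):

    grep -n -A 15 "maven_install(" WORKSPACE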

  • Build error url.split("://", 1)[1] index out of range (index is 1, but sequence has 1 elements): the proxy variables need a URL scheme. Whether https_proxy uses http:// or https:// did not seem to make a difference:

    export http_proxy=http://127.0.0.1:10809
    export https_proxy=http://127.0.0.1:10809
  • To develop against MediaPipe you need to build a MediaPipe AAR. Create a new folder under mediapipe/examples/android/src/java/com/google/mediapipe/apps, e.g. aar_example, and add a BUILD file. The key fields are name and calculators: name sets the generated AAR file name, and calculators selects the graph's calculators, chosen from mediapipe/graphs according to what you need. For example, face_detection is one of the graphs under mediapipe/graphs and mobile_calculators is one of its calculator targets; the available calculator targets are the cc_library names in that graph's BUILD file. Since Android is a mobile target, pick the targets with the mobile prefix.
  cd mediapipe/examples/android/src/java/com/google/mediapipe/apps
  mkdir aar_example
  cd aar_example
  vim BUILD
  
  # contents of BUILD
  load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")
  
  mediapipe_aar(
      name = "mp_face_detection_aar",
      calculators = ["//mediapipe/graphs/face_detection:mobile_calculators"],
  )
  • Build the new BUILD target to generate the AAR. I'm not sure what --host_crosstool_top does; some tutorials omit it. --fat_apk_cpu selects the CPU architectures to build for, and the target path is the folder created above plus the AAR name:
  # note: run this from the root of the mediapipe source tree
  bazel build -c opt --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --fat_apk_cpu=arm64-v8a,armeabi-v7a \
  //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:mp_face_detection_aar
  
  
  # after a successful build, copy the generated AAR to wherever you want to keep it
  mkdir ./aar_build
  mkdir ./aar_build/face_detection/
  cp bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example/mp_face_detection_aar.aar ./aar_build/face_detection/mp_face_detection_aar.aar
  • Build error /bin/bash: $'\r': command not found: again caused by checking the code out under Windows, so a script needs fixing. If the error message is not detailed enough, add --verbose_failures to the build command. The error mentions an AndroidManifest.xml roughly like the one below; searching shows that this content lives in mediapipe/java/com/google/mediapipe/mediapipe_aar.bzl, so that is the file to fix.
    <?xml version="1.0" encoding="utf-8"?>
    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.google.mediapipe">
      <uses-sdk
            android:minSdkVersion="21"
          android:targetSdkVersion="27" />
        <application />
    </manifest>

    Open mediapipe_aar.bzl and convert it to unix line endings:

    vi mediapipe/java/com/google/mediapipe/mediapipe_aar.bzl
    :set ff=unix
    :wq
  • Generate the MediaPipe binary graph. The target name comes from the BUILD file used for the AAR: for example, that BUILD file contains calculators = ["//mediapipe/graphs/face_detection:mobile_calculators"], so open the BUILD file in mediapipe/graphs/face_detection and look for mediapipe_binary_graph rules whose deps include the same calculators target (here :mobile_calculators); the name of such a rule is what you can build. In mediapipe/graphs/face_detection there are two with deps = [":mobile_calculators"]: face_detection_mobile_cpu_binary_graph and face_detection_mobile_gpu_binary_graph, for CPU and GPU respectively.
  bazel build -c opt mediapipe/graphs/face_detection:face_detection_mobile_gpu_binary_graph
  
  # after the build succeeds, copy the binary graph out for later use
  cp bazel-bin/mediapipe/graphs/face_detection/face_detection_mobile_gpu.binarypb ./aar_build/face_detection/face_detection_mobile_gpu.binarypb
  # other resources are also needed for development; copy them out too. See https://google.github.io/mediapipe/solutions/models for what each model needs
  cp mediapipe/models/face_detection_front.tflite ./aar_build/face_detection/
  cp mediapipe/models/face_detection_front_labelmap.txt ./aar_build/face_detection/
  • After INFO: Deleting stale sandbox base is printed, the build may look stuck for a long time; don't worry, it is still running. In the Windows Task Manager you can see a "java" process working away. For me it took about an hour, so be patient. My terminal output sometimes stalls, so even after the java process finished nothing was printed until I pressed Enter to refresh, and then the success message appeared. (The second time, with the mediapipe code freshly cloned via git inside WSL, the binary-graph build kept printing output and never appeared stuck.)

Import MediaPipe into an Android project

  android {
      // add inside the android block
      compileOptions {
          targetCompatibility = 1.8
          sourceCompatibility = 1.8
      }
  }
  dependencies {
  
      implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])
      implementation 'androidx.appcompat:appcompat:1.0.2'
      implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
      testImplementation 'junit:junit:4.12'
      androidTestImplementation 'androidx.test.ext:junit:1.1.0'
      androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
      // MediaPipe deps
      implementation 'com.google.flogger:flogger:0.3.1'
      implementation 'com.google.flogger:flogger-system-backend:0.3.1'
      implementation 'com.google.code.findbugs:jsr305:3.0.2'
      implementation 'com.google.guava:guava:27.0.1-android'
      implementation 'com.google.protobuf:protobuf-java:3.11.4'
      // CameraX core library
      def camerax_version = "1.0.0-beta10"
      implementation "androidx.camera:camera-core:$camerax_version"
      implementation "androidx.camera:camera-camera2:$camerax_version"
      implementation "androidx.camera:camera-lifecycle:$camerax_version"
  }
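  The fileTree dependency above picks the AAR up from app/libs, and the binary graph, model and label map are read from assets at runtime. A copy sketch, assuming the Android project is called MediaPipeDemo and using the files staged in ./aar_build earlier (paths are illustrative):

  mkdir -p MediaPipeDemo/app/libs MediaPipeDemo/app/src/main/assets
  cp ./aar_build/face_detection/mp_face_detection_aar.aar MediaPipeDemo/app/libs/
  cp ./aar_build/face_detection/face_detection_mobile_gpu.binarypb MediaPipeDemo/app/src/main/assets/
  cp ./aar_build/face_detection/face_detection_front.tflite MediaPipeDemo/app/src/main/assets/
  cp ./aar_build/face_detection/face_detection_front_labelmap.txt MediaPipeDemo/app/src/main/assets/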
  • When syncing the build.gradle changes I got ERROR: Unable to resolve dependency for ':app@debug/compileClasspath': Failed to transform file 'mp_face_detection_aar.aar' to match attributes {artifactType=jar}. The AAR itself was packaged incorrectly, because the official AAR works fine; clearing the Bazel cache and rebuilding did not help either. Re-cloning the mediapipe code with git inside WSL and rebuilding the AAR fixed it. I also prefixed the target path with a double slash (//) in the build command; I am not sure whether that mattered.
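  For reference, the cache-clearing step mentioned above can be done with the following (it removes the whole output base, so the next build starts from scratch):

    bazel clean --expunge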

  • Add the permissions and feature declarations to AndroidManifest.xml:

  <!-- For using the camera -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <!-- For MediaPipe -->
  <uses-feature android:glEsVersion="0x00020000" android:required="true" />
  
  • Add the UI to activity_main.xml; the main point is a layout to hold the camera preview:
  <?xml version="1.0" encoding="utf-8"?>
  <androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
      xmlns:app="http://schemas.android.com/apk/res-auto"
      xmlns:tools="http://schemas.android.com/tools"
      android:layout_width="match_parent"
      android:layout_height="match_parent">
  
      <FrameLayout
          android:id="@+id/preview_display_layout"
          android:layout_width="fill_parent"
          android:layout_height="fill_parent"
          android:layout_weight="1"
          tools:ignore="MissingConstraints">
          <TextView
              android:id="@+id/no_camera_access_view"
              android:layout_height="fill_parent"
              android:layout_width="fill_parent"
              android:gravity="center"
              android:text="no camera access" />
      </FrameLayout>
  </androidx.constraintlayout.widget.ConstraintLayout>
  • Modify MainActivity.java to add the MediaPipe pipeline. BINARY_GRAPH_NAME must be changed to the file name of the binarypb you placed in assets:
  import android.graphics.SurfaceTexture;
  import android.os.Bundle;
  import androidx.appcompat.app.AppCompatActivity;
  import android.util.Size;
  import android.view.SurfaceHolder;
  import android.view.SurfaceView;
  import android.view.View;
  import android.view.ViewGroup;
  import com.google.mediapipe.components.CameraHelper;
  import com.google.mediapipe.components.CameraXPreviewHelper;
  import com.google.mediapipe.components.ExternalTextureConverter;
  import com.google.mediapipe.components.FrameProcessor;
  import com.google.mediapipe.components.PermissionHelper;
  import com.google.mediapipe.framework.AndroidAssetUtil;
  import com.google.mediapipe.glutil.EglManager;
  
  /** Main activity of MediaPipe example apps. */
  public class MainActivity extends AppCompatActivity {
      private static final String TAG = "MainActivity";
  
      private static final String BINARY_GRAPH_NAME = "mobile_gpu.binarypb";
      private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
      private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
      private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;
  
      // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
      // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
      // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
      // corner, whereas MediaPipe in general assumes the image origin is at top-left.
      private static final boolean FLIP_FRAMES_VERTICALLY = true;
  
      static {
          // Load all native libraries needed by the app.
          System.loadLibrary("mediapipe_jni");
          System.loadLibrary("opencv_java3");
      }
  
      // {@link SurfaceTexture} where the camera-preview frames can be accessed.
      private SurfaceTexture previewFrameTexture;
      // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
      private SurfaceView previewDisplayView;
  
      // Creates and manages an {@link EGLContext}.
      private EglManager eglManager;
      // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
      // frames onto a {@link Surface}.
      private FrameProcessor processor;
      // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
      // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
      private ExternalTextureConverter converter;
  
      // Handles camera access via the {@link CameraX} Jetpack support library.
      private CameraXPreviewHelper cameraHelper;
  
      @Override
      protected void onCreate(Bundle savedInstanceState) {
          super.onCreate(savedInstanceState);
          setContentView(R.layout.activity_main);
  
          previewDisplayView = new SurfaceView(this);
          setupPreviewDisplayView();
  
          // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,
          // binary graphs.
          AndroidAssetUtil.initializeNativeAssetManager(this);
  
          eglManager = new EglManager(null);
          processor =
                  new FrameProcessor(
                          this,
                          eglManager.getNativeContext(),
                          BINARY_GRAPH_NAME,
                          INPUT_VIDEO_STREAM_NAME,
                          OUTPUT_VIDEO_STREAM_NAME);
          processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);
  
          PermissionHelper.checkAndRequestCameraPermissions(this);
      }
  
      @Override
      protected void onResume() {
          super.onResume();
          converter = new ExternalTextureConverter(eglManager.getContext());
          converter.setFlipY(FLIP_FRAMES_VERTICALLY);
          converter.setConsumer(processor);
          if (PermissionHelper.cameraPermissionsGranted(this)) {
              startCamera();
          }
      }
  
      @Override
      protected void onPause() {
          super.onPause();
          converter.close();
      }
  
      @Override
      public void onRequestPermissionsResult(
              int requestCode, String[] permissions, int[] grantResults) {
          super.onRequestPermissionsResult(requestCode, permissions, grantResults);
          PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
      }
  
      private void setupPreviewDisplayView() {
          previewDisplayView.setVisibility(View.GONE);
          ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
          viewGroup.addView(previewDisplayView);
  
          previewDisplayView
                  .getHolder()
                  .addCallback(
                          new SurfaceHolder.Callback() {
                              @Override
                              public void surfaceCreated(SurfaceHolder holder) {
                                  processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
                              }
  
                              @Override
                              public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                                  // (Re-)Compute the ideal size of the camera-preview display (the area that the
                                  // camera-preview frames get rendered onto, potentially with scaling and rotation)
                                  // based on the size of the SurfaceView that contains the display.
                                  Size viewSize = new Size(width, height);
                                  Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);
  
                                  // Connect the converter to the camera-preview frames as its input (via
                                  // previewFrameTexture), and configure the output width and height as the computed
                                  // display size.
                                  converter.setSurfaceTextureAndAttachToGLContext(
                                          previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
                              }
  
                              @Override
                              public void surfaceDestroyed(SurfaceHolder holder) {
                                  processor.getVideoSurfaceOutput().setSurface(null);
                              }
                          });
      }
  
      private void startCamera() {
          cameraHelper = new CameraXPreviewHelper();
          cameraHelper.setOnCameraStartedListener(
                  surfaceTexture -> {
                      previewFrameTexture = surfaceTexture;
                      // Make the display view visible to start showing the preview. This triggers the
                      // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
                      previewDisplayView.setVisibility(View.VISIBLE);
                  });
          cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
      }
  }
  
  • Error java.lang.NoClassDefFoundError: Failed resolution of: Landroidx/camera/core/CameraX$LensFacing. Change camerax_version in app/build.gradle to 1.0.0-alpha06 and remove implementation "androidx.camera:camera-lifecycle:$camerax_version":
    def camerax_version = "1.0.0-alpha06"
  • If a calculator cannot be found, swap the binarypb for the one from the official example, because the AAR in use was the official example's and the two probably need to match.

  • The AAR built later from the code cloned inside WSL was usable; after adding it and installing on the phone, however, the app crashed with java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "aligned_alloc" referenced by "/data/app/com.octant.mediapipedemo-2/lib/arm64/libmediapipe_jni.so". In the WORKSPACE file at the root of the mediapipe source, find android_ndk_repository, add api_level = 21, and rebuild:

    vi WORKSPACE
    android_ndk_repository(
        name = "androidndk",
        api_level = 21, #add this
     )
  • With my own AAR I got java.lang.NoClassDefFoundError: Failed resolution of: Landroidx/camera/lifecycle/ProcessCameraProvider. Add camera-lifecycle back to app/build.gradle; since there is no camera-lifecycle:1.0.0-alpha06, camerax_version has to go back to 1.0.0-beta10:
    def camerax_version = "1.0.0-beta10"
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
  • With my own AAR I also got java.lang.NoSuchMethodError: No virtual method provideSurface(Landroid/view/Surface;Ljava/util/concurrent/Executor;Landroidx/core/util/Consumer;)V in class Landroidx/camera/core/SurfaceRequest. It turned out camerax_version had not actually been updated; it should be 1.0.0-beta10, not 1.0.0-alpha10.

  • With my own AAR and binary graph, the Xiaomi MIX 2 showed a black screen while the Xiaomi 10 worked fine. After adding face_detection_back.tflite and face_detection_back_labelmap.txt to assets, the MIX 2 worked too, even though the camera is still the front one; I don't know why it matters, and removing them made the MIX 2 go black again.

  • The demo project is uploaded to Gitee. The AAR file is too large, so it is listed in .gitignore and not committed; you can download the AAR from http://www.mediafire.com/file/sd8zoizhqwahsr7/mp_face_detection_aar.aar/file and put it in MediaPipeDemo/app/libs.
