Transformer demo

This app demonstrates how to use the Transformer API to modify videos, for example by removing audio or video.
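
For orientation, here is a minimal sketch of one such edit (dropping the audio track). It assumes a recent media3 release of the Transformer API, so method names may differ from the code in this demo; the demo's own source is the authoritative reference.

    import android.content.Context;
    import androidx.media3.common.MediaItem;
    import androidx.media3.transformer.EditedMediaItem;
    import androidx.media3.transformer.Transformer;

    /** Minimal sketch: export a copy of the input video with its audio track removed. */
    public final class RemoveAudioExample {
      public static void removeAudio(Context context, String inputUri, String outputPath) {
        // Transformer must be created and used on a thread that has a Looper (e.g. the main thread).
        Transformer transformer = new Transformer.Builder(context).build();
        EditedMediaItem editedMediaItem =
            new EditedMediaItem.Builder(MediaItem.fromUri(inputUri))
                .setRemoveAudio(true) // setRemoveVideo(true) would drop the video track instead.
                .build();
        // The export runs asynchronously; add a Transformer.Listener to observe completion or errors.
        transformer.start(editedMediaItem, outputPath);
      }
    }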

See the demos README for instructions on how to build and run this demo.

MediaPipe frame processing demo

Building the demo app with MediaPipe integration enabled requires some extra manual steps.

  1. Follow MediaPipe's installation instructions to check out the MediaPipe source tree and set up its build environment (including Bazel).

  2. Copy the Transformer demo's build configuration and MediaPipe graph text protocol buffer into the MediaPipe source tree. This makes it easy to build an AAR with Bazel by reusing MediaPipe's workspace.

    cd "<path to MediaPipe checkout>"
    MEDIAPIPE_ROOT="$(pwd)"
    MEDIAPIPE_TRANSFORMER_ROOT="${MEDIAPIPE_ROOT}/mediapipe/java/com/google/mediapipe/transformer"
    cd "<path to the transformer demo (containing this readme)>"
    TRANSFORMER_DEMO_ROOT="$(pwd)"
    mkdir -p "${MEDIAPIPE_TRANSFORMER_ROOT}"
    mkdir -p "${TRANSFORMER_DEMO_ROOT}/libs"
    cp "${TRANSFORMER_DEMO_ROOT}/BUILD.bazel" "${MEDIAPIPE_TRANSFORMER_ROOT}/BUILD"
    cp "${TRANSFORMER_DEMO_ROOT}/src/withMediaPipe/assets/edge_detector_mediapipe_graph.pbtxt" \
      "${MEDIAPIPE_TRANSFORMER_ROOT}"
    
  3. Build the AAR and the binary proto for the demo's MediaPipe graph, then copy them into the Transformer demo.

    cd "${MEDIAPIPE_ROOT}"
    bazel build -c opt --strip=ALWAYS \
      --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
      --fat_apk_cpu=arm64-v8a,armeabi-v7a \
      --legacy_whole_archive=0 \
      --features=-legacy_whole_archive \
      --copt=-fvisibility=hidden \
      --copt=-ffunction-sections \
      --copt=-fdata-sections \
      --copt=-fstack-protector \
      --copt=-Oz \
      --copt=-fomit-frame-pointer \
      --copt=-DABSL_MIN_LOG_LEVEL=2 \
      --linkopt=-Wl,--gc-sections,--strip-all \
      mediapipe/java/com/google/mediapipe/transformer:edge_detector_mediapipe_aar.aar
    cp bazel-bin/mediapipe/java/com/google/mediapipe/transformer/edge_detector_mediapipe_aar.aar \
      "${TRANSFORMER_DEMO_ROOT}/libs"
    bazel build mediapipe/java/com/google/mediapipe/transformer:edge_detector_binary_graph
    cp bazel-bin/mediapipe/java/com/google/mediapipe/transformer/edge_detector_mediapipe_graph.binarypb \
      "${TRANSFORMER_DEMO_ROOT}/src/withMediaPipe/assets"
    
  4. In Android Studio, run a Gradle sync and select the withMediaPipe build variant (it only appears if the AAR is present), then build and run the demo app and select a MediaPipe-based effect.
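
When you select a MediaPipe-based effect in the demo, it is ultimately handed to Transformer as a video effect. The sketch below shows that wiring in general form; it assumes a recent media3 release (EditedMediaItem/Effects), and the mediaPipeEffect parameter is a placeholder for the MediaPipe-backed effect that the withMediaPipe source set provides, not the demo's exact code.

    import android.content.Context;
    import androidx.media3.common.Effect;
    import androidx.media3.common.MediaItem;
    import androidx.media3.transformer.EditedMediaItem;
    import androidx.media3.transformer.Effects;
    import androidx.media3.transformer.Transformer;
    import com.google.common.collect.ImmutableList;

    /** Sketch: apply a (MediaPipe-backed) video effect while exporting a video. */
    public final class ApplyEffectExample {
      public static void applyEffect(
          Context context, String inputUri, String outputPath, Effect mediaPipeEffect) {
        // mediaPipeEffect is a placeholder for the effect built from the demo's MediaPipe graph.
        EditedMediaItem editedMediaItem =
            new EditedMediaItem.Builder(MediaItem.fromUri(inputUri))
                .setEffects(
                    new Effects(
                        /* audioProcessors= */ ImmutableList.of(),
                        /* videoEffects= */ ImmutableList.of(mediaPipeEffect)))
                .build();
        new Transformer.Builder(context).build().start(editedMediaItem, outputPath);
      }
    }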