Update extension README with usage instructions

Issue: #3162

-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=165572088
andrewlewis 2017-08-17 06:40:44 -07:00 committed by Oliver Woodman
parent 9b18130ecc
commit b174bcc173
4 changed files with 160 additions and 28 deletions

extensions/ffmpeg/README.md

@@ -2,17 +2,18 @@

## Description ##

The FFmpeg extension provides `FfmpegAudioRenderer`, which uses FFmpeg for
decoding and can render audio encoded in a variety of formats.

## Build instructions ##

To use this extension you need to clone the ExoPlayer repository and depend on
its modules locally. Instructions for doing this can be found in ExoPlayer's
[top level README][]. The extension is not provided via JCenter (see [#2781][]
for more information).

In addition, it's necessary to build the extension's native components as
follows:

* Set the following environment variables:
@@ -34,7 +35,11 @@ NDK_PATH="<path to Android NDK>"

HOST_PLATFORM="linux-x86_64"
```

* Fetch and build FFmpeg. The configuration flags determine which formats will
  be supported. See the [Supported formats][] page for more details of the
  available flags.

  For example, to fetch and build for armeabi-v7a, arm64-v8a and x86 on Linux
  x86_64:

```
@@ -103,5 +108,35 @@ cd "${FFMPEG_EXT_PATH}"/jni && \

${NDK_PATH}/ndk-build APP_ABI="armeabi-v7a arm64-v8a x86" -j4
```

## Using the extension ##

Once you've followed the instructions above to check out, build and depend on
the extension, the next step is to tell ExoPlayer to use `FfmpegAudioRenderer`.
How you do this depends on which player API you're using:

* If you're passing a `DefaultRenderersFactory` to
  `ExoPlayerFactory.newSimpleInstance`, you can enable using the extension by
  setting the `extensionRendererMode` parameter of the `DefaultRenderersFactory`
  constructor to `EXTENSION_RENDERER_MODE_ON`. This will use
  `FfmpegAudioRenderer` for playback if `MediaCodecAudioRenderer` doesn't
  support the input format (see the sketch below). Pass
  `EXTENSION_RENDERER_MODE_PREFER` to give `FfmpegAudioRenderer` priority over
  `MediaCodecAudioRenderer`.
* If you've subclassed `DefaultRenderersFactory`, add an `FfmpegAudioRenderer`
  to the output list in `buildAudioRenderers`. ExoPlayer will use the first
  `Renderer` in the list that supports the input media format.
* If you've implemented your own `RenderersFactory`, return an
  `FfmpegAudioRenderer` instance from `createRenderers`. ExoPlayer will use the
  first `Renderer` in the returned array that supports the input media format.
* If you're using `ExoPlayerFactory.newInstance`, pass an `FfmpegAudioRenderer`
  in the array of `Renderer`s. ExoPlayer will use the first `Renderer` in the
  list that supports the input media format.

Note: These instructions assume you're using `DefaultTrackSelector`. If you have
a custom track selector the choice of `Renderer` is up to your implementation,
so you need to make sure you are passing an `FfmpegAudioRenderer` to the player,
then implement your own logic to use the renderer for a given track.
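For example, here is a minimal sketch of the first approach, using the standard
2.x factory APIs (the `FfmpegPlayerFactory` class name is purely illustrative):

```java
import android.content.Context;
import com.google.android.exoplayer2.DefaultRenderersFactory;
import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;

public final class FfmpegPlayerFactory {

  /** Creates a SimpleExoPlayer that can fall back to the extension's FfmpegAudioRenderer. */
  public static SimpleExoPlayer create(Context context) {
    // EXTENSION_RENDERER_MODE_ON places extension renderers after the MediaCodec-based
    // ones, so FfmpegAudioRenderer is used only when MediaCodecAudioRenderer can't handle
    // the format. Use EXTENSION_RENDERER_MODE_PREFER to try FfmpegAudioRenderer first.
    DefaultRenderersFactory renderersFactory =
        new DefaultRenderersFactory(
            context,
            /* drmSessionManager= */ null,
            DefaultRenderersFactory.EXTENSION_RENDERER_MODE_ON);
    return ExoPlayerFactory.newSimpleInstance(renderersFactory, new DefaultTrackSelector());
  }

  private FfmpegPlayerFactory() {}
}
```
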

[top level README]: https://github.com/google/ExoPlayer/blob/release-v2/README.md
[Android NDK]: https://developer.android.com/tools/sdk/ndk/index.html
[#2781]: https://github.com/google/ExoPlayer/issues/2781
[Supported formats]: https://google.github.io/ExoPlayer/supported-formats.html#ffmpeg-extension

extensions/flac/README.md

@@ -2,18 +2,17 @@

## Description ##

The Flac extension provides `FlacExtractor` and `LibflacAudioRenderer`, which
use libFLAC (the Flac decoding library) to extract and decode FLAC audio.

## Build instructions ##

To use this extension you need to clone the ExoPlayer repository and depend on
its modules locally. Instructions for doing this can be found in ExoPlayer's
[top level README][].

In addition, it's necessary to build the extension's native components as
follows:

* Set the following environment variables:
@@ -46,3 +45,40 @@ ${NDK_PATH}/ndk-build APP_ABI=all -j4

[top level README]: https://github.com/google/ExoPlayer/blob/release-v2/README.md
[Android NDK]: https://developer.android.com/tools/sdk/ndk/index.html

## Using the extension ##

Once you've followed the instructions above to check out, build and depend on
the extension, the next step is to tell ExoPlayer to use the extractor and/or
renderer.

### Using `FlacExtractor` ###

`FlacExtractor` is used via `ExtractorMediaSource`. If you're using
`DefaultExtractorsFactory`, `FlacExtractor` will automatically be used to read
`.flac` files. If you're not using `DefaultExtractorsFactory`, return a
`FlacExtractor` from your `ExtractorsFactory.createExtractors` implementation.
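As an illustration, a sketch of a `MediaSource` that always uses the extension's
`FlacExtractor` might look like the following. It assumes the 2.x
`ExtractorMediaSource` constructor and `DefaultDataSourceFactory`; the
`FlacMediaSourceFactory` class name and the "FlacDemo" user agent string are
placeholders:

```java
import android.content.Context;
import android.net.Uri;
import com.google.android.exoplayer2.ext.flac.FlacExtractor;
import com.google.android.exoplayer2.extractor.Extractor;
import com.google.android.exoplayer2.extractor.ExtractorsFactory;
import com.google.android.exoplayer2.source.ExtractorMediaSource;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.util.Util;

public final class FlacMediaSourceFactory {

  /** Builds a MediaSource that extracts FLAC streams with the extension's FlacExtractor. */
  public static MediaSource create(Context context, Uri flacUri) {
    // An ExtractorsFactory that only offers FlacExtractor. You could also combine it
    // with other extractors in the returned array.
    ExtractorsFactory flacOnlyExtractorsFactory =
        new ExtractorsFactory() {
          @Override
          public Extractor[] createExtractors() {
            return new Extractor[] {new FlacExtractor()};
          }
        };
    return new ExtractorMediaSource(
        flacUri,
        new DefaultDataSourceFactory(context, Util.getUserAgent(context, "FlacDemo")),
        flacOnlyExtractorsFactory,
        /* eventHandler= */ null,
        /* eventListener= */ null);
  }

  private FlacMediaSourceFactory() {}
}
```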

### Using `LibflacAudioRenderer` ###

* If you're passing a `DefaultRenderersFactory` to
  `ExoPlayerFactory.newSimpleInstance`, you can enable using the extension by
  setting the `extensionRendererMode` parameter of the `DefaultRenderersFactory`
  constructor to `EXTENSION_RENDERER_MODE_ON`. This will use
  `LibflacAudioRenderer` for playback if `MediaCodecAudioRenderer` doesn't
  support the input format. Pass `EXTENSION_RENDERER_MODE_PREFER` to give
  `LibflacAudioRenderer` priority over `MediaCodecAudioRenderer`.
* If you've subclassed `DefaultRenderersFactory`, add a `LibflacAudioRenderer`
  to the output list in `buildAudioRenderers`. ExoPlayer will use the first
  `Renderer` in the list that supports the input media format.
* If you've implemented your own `RenderersFactory`, return a
  `LibflacAudioRenderer` instance from `createRenderers`. ExoPlayer will use the
  first `Renderer` in the returned array that supports the input media format.
* If you're using `ExoPlayerFactory.newInstance`, pass a `LibflacAudioRenderer`
  in the array of `Renderer`s (see the sketch below). ExoPlayer will use the
  first `Renderer` in the list that supports the input media format.

Note: These instructions assume you're using `DefaultTrackSelector`. If you have
a custom track selector the choice of `Renderer` is up to your implementation,
so you need to make sure you are passing a `LibflacAudioRenderer` to the
player, then implement your own logic to use the renderer for a given track.
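As a sketch of the last option, an audio-only player built around
`LibflacAudioRenderer` could look like this (assuming the renderer's no-argument
constructor; the `FlacPlayerFactory` class name is illustrative):

```java
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.Renderer;
import com.google.android.exoplayer2.ext.flac.LibflacAudioRenderer;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;

public final class FlacPlayerFactory {

  /** Creates an audio-only player whose single renderer is LibflacAudioRenderer. */
  public static ExoPlayer create() {
    // With only one renderer in the array, DefaultTrackSelector maps supported
    // audio tracks to LibflacAudioRenderer.
    Renderer[] renderers = {new LibflacAudioRenderer()};
    return ExoPlayerFactory.newInstance(renderers, new DefaultTrackSelector());
  }

  private FlacPlayerFactory() {}
}
```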

extensions/opus/README.md

@@ -2,18 +2,17 @@

## Description ##

The Opus extension provides `LibopusAudioRenderer`, which uses libopus (the
Opus decoding library) to decode Opus audio.

## Build instructions ##

To use this extension you need to clone the ExoPlayer repository and depend on
its modules locally. Instructions for doing this can be found in ExoPlayer's
[top level README][].

In addition, it's necessary to build the extension's native components as
follows:

* Set the following environment variables:
@@ -59,3 +58,31 @@ ${NDK_PATH}/ndk-build APP_ABI=all -j4

* Clean and re-build the project.
* If you want to use your own version of libopus, place it in
  `${OPUS_EXT_PATH}/jni/libopus`.

## Using the extension ##

Once you've followed the instructions above to check out, build and depend on
the extension, the next step is to tell ExoPlayer to use `LibopusAudioRenderer`.
How you do this depends on which player API you're using:

* If you're passing a `DefaultRenderersFactory` to
  `ExoPlayerFactory.newSimpleInstance`, you can enable using the extension by
  setting the `extensionRendererMode` parameter of the `DefaultRenderersFactory`
  constructor to `EXTENSION_RENDERER_MODE_ON`. This will use
  `LibopusAudioRenderer` for playback if `MediaCodecAudioRenderer` doesn't
  support the input format. Pass `EXTENSION_RENDERER_MODE_PREFER` to give
  `LibopusAudioRenderer` priority over `MediaCodecAudioRenderer`.
* If you've subclassed `DefaultRenderersFactory`, add a `LibopusAudioRenderer`
  to the output list in `buildAudioRenderers` (see the sketch below). ExoPlayer
  will use the first `Renderer` in the list that supports the input media
  format.
* If you've implemented your own `RenderersFactory`, return a
  `LibopusAudioRenderer` instance from `createRenderers`. ExoPlayer will use the
  first `Renderer` in the returned array that supports the input media format.
* If you're using `ExoPlayerFactory.newInstance`, pass a `LibopusAudioRenderer`
  in the array of `Renderer`s. ExoPlayer will use the first `Renderer` in the
  list that supports the input media format.

Note: These instructions assume you're using `DefaultTrackSelector`. If you have
a custom track selector the choice of `Renderer` is up to your implementation,
so you need to make sure you are passing a `LibopusAudioRenderer` to the
player, then implement your own logic to use the renderer for a given track.
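To illustrate the second option, here is a sketch of a `DefaultRenderersFactory`
subclass that appends a `LibopusAudioRenderer`. The `buildAudioRenderers`
override signature below is an assumption based on the 2.x factory from around
the time of this change and has varied between releases, so copy the exact
parameter list from the `DefaultRenderersFactory` in your checkout; the
`OpusRenderersFactory` class name is illustrative:

```java
import android.content.Context;
import android.os.Handler;
import com.google.android.exoplayer2.DefaultRenderersFactory;
import com.google.android.exoplayer2.Renderer;
import com.google.android.exoplayer2.audio.AudioProcessor;
import com.google.android.exoplayer2.audio.AudioRendererEventListener;
import com.google.android.exoplayer2.drm.DrmSessionManager;
import com.google.android.exoplayer2.drm.FrameworkMediaCrypto;
import com.google.android.exoplayer2.ext.opus.LibopusAudioRenderer;
import java.util.ArrayList;

public final class OpusRenderersFactory extends DefaultRenderersFactory {

  public OpusRenderersFactory(Context context) {
    super(context);
  }

  @Override
  protected void buildAudioRenderers(
      Context context,
      DrmSessionManager<FrameworkMediaCrypto> drmSessionManager,
      AudioProcessor[] audioProcessors,
      Handler eventHandler,
      AudioRendererEventListener eventListener,
      int extensionRendererMode,
      ArrayList<Renderer> out) {
    // Keep the default MediaCodec-based audio renderers.
    super.buildAudioRenderers(context, drmSessionManager, audioProcessors, eventHandler,
        eventListener, extensionRendererMode, out);
    // Append LibopusAudioRenderer so it's used when the earlier renderers can't
    // handle the input format.
    out.add(new LibopusAudioRenderer(eventHandler, eventListener));
  }
}
```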

extensions/vp9/README.md

@@ -2,18 +2,17 @@

## Description ##

The VP9 extension provides `LibvpxVideoRenderer`, which uses libvpx (the VPx
decoding library) to decode VP9 video.

## Build instructions ##

To use this extension you need to clone the ExoPlayer repository and depend on
its modules locally. Instructions for doing this can be found in ExoPlayer's
[top level README][].

In addition, it's necessary to build the extension's native components as
follows:

* Set the following environment variables:
@@ -76,3 +75,38 @@ ${NDK_PATH}/ndk-build APP_ABI=all -j4

  `${VP9_EXT_PATH}/jni/libvpx` or `${VP9_EXT_PATH}/jni/libyuv` respectively. But
  please note that `generate_libvpx_android_configs.sh` and the makefiles need
  to be modified to work with arbitrary versions of libvpx and libyuv.

## Using the extension ##

Once you've followed the instructions above to check out, build and depend on
the extension, the next step is to tell ExoPlayer to use `LibvpxVideoRenderer`.
How you do this depends on which player API you're using:

* If you're passing a `DefaultRenderersFactory` to
  `ExoPlayerFactory.newSimpleInstance`, you can enable using the extension by
  setting the `extensionRendererMode` parameter of the `DefaultRenderersFactory`
  constructor to `EXTENSION_RENDERER_MODE_ON`. This will use
  `LibvpxVideoRenderer` for playback if `MediaCodecVideoRenderer` doesn't
  support decoding the input VP9 stream. Pass `EXTENSION_RENDERER_MODE_PREFER`
  to give `LibvpxVideoRenderer` priority over `MediaCodecVideoRenderer`.
* If you've subclassed `DefaultRenderersFactory`, add a `LibvpxVideoRenderer`
  to the output list in `buildVideoRenderers`. ExoPlayer will use the first
  `Renderer` in the list that supports the input media format.
* If you've implemented your own `RenderersFactory`, return a
  `LibvpxVideoRenderer` instance from `createRenderers`. ExoPlayer will use the
  first `Renderer` in the returned array that supports the input media format.
* If you're using `ExoPlayerFactory.newInstance`, pass a `LibvpxVideoRenderer`
  in the array of `Renderer`s. ExoPlayer will use the first `Renderer` in the
  list that supports the input media format.

Note: These instructions assume you're using `DefaultTrackSelector`. If you have
a custom track selector the choice of `Renderer` is up to your implementation,
so you need to make sure you are passing a `LibvpxVideoRenderer` to the
player, then implement your own logic to use the renderer for a given track.

`LibvpxVideoRenderer` can optionally output to a `VpxVideoSurfaceView` when not
being used via `SimpleExoPlayer`, in which case color space conversion will be
performed using a GL shader. To enable this mode, send the renderer a message of
type `LibvpxVideoRenderer.MSG_SET_OUTPUT_BUFFER_RENDERER` with the
`VpxVideoSurfaceView` as its object, instead of sending `MSG_SET_SURFACE` with a
`Surface`, as sketched below.
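The sketch below combines this with passing the renderer to
`ExoPlayerFactory.newInstance`. It assumes the 2.x `ExoPlayer.sendMessages` and
`ExoPlayer.ExoPlayerMessage` API and a
`LibvpxVideoRenderer(scaleToFit, allowedJoiningTimeMs)` constructor, both of
which you should check against the sources in your checkout; the
`VpxPlaybackHelper` class name is illustrative:

```java
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.Renderer;
import com.google.android.exoplayer2.ext.vp9.LibvpxVideoRenderer;
import com.google.android.exoplayer2.ext.vp9.VpxVideoSurfaceView;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;

public final class VpxPlaybackHelper {

  /** Creates a VP9-only player and directs its output to the given VpxVideoSurfaceView. */
  public static ExoPlayer createPlayer(VpxVideoSurfaceView surfaceView) {
    // Assumed constructor arguments: scaleToFit, allowedJoiningTimeMs.
    LibvpxVideoRenderer videoRenderer = new LibvpxVideoRenderer(/* scaleToFit= */ true, 0);
    ExoPlayer player =
        ExoPlayerFactory.newInstance(new Renderer[] {videoRenderer}, new DefaultTrackSelector());
    // Route decoded output to the VpxVideoSurfaceView, which performs color space
    // conversion with a GL shader, instead of sending MSG_SET_SURFACE with a Surface.
    player.sendMessages(
        new ExoPlayer.ExoPlayerMessage(
            videoRenderer, LibvpxVideoRenderer.MSG_SET_OUTPUT_BUFFER_RENDERER, surfaceView));
    return player;
  }

  private VpxPlaybackHelper() {}
}
```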