After this change, GlEffects can use any GlTextureProcessor, not just
SingleFrameGlTextureProcessor.
MediaPipeProcessor now implements GlTextureProcessor directly, which
allows it to reuse MediaPipe's output texture as its own output texture
and avoids an extra copy shader step.
PiperOrigin-RevId: 456530718
After this change, FrameProcessorChain chains any GlTextureProcessors
instead of only SingleFrameGlTextureProcessors.
The GlTextureProcessors are chained bidirectionally using
ChainingGlTextureProcessorListener, which feeds input- and output-related
events forward and release events backwards.
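A minimal sketch of that bidirectional wiring, using hypothetical interface and method names rather than the actual library API:

```java
// Hypothetical shapes, for illustration only.
interface TextureProcessorSketch {
  interface Listener {
    void onOutputFrameAvailable(int texId, long presentationTimeUs);
    void onInputFrameProcessed(int texId);
  }

  void queueInputFrame(int texId, long presentationTimeUs);

  void releaseOutputFrame(int texId);
}

/** Connects two adjacent processors in the chain. */
final class ChainingListenerSketch implements TextureProcessorSketch.Listener {
  private final TextureProcessorSketch producer; // The earlier processor in the chain.
  private final TextureProcessorSketch consumer; // The later processor in the chain.

  ChainingListenerSketch(TextureProcessorSketch producer, TextureProcessorSketch consumer) {
    this.producer = producer;
    this.consumer = consumer;
  }

  @Override
  public void onOutputFrameAvailable(int texId, long presentationTimeUs) {
    // Forward direction: the producer's output frame becomes the consumer's input frame.
    consumer.queueInputFrame(texId, presentationTimeUs);
  }

  @Override
  public void onInputFrameProcessed(int texId) {
    // Backward direction: once the consumer is done with the texture, the producer can reuse it.
    producer.releaseOutputFrame(texId);
  }
}
```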
PiperOrigin-RevId: 456478414
This is useful for testing Transformer in the same way as it is used
in tests, and for seeing only the real transformation time.
PiperOrigin-RevId: 456058466
Based on
https://developer.android.com/reference/android/media/MediaCodec#using-an-output-surface,
frame dropping behaviour depends on the target SDK version.
After this change, Transformer only uses
MediaFormat#KEY_ALLOW_FRAME_DROP if both the target and system SDK
versions are at least 29. Otherwise it defaults to its pre-29 behaviour,
where each decoder output frame must be processed before a new one is
rendered, which prevents frame dropping.
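A hedged sketch of that gating using the platform APIs (not Transformer's exact code; class and method names here are illustrative):

```java
import android.content.Context;
import android.media.MediaFormat;
import android.os.Build;

/** Sketch of the SDK gating described above; not Transformer's exact code. */
final class FrameDropConfigSketch {
  static void maybeDisallowFrameDropping(Context context, MediaFormat decoderFormat) {
    int targetSdk = context.getApplicationInfo().targetSdkVersion;
    if (Build.VERSION.SDK_INT >= 29 && targetSdk >= 29) {
      // Value 0 asks MediaCodec not to drop frames rendered to the output surface.
      decoderFormat.setInteger(MediaFormat.KEY_ALLOW_FRAME_DROP, 0);
    }
    // Below 29 the key does not exist, so each decoder output frame is processed
    // before the next one is rendered, which prevents frame dropping.
  }
}
```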
Also remove deprecated Transformer.Builder constructor without a
context and the context setter.
PiperOrigin-RevId: 453971097
Transformer always enabled glAssertionsEnabled, so there should
be no functional change.
ExoPlayer previously disabled glAssertionsEnabled, so GlUtil logged
GlExceptions instead of throwing them. The GlExceptions are now
caught and logged by the callers so that there should also be no
functional change overall.
This change also replaces EGLSurfaceTexture#GlException with
GlUtil#GlException.
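A minimal sketch of the caller-side pattern, assuming a GlUtil call that throws the checked GlUtil.GlException introduced here; the specific call site and package are assumptions:

```java
import android.util.Log;
import androidx.media3.common.util.GlUtil;

/** Sketch of how ExoPlayer callers keep the old log-only behaviour. */
final class GlErrorLoggingSketch {
  private static final String TAG = "GlErrorLogging";

  static void checkGlErrorAndLog() {
    try {
      GlUtil.checkGlError(); // Assumed to throw GlUtil.GlException on a pending GL error.
    } catch (GlUtil.GlException e) {
      // Caught and logged rather than rethrown, so ExoPlayer's behaviour is unchanged.
      Log.e(TAG, "Failed to check GL errors", e);
    }
  }
}
```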
PiperOrigin-RevId: 453963741
SingleFrameGlTextureProcessor is now an abstract class containing a
default implementation of the more flexible GlTextureProcessor interface,
while still exposing the same simple abstract methods for single-frame
processing that it previously did.
FrameProcessorChain and GlEffect will be changed to use
GlTextureProcessor in follow-ups.
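A rough sketch of the adapter idea, with hypothetical names and signatures; the real abstract class also manages output textures, sizes and error reporting:

```java
// Hypothetical adapter: the queue-based contract is implemented once, while
// subclasses keep implementing a simple per-frame draw method.
abstract class SingleFrameSketch {
  interface Listener {
    void onOutputFrameAvailable(int outputTexId, long presentationTimeUs);
  }

  private Listener listener;
  private int outputTexId; // Allocated during size-dependent configuration (not shown).

  /** The simple per-frame method that subclasses still implement. */
  abstract void drawFrame(int inputTexId, long presentationTimeUs);

  public void setListener(Listener listener) {
    this.listener = listener;
  }

  /**
   * Default implementation of the queue-based contract: each queued input frame is drawn
   * immediately into the output texture and reported downstream.
   */
  public void queueInputFrame(int inputTexId, long presentationTimeUs) {
    drawFrame(inputTexId, presentationTimeUs);
    if (listener != null) {
      listener.onOutputFrameAvailable(outputTexId, presentationTimeUs);
    }
  }
}
```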
PiperOrigin-RevId: 453633000
Once the more advanced GlTextureProcessor interface exists,
it will be possible to change the output size of a GlTextureProcessor
between frames. To keep the reconfiguration based on frame sizes
minimal, things independent of the frame size, such as the GlProgram,
can be initialized in the constructor.
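For illustration, a hedged sketch of that split (hypothetical class and helper names, not library code):

```java
// Frame-size-independent state is created once in the constructor; size-dependent
// state is (re)created only when the frame size changes.
final class SizeAwareProcessorSketch {
  private final Object glProgram; // Stand-in for a compiled GlProgram; created once.
  private int configuredWidth;
  private int configuredHeight;
  private int outputTexId;

  SizeAwareProcessorSketch() {
    // Shader compilation doesn't depend on the frame size, so it happens up front.
    glProgram = compileAndLinkShaders();
  }

  /** Called per frame; only re-creates size-dependent state when the size changes. */
  void configure(int inputWidth, int inputHeight) {
    if (inputWidth == configuredWidth && inputHeight == configuredHeight) {
      return; // Nothing to do; keeps per-frame reconfiguration minimal.
    }
    configuredWidth = inputWidth;
    configuredHeight = inputHeight;
    outputTexId = allocateOutputTexture(inputWidth, inputHeight);
  }

  private static Object compileAndLinkShaders() {
    return new Object(); // Placeholder for GlProgram creation.
  }

  private static int allocateOutputTexture(int width, int height) {
    return 0; // Placeholder for texture allocation.
  }
}
```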
PiperOrigin-RevId: 451997584
Decode-only video frames (needed when the frame at / first frame after the
clipping start is not a key frame) need to be decoded but not passed to
the frame processor chain or encoder.
The clipping start offset needs to be removed from the frame timestamps
in the passthrough and video pipelines.
There are no changes needed for this in the audio pipeline, as it doesn't
use the input timestamps -- it uses its own timestamps derived from the
buffer sizes instead.
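A simplified sketch of the two video-pipeline rules above, with hypothetical names and structure:

```java
/** Simplified sketch of the clipping rules; names are illustrative. */
final class ClippingSketch {
  private static final long DROP_FRAME = -1;

  private final long clippingStartUs;

  ClippingSketch(long clippingStartUs) {
    this.clippingStartUs = clippingStartUs;
  }

  /** Returns the adjusted timestamp, or {@link #DROP_FRAME} if the decoded frame must be dropped. */
  long processDecodedFrame(boolean isDecodeOnly, long presentationTimeUs) {
    if (isDecodeOnly) {
      // The decoder needed this frame, but it must not reach the frame processor chain
      // or the encoder.
      return DROP_FRAME;
    }
    // Remove the clipping start offset so downstream components see timestamps starting at zero.
    return presentationTimeUs - clippingStartUs;
  }
}
```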
Also add demo option to try this out.
#minor-release
PiperOrigin-RevId: 451353609
Also update names of implementations to match design doc.
In follow-ups, SingleFrameGlTextureProcessor will become
an abstract implementation of a new GlTextureProcessor
interface.
The name "texture processor" makes sense as it processes OpenGL textures.
The term "frame processor" will be used for something else in
follow-ups.
PiperOrigin-RevId: 451142085
Put the cloud storage samples at the top, so that the sample at the top of the list isn't one served from a server we don't control.
Also update the labels to not mention the progressive container type, as it's irrelevant for Transformer, which always transmuxes even if it doesn't transcode.
#ame-bug-week
PiperOrigin-RevId: 450403784
This listener replaces
FrameProcessorChain#getAndRethrowBackgroundExceptions.
The listener uses a new exception type, FrameProcessingException,
separate from TransformationException, as the frame processing
components will soon be made reusable outside of Transformer.
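A sketch of the push-based error reporting, with hypothetical names standing in for the real listener and exception types:

```java
/** Hypothetical shapes illustrating the listener-based error propagation. */
final class FrameProcessingErrorSketch {
  /** Stand-in for the dedicated frame processing exception type. */
  static final class ProcessingException extends Exception {
    ProcessingException(Throwable cause) {
      super(cause);
    }
  }

  interface Listener {
    void onFrameProcessingError(ProcessingException exception);
  }

  private final Listener listener;

  FrameProcessingErrorSketch(Listener listener) {
    this.listener = listener;
  }

  void onGlThreadError(Throwable error) {
    // Previously callers had to poll getAndRethrowBackgroundExceptions(); now the error is
    // pushed to the listener, which can map it to a caller-specific type (e.g. Transformer
    // wraps it in a TransformationException).
    listener.onFrameProcessingError(new ProcessingException(error));
  }
}
```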
PiperOrigin-RevId: 447455746
This constant is used for https://docs.gl/es2/glVertexAttribPointer,
which takes the number of components per generic vertex attribute
(here, the size of the individual coordinate vectors), not the number
of attributes (the number of vertices, which the old constant name
referred to).
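For illustration, how the constant feeds the size parameter of glVertexAttribPointer (the constant and method names here are illustrative, not the library's):

```java
import android.opengl.GLES20;
import java.nio.Buffer;

/** Illustrates the renamed constant's meaning. */
final class VertexAttributeSketch {
  // 4 floats (x, y, z, w) per vertex position: the size of each coordinate vector,
  // not the number of vertices.
  private static final int POSITION_COORDS_PER_VERTEX = 4;

  static void bindPositionAttribute(int attributeLocation, Buffer positionData) {
    GLES20.glVertexAttribPointer(
        attributeLocation,
        POSITION_COORDS_PER_VERTEX, // "size" = components per generic vertex attribute.
        GLES20.GL_FLOAT,
        /* normalized= */ false,
        /* stride= */ 0,
        positionData);
  }
}
```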
PiperOrigin-RevId: 447427241
This change splits AdvancedFrameProcessor into 4 files:
- MatrixTransformationFrameProcessor for the GlFrameProcessor
implementation
- MatrixTransformation and GlMatrixTransformation for the GlEffect
specification
- MatrixUtils for the static matrix helpers
PiperOrigin-RevId: 446236384
Add a very short (1 second) video, so that some manual tests / prototyping,
including tests for the start and end of a video, encoder selection, or
changes applied to frames, can finish quickly.
PiperOrigin-RevId: 441756901
What a minimal implementation should include is now explained in the
interface javadoc, while the method name reflects what the method does.
PiperOrigin-RevId: 441432059
The new demo GlFrameProcessor is based on BitmapOverlayVideoProcessor
from the gl-demo. The demo-only GlFrameProcessor can be deleted once
Transformer supports this functionality.
PiperOrigin-RevId: 440059735
PeriodicDimmingFrameProcessor is an example of how a custom fragment
shader can be used to apply color changes that vary over time.
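A rough sketch of the idea, not the demo's actual shader: a time uniform derived from the frame timestamp scales the sampled color so the frame periodically dims and brightens (uniform and class names are assumptions):

```java
/** Hypothetical shader snippet showing a time-driven color change. */
final class PeriodicDimmingShaderSketch {
  static final String FRAGMENT_SHADER =
      "precision mediump float;\n"
          + "uniform sampler2D uTexSampler;\n"
          + "uniform float uTimeSeconds;\n"
          + "varying vec2 vTexCoords;\n"
          + "void main() {\n"
          + "  float dim = 0.5 + 0.5 * cos(uTimeSeconds);\n" // Oscillates between 0 and 1.
          + "  gl_FragColor = vec4(dim * texture2D(uTexSampler, vTexCoords).rgb, 1.0);\n"
          + "}\n";

  /** Converts a frame timestamp to the uniform value set before drawing the frame. */
  static float timeUniformValue(long presentationTimeUs) {
    return presentationTimeUs / 1_000_000f;
  }
}
```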
PiperOrigin-RevId: 439840609
The matrix provider allows the transformation matrix to be updated
for each frame based on the timestamp.
The following example effects using this were added to the demo:
* a zoom-in transition for the start of the video,
* cropping a rotating rectangular frame portion,
* rotating the frame around the y-axis in 3D.
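As an illustration of deriving the matrix from the frame timestamp, here is a hedged sketch of a zoom-in transition; the provider shape is hypothetical, and only android.opengl.Matrix is a real API:

```java
import android.opengl.Matrix;

/** Hypothetical timestamp-based matrix provider: zooms in from 50% to 100% over one second. */
final class ZoomInMatrixSketch {
  private static final long ZOOM_DURATION_US = 1_000_000;

  /** Returns the 4x4 transformation matrix to apply to the frame at the given timestamp. */
  static float[] getMatrix(long presentationTimeUs) {
    float[] matrix = new float[16];
    Matrix.setIdentityM(matrix, /* smOffset= */ 0);
    float progress = Math.min(1f, (float) presentationTimeUs / ZOOM_DURATION_US);
    float scale = 0.5f + 0.5f * progress; // 0.5 at t=0, 1.0 from t=1s onwards.
    Matrix.scaleM(matrix, /* mOffset= */ 0, scale, scale, /* z= */ 1f);
    return matrix;
  }
}
```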
PiperOrigin-RevId: 439791592
From Android T onwards `MediaCodec` supports requesting tone-mapping down to
SDR. Add an option to request this behavior and document that it isn't
supported before T. Also add an option in the demo app to try it out.
Tested manually on a prerelease build.
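A hedged sketch of how such a request can be expressed with the platform MediaFormat keys; this is not necessarily Transformer's exact code:

```java
import android.media.MediaFormat;
import android.os.Build;

/** Sketch of requesting decoder tone-mapping to SDR; only effective from Android T (API 33). */
final class ToneMapRequestSketch {
  static void maybeRequestSdrToneMapping(MediaFormat decoderFormat) {
    if (Build.VERSION.SDK_INT >= 33) {
      decoderFormat.setInteger(
          MediaFormat.KEY_COLOR_TRANSFER_REQUEST, MediaFormat.COLOR_TRANSFER_SDR_VIDEO);
    }
    // Before T the key is unavailable, so the option is documented as unsupported there.
  }
}
```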
PiperOrigin-RevId: 437765325
The muxer doesn't support HEVC below API 24. This is documented in
the TransformationRequest javadoc, and Transformer.Builder will
throw if HEVC is requested below API 24, so the option should not
be part of the demo on those devices.
#mse-bug-week
PiperOrigin-RevId: 429343805
- The resources were released twice before, which is not necessary since
the MSG_RELEASE message is already in the internal player queue.
- The demo app was failing because the stopwatch was stopped in
onTransformationError after being reset.
#minor-release
#mse-bug-week
PiperOrigin-RevId: 428794426
On a Samsung S21 with the light theme, the debug labels weren't showing up because the player view was visible (with a black background) while the transformation was in progress, matching the debug label text color. Hide the player view until the transformation is complete to fix this.
Also tweak the layout slightly to add space between the card border and the labels.
#mse-bug-week
PiperOrigin-RevId: 428455824
Tested by confirming transformations still work and write to an output file in a
scoped-storage directory on a:
* Nexus 6P API 23 emulator
* Google Pixel 4 API 31 physical device
PiperOrigin-RevId: 425644266
- Add a checkbox in the demo app to enable experimental HDR editing.
- Add an `experimental_` method to `TransformationRequest` to enable HDR editing.
- Add fragment/vertex shaders for the experimental HDR pipeline. The main difference compared to the existing shaders is that we sample from the decoder in YUV rather than RGB (because the YUV -> RGB conversion in the graphics driver is not precisely defined, so we need to do this to get consistent results), which requires the use of ES 3, and then do a crude YUV -> RGB conversion in the shader (ignoring the input color primaries for now).
- When HDR editing is enabled, we force using `FrameEditor` (no passthrough) to avoid the need to select another edit operation, and use the new shaders. The `EGLContext` and `EGLSurface` also need to be set up differently for this path.
PiperOrigin-RevId: 425570639