The decoder and encoder won't accept high values for frame rate, so avoid
setting the frame rate key when configuring the decoder, and set a default
value for the encoder (where the key is required).
Also skip SSIM calculation for 4K, where the device lacks concurrent decoding
support.
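As a rough sketch of the idea (the constants and helper below are hypothetical, not the project's actual code), the frame rate key is simply omitted from the decoder's `MediaFormat` and replaced with a default on the encoder's:

```java
import android.media.MediaFormat;

final class FrameRateWorkaroundSketch {
  private static final float DEFAULT_ENCODER_FRAME_RATE = 30.0f;
  private static final float MAX_ACCEPTED_FRAME_RATE = 1000.0f;

  /** Sets KEY_FRAME_RATE on the encoder format only, clamping unreasonable values. */
  static void configureFrameRate(MediaFormat encoderFormat, float requestedFrameRate) {
    // The decoder format is intentionally left without KEY_FRAME_RATE: very high
    // values can make configuration fail, and the key is optional for decoding.
    float encoderFrameRate =
        requestedFrameRate > MAX_ACCEPTED_FRAME_RATE
            ? DEFAULT_ENCODER_FRAME_RATE
            : requestedFrameRate;
    encoderFormat.setFloat(MediaFormat.KEY_FRAME_RATE, encoderFrameRate);
  }
}
```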
PiperOrigin-RevId: 585604976
(cherry picked from commit 8b38b34b9f02c3648d2988c4cdaca005fbf8f1cd)
Using `Integer.MAX_VALUE` risks causing arithmetic overflow in the codec
implementation.
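A trivial illustration of the risk (not code from any codec, just the wrap-around behaviour of `int`):

```java
int frameRate = Integer.MAX_VALUE;
// Any arithmetic a codec does with this value overflows int immediately.
int framesInTwoSeconds = frameRate * 2; // wraps around to -2
```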
Issue: androidx/media#810
PiperOrigin-RevId: 585104621
(cherry picked from commit ad40db448943fef579879c9c2a988f514b254109)
Move remote 8K file to local and trim to 320ms.
Trim done with ffmpeg:
`ffmpeg -i {remote_file} -t 0.3 -c:v copy -c:a copy 8k24fps_300ms.mp4`
PiperOrigin-RevId: 569449962
MediaCodecRenderer has already been updated to not rely on the
input stream to mark its samples as decode-only and instead use
a simple time-based comparison to achieve the same effect.
This change makes the same update for all other renderers that
either use the flag directly or forward to a "decoder" instance.
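A minimal sketch of the time-based comparison, assuming a renderer that tracks the stream's start position (names are illustrative, not the actual renderer code):

```java
/** Returns whether a decoded sample should be dropped rather than rendered. */
private static boolean isDecodeOnly(long presentationTimeUs, long startPositionUs) {
  // Samples before the start position still need to be decoded (e.g. as reference
  // frames) but must not be rendered, which is what the buffer flag used to signal.
  return presentationTimeUs < startPositionUs;
}
```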
PiperOrigin-RevId: 568232212
If getProgress is blocking whilst the internal thread calls endInternal
(for error or success), the condition is never opened. As a result,
onCompleted and onError are never surfaced to the app.
progressState is accessed from application and internal threads, so
should be marked volatile to prevent a thread caching the value.
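A minimal sketch of the resulting shape, assuming an android.os.ConditionVariable guards getProgress (the class and field names below are hypothetical):

```java
import android.os.ConditionVariable;

final class ProgressSignalSketch {
  private final ConditionVariable progressCondition = new ConditionVariable();
  // Written on the internal thread, read on the application thread.
  private volatile int progressState;

  int getProgress() {
    progressCondition.block(); // blocks until the internal thread opens the condition
    return progressState;
  }

  void endInternal(int finalProgressState) {
    progressState = finalProgressState;
    progressCondition.open(); // without this, a pending getProgress() never returns
    // ...then surface onCompleted/onError to the app listener.
  }
}
```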
PiperOrigin-RevId: 565720184
When we switch from MUXER_MODE_MUX_PARTIAL_VIDEO to MUXER_MODE_APPEND_VIDEO,
`muxedPartialVideo` is already `true`, so the `endTrack` method passes through
the `if (muxedPartialVideo)` check, which is incorrect.
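An illustrative reconstruction of the bug shape (the structure is assumed from the description above, not copied from MuxerWrapper):

```java
final class MuxerModeSketch {
  static final int MUXER_MODE_MUX_PARTIAL_VIDEO = 1;
  static final int MUXER_MODE_APPEND_VIDEO = 2;

  int muxerMode = MUXER_MODE_APPEND_VIDEO; // just switched from MUX_PARTIAL_VIDEO
  boolean muxedPartialVideo = true; // still true from the partial-video export

  void endTrack() {
    if (muxedPartialVideo) {
      // Incorrect in append mode: the partial-video branch runs again and the
      // track is never properly ended.
      return;
    }
    // Normal end-of-track handling would go here.
  }
}
```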
PiperOrigin-RevId: 565398117
The logic that handles components' boundaries is grouped together in private
methods, such as the handling of VideoCompositor's output textures.
PiperOrigin-RevId: 565131579
For the pause and resume feature we need to use the same `MuxerWrapper`
for remuxing the processed video and then for processing the remaining video.
In order to use the same `MuxerWrapper` across different exports,
we need to preserve its state.
PiperOrigin-RevId: 564728396
Modify dumping so that it does not require "released" to be called.
Track index is an arbitrary value based on the order of addTrack calls.
Samples are dumped by track (rather than as soon as they are written),
so it's preferable to use a value that provides more context.
By using the track type as a key, dump files will be more deterministic
and will have more similarities when branched.
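A hedged sketch of keying dumps by track type (the helper is hypothetical; only the Media3 MimeTypes/C calls are real):

```java
import androidx.media3.common.C;
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;

final class DumpKeySketch {
  /** Returns a stable dump key derived from the track type instead of the track index. */
  static String dumpKeyFor(Format format) {
    int trackType = MimeTypes.getTrackType(format.sampleMimeType);
    if (trackType == C.TRACK_TYPE_VIDEO) {
      return "video";
    } else if (trackType == C.TRACK_TYPE_AUDIO) {
      return "audio";
    } else {
      return "other";
    }
  }
}
```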
PiperOrigin-RevId: 563700982
That said, only SDR is supported for now, so this will always throw if
HDR is input. This will also throw if different ColorInfo values are input,
because SDR color mixing (e.g. between sRGB and BT709) is not yet supported.
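A hedged sketch of that check (the method is illustrative; ColorInfo.isTransferHdr is the real Media3 helper):

```java
import androidx.annotation.Nullable;
import androidx.media3.common.ColorInfo;

final class InputColorCheckSketch {
  static void checkNewInputColor(ColorInfo newColor, @Nullable ColorInfo existingColor) {
    if (ColorInfo.isTransferHdr(newColor)) {
      throw new UnsupportedOperationException("HDR input is not supported yet");
    }
    if (existingColor != null && !existingColor.equals(newColor)) {
      throw new UnsupportedOperationException(
          "Mixing different SDR ColorInfo values (e.g. sRGB and BT709) is not supported yet");
    }
  }
}
```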
PiperOrigin-RevId: 563457729
For the pause and resume feature we will need to build `ExportResult`
from multiple internal exports. Keeping the `ExportResultBuilder` in the
`Transformer` class will allow using the same builder across different
internal exports.
PiperOrigin-RevId: 563392443
When checking the color transfer, there is no reason that the file should have
exactly two tracks, and this assertion means that this method can't be used
as-is for checking video-only files, for example.
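A hedged sketch of a version without the two-track assertion, using only the framework MediaExtractor APIs (this is not the project's actual test util):

```java
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

final class ColorTransferCheckSketch {
  /** Returns the color transfer of the first video track, regardless of track count. */
  static int getVideoColorTransfer(String filePath) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    try {
      extractor.setDataSource(filePath);
      for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mimeType = format.getString(MediaFormat.KEY_MIME);
        if (mimeType != null && mimeType.startsWith("video/")) {
          return format.getInteger(
              MediaFormat.KEY_COLOR_TRANSFER, MediaFormat.COLOR_TRANSFER_SDR_VIDEO);
        }
      }
      throw new IllegalArgumentException("No video track in " + filePath);
    } finally {
      extractor.release();
    }
  }
}
```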
PiperOrigin-RevId: 562813111
Overlays may be overlaid over:
* In VideoFrameProcessor, videos or images (or texture input).
* In Compositor, other videos.
In Compositor, Overlays may consist of video, so it could be confusing
for videoFrameAnchor to contrast with overlayAnchor.
Also, rename overlayAnchor to overlayFrameAnchor, since it modifies the anchor
in the overlay's frame, which makes the name slightly more precise.
PiperOrigin-RevId: 562004292
Implement VideoCompositor support for:
* Different input and output sizes
* CompositorSettings, to customize output size based on input texture sizes
* OverlaySettings, to place an input frame in an arbitrary position on
the output frame.
Also, refactor Overlay's matrix logic to make it more reusable between
Compositor and Overlays.
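A hedged usage sketch of the OverlaySettings placement mentioned above (the anchor is assumed to use normalized device coordinates in [-1, 1]; setOverlayFrameAnchor is the builder call named in these messages):

```java
import androidx.media3.effect.OverlaySettings;

final class OverlayPlacementSketch {
  /** Anchors the overlay by its bottom-left corner rather than its centre. */
  static OverlaySettings bottomLeftAnchoredOverlay() {
    return new OverlaySettings.Builder()
        .setOverlayFrameAnchor(/* x= */ -1f, /* y= */ -1f)
        .build();
  }
}
```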
PiperOrigin-RevId: 561931854
For the pause and resume feature we will remux the previously processed
video in the first export, and then in the second export we will resume
processing the remaining video. For the second export we will have to
set an initial timestamp offset so that the video samples from the second
export continue from the previously muxed samples.
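A hypothetical sketch of the offset (names assumed): samples written during the second export are shifted so they follow the samples already muxed by the first export rather than restarting at zero.

```java
final class SampleTimeOffsetSketch {
  private final long initialTimestampOffsetUs;

  SampleTimeOffsetSketch(long initialTimestampOffsetUs) {
    this.initialTimestampOffsetUs = initialTimestampOffsetUs; // e.g. duration already muxed
  }

  /** Maps a second-export sample time onto the muxer's continuous timeline. */
  long toMuxerTimeUs(long exportSampleTimeUs) {
    return initialTimestampOffsetUs + exportSampleTimeUs;
  }
}
```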
PiperOrigin-RevId: 561689651
For pause and resume we need to perform multiple intermediate exports,
so the start() logic has been moved into a separate method so that it can be
reused from different places with different Compositions and MuxerWrappers.
PiperOrigin-RevId: 561675472