Add `VideoFrameProcessor.registerInputStream()` to signal a new type of input,
and `InputHandler.signalEndOfCurrentInputStream()` to signal to the `InputHandler`
that the current input stream (but not all input) is complete. An input stream
counts as fully processed once `FinalShaderProgramWrapper` releases its last frame.
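As a rough usage sketch (the `registerInputStream()` parameter shown here is an assumption, not necessarily the final signature):

```java
import androidx.media3.common.VideoFrameProcessor;

// Hypothetical sketch: register each new stream type before queueing its frames.
// Registering the next stream signals completion of the current one to the
// InputHandler internally.
void queueTwoStreams(VideoFrameProcessor videoFrameProcessor) {
  videoFrameProcessor.registerInputStream(VideoFrameProcessor.INPUT_TYPE_SURFACE);
  // ... queue the decoder frames for this stream ...
  videoFrameProcessor.registerInputStream(VideoFrameProcessor.INPUT_TYPE_BITMAP);
  // ... queue the bitmaps for this stream ...
  videoFrameProcessor.signalEndOfInput(); // Ends all input, not just the current stream.
}
```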
PiperOrigin-RevId: 527356646
Allow the VideoFrameProcessor to output to a texture without an output surface.
Tested by updating texture output tests to no longer output to a surface.
PiperOrigin-RevId: 527244605
`prepare()` now logs a warning if it's called before `setPlayer()`
because it's not possible to tell if it's being called on the wrong
thread (since 3480a27994).
This change finds all the places where one is called immediately after the
other and flips the order so the calls are correct.
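As a purely hypothetical illustration of the flip (the interface below is a stand-in, not an actual call site):

```java
import androidx.media3.common.Player;

// Hypothetical stand-in for the affected classes; names are illustrative only.
interface PlayerHost {
  void setPlayer(Player player);
  void prepare();
}

void attach(PlayerHost component, Player player) {
  // Before: prepare() ran first and logged a warning, because without a player
  // the component cannot verify the calling thread.
  component.setPlayer(player); // Establishes the player and its application thread.
  component.prepare(); // Can now be checked against the correct thread.
}
```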
Issue: androidx/media#350
#minor-release
PiperOrigin-RevId: 526582294
If the limited number of input buffers causes all samples to be read except the last one, which conveys the end of stream, then the last frame will not be rendered.
PiperOrigin-RevId: 525974445
The sessions may have different application threads for their players,
and the service with its notification provider runs on the main thread.
To ensure everything runs on the correct thread, this change labels
methods where needed and fixes thread access in some places.
Issue: androidx/media#318
PiperOrigin-RevId: 524849598
Previously, ExoPlayerImpl had volume flags hardcoded to SHOW_UI, but now the developer can choose what happens on volume change. The old methods have been deprecated.
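For example, a caller can now choose the flags explicitly; a minimal sketch using the Media3 `C.VOLUME_FLAG_*` constants:

```java
import androidx.media3.common.C;
import androidx.media3.common.Player;

void adjustVolume(Player player) {
  // Show the system volume UI, as the deprecated overloads always did.
  player.setDeviceVolume(/* volume= */ 5, C.VOLUME_FLAG_SHOW_UI);
  // Or adjust silently, by passing no flags.
  player.increaseDeviceVolume(/* flags= */ 0);
}
```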
PiperOrigin-RevId: 523974358
It is not possible to provide a safe deprecation path because
BaseTrackSelection can't easily know which of the methods is
implemented by subclasses.
PiperOrigin-RevId: 523471578
Before this CL, SurfaceTexture.onFrameAvailable was used to tell whether a frame
was available in the VideoFrameProcessor's output texture. This was incorrect, as
it relied on the texture being written to before the
SurfaceTexture.onFrameAvailableListener was invoked, leading to null pointer
exceptions on timeouts.
Instead of DefaultVideoFrameProcessor having different interfaces to request
texture output and to retrieve that output texture, use one interface that sets
a listener, and render to a texture iff that listener is set. As this listener
is executed on the GL thread, this also removes the need to expand visibility
for the GL task executor and tasks.
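A rough sketch of the listener-based setup (the builder method and listener shape here are assumptions based on this description, not verbatim API):

```java
import androidx.media3.effect.DefaultVideoFrameProcessor;

// Hypothetical sketch: output is rendered to a texture iff a listener is set.
DefaultVideoFrameProcessor.Factory factory =
    new DefaultVideoFrameProcessor.Factory.Builder()
        .setTextureOutput(
            (outputTexture, presentationTimeUs) -> {
              // Runs on the GL thread, after the texture has been rendered to,
              // so reading the texture here cannot race the producer.
            })
        .build();
```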
PiperOrigin-RevId: 522362101
This includes:
- Add an ad for each LOADED event of the SDK, taking the ad duration from the
media structure so that it exactly matches the start position of the ads, and
then use `addLiveAdBreak()`, which is already used for HLS live.
- When the refreshed content timeline arrives, correct the duration of an ad
that was inserted while the period duration was still unknown (the last period
of the live timeline), where needed.
- When an ad period is removed, put the ad group into a state that allows
playback to continue.
PiperOrigin-RevId: 520919236
The `DashMediaSource` wrongly added an offset to the media times set
on the `MediaLoadData`. With this offset, the `startTimeMs` and `endTimeMs`
don't represent positions in the period but in the stream.
`DashMediaSource` was the only call site setting the offset to a
non-zero value, so if we use 0 for `DashMediaSource` as well, the
offset is redundant and we can remove it everywhere.
PiperOrigin-RevId: 520682026
Also fixed the javadoc link in devsite and removed javadoc links from the decoder extensions, as they are not yet published on developer.android.com.
#minor-release
PiperOrigin-RevId: 520636868
Previously, we always used ImageReader to read the output of
DefaultVideoFrameProcessor in pixel tests. This has the limitation of not being
able to read HDR content, so we couldn't support HDR pixel tests.
Reading from a texture allows us to use glReadPixels to read the output of
DefaultVideoFrameProcessor, and to build upon this to implement HDR pixel tests.
We still want tests for surface output, though, because real use cases will only
output to surfaces.
Also, add some tests for outputting to textures, since this test infrastructure
is a bit complex.
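A minimal sketch of the texture readback for the SDR case, assuming the output texture is already attached to the currently bound framebuffer:

```java
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/** Reads RGBA pixels from the currently bound framebuffer. */
static ByteBuffer readPixels(int width, int height) {
  ByteBuffer pixelBuffer =
      ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder());
  GLES20.glReadPixels(
      /* x= */ 0, /* y= */ 0, width, height,
      GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer);
  return pixelBuffer;
}
```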
PiperOrigin-RevId: 519786535
This change makes sure that live ad periods that have already been played are
skipped when they are about to be added to the queue. To make this work,
the existing filter logic had to take into account the content
resume offset that live periods use.
PiperOrigin-RevId: 518068138
This change makes sure that the `AdPlaybackState` of any period can
contain an empty postroll placeholder.
The placeholder postroll should be represented in the `MediaPeriodId`
of a content period as `nextAdGroupIndex`, but should be ignored when
building the list of `MediaPeriodInfo` in the `MediaPeriodQueue`. This
is required so that an ad group can be added to the ad playback state of
the content period that is currently being played, an ad period can be
instantly inserted into the media period queue, and playback can
immediately transition to the new period.
This change ensures and tests that
- a live server-side inserted postroll placeholder can be inserted into
  an `AdPlaybackState` in a well-defined and tested way (helper method),
- a postroll placeholder is NOT ignored when
  `AdPlaybackState.getAdGroupIndexAfterPositionUs` is called (this
  is required when evaluating the `nextAdGroupIndex`),
- a postroll placeholder is ignored when
  `AdPlaybackState.getAdGroupIndexForPositionUs` is called (this is
  required to not attempt to play the ad, and is analogous to ignoring
  the postroll placeholder in a single-period timeline),
- `MediaPeriod.getFollowingMediaPeriodInfo()` does not include a
  `MediaPeriodInfo` for the placeholder postroll when building the
  queue.
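A rough sketch of the tested behavior (the helper method name is an assumption based on this description):

```java
import androidx.media3.common.AdPlaybackState;
import androidx.media3.common.C;

void postrollPlaceholderBehavior() {
  // Helper name is an assumption based on this description.
  AdPlaybackState adPlaybackState =
      new AdPlaybackState(/* adsId= */ "adsId").withLivePostrollPlaceholderAppended();

  // The placeholder IS found when evaluating the nextAdGroupIndex...
  int nextAdGroupIndex =
      adPlaybackState.getAdGroupIndexAfterPositionUs(
          /* positionUs= */ 0, /* periodDurationUs= */ C.TIME_UNSET);

  // ...but IS ignored when looking up the ad group to play at a position.
  int adGroupToPlay =
      adPlaybackState.getAdGroupIndexForPositionUs(
          /* positionUs= */ 0, /* periodDurationUs= */ C.TIME_UNSET);
  // adGroupToPlay == C.INDEX_UNSET: the placeholder is never played directly.
}
```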
PiperOrigin-RevId: 515079136
This timeline will be used in unit test cases of follow-up
CLs. It can be used to emulate the timeline created by a
multi-period live media source as real time advances.
PiperOrigin-RevId: 512665552
The following test cases are added:
1. Mux H264 video
2. Mux H265 video
3. Mux HDR video
Each test case performs the following actions:
1. Extract the track and samples from the input MP4 using MediaExtractor (see the sketch below).
2. Feed those samples into the MP4 muxer.
3. Use the extractor to extract the samples from the muxed file and create a dump file.
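A minimal sketch of step 1's extraction loop using the framework MediaExtractor (feeding the muxer is elided, as its API surface isn't shown here):

```java
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

void extractSamples(String inputPath) throws Exception {
  MediaExtractor extractor = new MediaExtractor();
  extractor.setDataSource(inputPath);
  MediaFormat format = extractor.getTrackFormat(/* index= */ 0);
  extractor.selectTrack(/* index= */ 0);
  ByteBuffer sampleBuffer = ByteBuffer.allocate(1024 * 1024);
  while (true) {
    int sampleSize = extractor.readSampleData(sampleBuffer, /* offset= */ 0);
    if (sampleSize == -1) {
      break; // No samples left.
    }
    long presentationTimeUs = extractor.getSampleTime();
    // ... feed the sample into the MP4 muxer here ...
    extractor.advance();
  }
  extractor.release();
}
```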
PiperOrigin-RevId: 512589069
In most cases, this means updating Subject subclasses and their assertThat methods to accept null actual values. (They should always accept null so that assertions like "assertThat(foo).isNull()" succeed instead of throwing NullPointerException.)
Occasionally, it involves other changes, like writing `isGreaterThan(1L)` instead of `isGreaterThan(1)` to resolve an ambiguity in Kotlin overload resolution.
PiperOrigin-RevId: 511776581
Otherwise, a lack of HDR decoding support will cause the tests that check output files for HDR output, like HdrEditingTest.transform_noRequestedTranscode_hdr10File_transformsOrThrows, to fail.
PiperOrigin-RevId: 510213020
GLEffectsFrameProcessor, MatrixShaderProgram and FinalMatrixShaderProgramWrapper are currently set up to handle input frames coming from an external input (i.e. a video decoder). Image input is loaded into Bitmap objects at the start of the pipeline, so it is not produced externally. These changes provide a way for the frame processing pipeline to handle this "internal" (i.e. non-external) input.
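A hypothetical sketch of queueing such internal input (the `queueInputBitmap()` method and its parameters are assumptions based on this description, not verbatim API):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Hypothetical sketch; the method name and parameters are assumptions.
// FrameProcessor is the interface named elsewhere in this log.
void queueImage(FrameProcessor frameProcessor) {
  Bitmap bitmap = BitmapFactory.decodeFile("/path/to/image.png");
  frameProcessor.queueInputBitmap(
      bitmap, /* durationUs= */ 3_000_000, /* frameRate= */ 30);
}
```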
PiperOrigin-RevId: 508645244
Based on [this conversation thread](https://chat.google.com/room/AAAA--f88ao/76Rem_cRCK8), I've opted to update the existing FrameProcessor.create() rather than deprecate it, as it is unlikely to be in use by apps outside google3.
PiperOrigin-RevId: 506920930
Add checks to GL tone-mapping pixel tests, to ensure the device's decoder, API
version, and OpenGL implementation support GL tone-mapping before attempting it.
These tests should be run on mobile harness, to detect per-device failures, and
so are moved to transformer/mh. Per b/263395272, these tests should ultimately
be in an effect/mh directory.
PiperOrigin-RevId: 505749974
Also, omit the "actual" label from output files, as this boilerplate isn't necessary
(there is no other saved filename, like "expected", that it would disambiguate from).
PiperOrigin-RevId: 502378188
A frame cache compensates for fluctuations in frame processing times (a code
sketch follows below).
Imagine a frame takes 10ms to process and the interval between two frames is
33ms, but the third frame takes 40ms to process.
If we don't have a frame cache:
- Process frame 1, ready after 10ms; playback starts, now t=0ms
- Start processing frame 2, ready at t=10ms
- Release frame 2 at t=33ms
- Start processing frame 3 at t=33ms
- Frame 3 is due for presentation at t=66ms
- But frame 3 is only ready at t=73ms: late
If we have a frame cache of, say, 3 frames:
- Process frame 1, ready after 10ms; playback starts, now t=0ms
- Start processing frame 2, ready at t=10ms
- Start processing frame 3, ready at t=50ms
- Release frame 2 at t=33ms
- Start processing frame 4, ready at t=60ms
- Frame 3 is due for presentation at t=66ms
- Frame 3 isn't late
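Media3 exposes this as the `FrameCache` GlEffect; a minimal sketch of adding a 3-frame cache to an effects chain:

```java
import androidx.media3.common.Effect;
import androidx.media3.effect.FrameCache;
import com.google.common.collect.ImmutableList;

/** Builds an effects chain that caches up to 3 processed frames. */
static ImmutableList<Effect> effectsWithFrameCache() {
  // One slow frame (40ms in the example above) can be absorbed because later
  // frames are processed ahead of their presentation times.
  return ImmutableList.of(new FrameCache(/* capacity= */ 3));
}
```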
PiperOrigin-RevId: 501869948
Move BitmapTestUtil from media3.effect to media3.test.utils.
This allows this class to be accessed from both transformer and effect tests.
Refactoring change. No functional change intended.
PiperOrigin-RevId: 501837130