This CL introduces the new public API setSuppressPlaybackWhenUnsuitableOutput which, if set to true, suppresses a requested playback when it would happen on an unsuitable audio output (e.g. the built-in speaker on a Wear OS device).
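A minimal usage sketch, assuming the setter lives on `ExoPlayer.Builder` (this message only names the method, not its placement):

```java
// Illustrative only: assumes setSuppressPlaybackWhenUnsuitableOutput() is on ExoPlayer.Builder.
ExoPlayer player =
    new ExoPlayer.Builder(context)
        .setSuppressPlaybackWhenUnsuitableOutput(true)
        .build();
// With this set, a playback request that would be routed to an unsuitable output
// (e.g. the Wear OS built-in speaker) is suppressed instead of started.
```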
PiperOrigin-RevId: 538867212
*** Original commit ***
Set video size to 0/0 when video render is disabled
In terms of `MediaCodecVideoRenderer` (MCVR) with a `VideoRendererEventListener`, the
video size is set to 0/0 right after `onVideoDisabled()` is called and is set to the
actual size as soon as the video size is known after `onVideoEnabled()`.
For ExoPlayer and in terms of the `Player` interface, `Player.getVideoSize()`
returns a video size of 0/0 when `Player.getCurrentTracks()` does not support
`C.TRACK_TYPE_VIDEO`. This is ensured by the masking behavior of `ExoPlayerImpl`
that sets an empty track selection result when the playing period changes due to
a seek or timeline removal.
***
PiperOrigin-RevId: 537938947
*** Original commit ***
ExoPlayer: Add setVideoFrameProcessorFactory().
This allows apps to use a custom VideoFrameProcessor implementation for video
playback. This may be useful, for example, when outputting to a texture.
***
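A hedged sketch of how an app might plug in a custom factory; the `ExoPlayer.Builder` placement and the factory class shown are assumptions for illustration, not taken from this commit:

```java
// Hypothetical custom factory that renders frames to a texture.
VideoFrameProcessor.Factory textureOutputFactory = new TextureOutputFrameProcessor.Factory();

// Illustrative only: assumes the new setter is exposed on ExoPlayer.Builder.
ExoPlayer player =
    new ExoPlayer.Builder(context)
        .setVideoFrameProcessorFactory(textureOutputFactory)
        .build();
```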
PiperOrigin-RevId: 536391597
This change moves the default logic into the actual Player
implementations, but does not introduce any behavior changes compared
to addMediaItems+removeMediaItems except to make the updates "atomic"
in ExoPlayerImpl, SimpleBasePlayer and MediaController. It also
provides backwards compatibility for cases where Players don't support
the operation.
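Assuming the operation in question is the atomic media item replacement tracked in the linked issue, a usage sketch might look like this (the method name `replaceMediaItems` is an assumption, not confirmed by this message):

```java
// Hypothetical: replace the items at indices 1..2 in one atomic update instead of
// calling addMediaItems() followed by removeMediaItems().
player.replaceMediaItems(
    /* fromIndex= */ 1,
    /* toIndex= */ 3,
    ImmutableList.of(
        MediaItem.fromUri("https://example.com/new1.mp4"),
        MediaItem.fromUri("https://example.com/new2.mp4")));
```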
Issue: google/ExoPlayer#8046
#minor-release
PiperOrigin-RevId: 534945089
This allows apps to use a custom VideoFrameProcessor implementation for video
playback. This may be useful, for example, when outputting to a texture.
PiperOrigin-RevId: 534044831
In terms of `MediaCodecVideoRenderer` (MCVR) with a `VideoRendererEventListener`, the
video size is set to 0/0 right after `onVideoDisabled()` is called and is set to the
actual size as soon as the video size is known after `onVideoEnabled()`.
For ExoPlayer and in terms of the `Player` interface, `Player.getVideoSize()`
returns a video size of 0/0 when `Player.getCurrentTracks()` does not support
`C.TRACK_TYPE_VIDEO`. This is ensured by the masking behavior of
`ExoPlayerImpl` that sets an empty track selection result when the playing
period changes due to a seek or timeline removal.
When transitioning playback from a video media item to the next, or when
seeking within the same video media item, the renderer is not disabled.
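For illustration, an app can observe the 0/0 signal through the existing `Player.Listener` API; the surface-toggling helpers below are hypothetical:

```java
player.addListener(
    new Player.Listener() {
      @Override
      public void onVideoSizeChanged(VideoSize videoSize) {
        if (videoSize.width == 0 && videoSize.height == 0) {
          // Per this change: no video track is selected or the video renderer is disabled.
          hideVideoSurface(); // hypothetical app helper
        } else {
          showVideoSurface(videoSize.width, videoSize.height); // hypothetical app helper
        }
      }
    });
```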
#minor-release
PiperOrigin-RevId: 533479600
This allows us to avoid needing a reference to the VideoFrameProcessor, which
can be especially difficult to obtain if an app only has a reference to the
VideoFrameProcessor.Factory it passes into Transformer/ExoPlayer.
PiperOrigin-RevId: 533205983
Allow the VideoFrameProcessor to output multiple textures at a time, so that
the lifetime of textures is up to the consumer calling VFP.releaseOutputFrame.
The FinalShaderProgramWrapper also has a new maxCapacity limit added, to ensure
that a reasonable number of textures is used and to avoid using up memory.
PiperOrigin-RevId: 532094256
Until the linked bug is fixed, relax constraints to allow this one device to
pass, to suppress failures and avoid triage toil.
PiperOrigin-RevId: 531259233
Systems accepting URIs should treat schemes as case-insensitive
([RFC 3986 Section 3.1](https://www.rfc-editor.org/rfc/rfc3986#section-3.1)):
> An implementation should accept uppercase letters as equivalent to
> lowercase in scheme names (e.g., allow "HTTP" as well as "http") for
> the sake of robustness
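For example, a scheme check written against this guidance compares case-insensitively (illustrative Java, not code from this change):

```java
// RFC 3986 Section 3.1: treat scheme names as case-insensitive.
String scheme = uri.getScheme(); // may be "HTTP", "Http", "http", ...
boolean isHttpScheme =
    scheme != null
        && (scheme.equalsIgnoreCase("http") || scheme.equalsIgnoreCase("https"));
```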
PiperOrigin-RevId: 528735287
renderOutputFrame actually renders frames to an output surface. We'll soon have
a releaseOutputFrame method that will release resources associated with an
output time, so rename this to disambiguate the two methods.
Also rename onOutputFrameAvailable to onOutputFrameAvailableForRendering, to
make it clear this is not available for "release".
This is a renaming-only change with no functional differences.
PiperOrigin-RevId: 527844947
Add `VideoFrameProcessor.registerInputStream()` to signal a new type of input,
and `InputHandler.signalEndOfCurrentInputStream()` to signal partial input
stream completion to the `InputHandler`.
"Fully processed" means after FinalShaderProgramWrapper releases the last frame.
PiperOrigin-RevId: 527356646
Allow the VideoFrameProcessor to output to a texture without an output surface.
Tested by updating texture output tests to no longer output to a surface.
PiperOrigin-RevId: 527244605
`prepare()` now logs a warning if it's called before `setPlayer()`
because it's not possible to tell if it's being called on the wrong
thread (since 3480a27994).
This change finds all the places where one is called immediately after the
other and flips the order so that `setPlayer()` is called before `prepare()`.
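As an ordering illustration only (the component is hypothetical; this message doesn't name the class that exposes `prepare()`/`setPlayer()`):

```java
// Call setPlayer() first so that prepare() can verify it runs on the player's thread.
component.setPlayer(player); // hypothetical component with the methods described above
component.prepare();         // calling prepare() before setPlayer() now logs a warning
```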
Issue: androidx/media#350
#minor-release
PiperOrigin-RevId: 526582294
If the limited number of input buffers means that all samples are read except the last one, which conveys end of stream, the last frame will not be rendered.
PiperOrigin-RevId: 525974445
Previously, ExoPlayerImpl had volume flags hardcoded to SHOW_UI, but now the developer can choose what happens on volume change. The old methods have been deprecated.
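A hedged sketch of the new shape; the exact parameter and flag names (`C.VOLUME_FLAG_SHOW_UI`) are assumptions based on this description:

```java
// Illustrative: pass volume flags explicitly instead of the previously hardcoded SHOW_UI.
player.setDeviceVolume(/* volume= */ 5, /* flags= */ C.VOLUME_FLAG_SHOW_UI);
// Or change the volume without showing the system volume UI:
player.setDeviceVolume(/* volume= */ 5, /* flags= */ 0);
```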
PiperOrigin-RevId: 523974358
It is not possible to provide a safe deprecation path because
BaseTrackSelection can't easily know which of the methods is
implemented by subclasses.
PiperOrigin-RevId: 523471578
Before this CL, SurfaceTexture.onFrameAvailable was used to tell whether a frame
was available in the VideoFrameProcessor's output texture. This was incorrect, as
it relied on the texture being written to before the
SurfaceTexture.onFrameAvailableListener is invoked, leading to null pointer
exceptions on timeouts.
Instead of using different DefaultVideoFrameProcessor interfaces to set that we
want to output to a texture and to get that output texture, use one interface that
sets a listener, and render to a texture iff that listener is set. As this
listener is executed on the GL thread, this also allows us to no longer need to
expand visibility for the GL task executor and tasks.
PiperOrigin-RevId: 522362101
This includes:
- Add an ad for each LOADED event of the SDK, taking the ad duration from
the media structure so that the start positions of ads match exactly, and
then use `addLiveAdBreak()`, which is already used for HLS live.
- When the refreshed content timeline arrives, possibly correct
the duration of an ad that has been inserted while the period duration was
still unknown (last period of the live timeline).
- When an ad period is removed, the ad group needs to be put into a state
that allows playback to continue.
PiperOrigin-RevId: 520919236
The `DashMediaSource` wrongly added an offset to the media times set
on the `MediaLoadData`. As a result, `startTimeMs` and `endTimeMs`
don't represent the positions in the period but in the stream.
`DashMediaSource` was the only call site that set the offset
to a non-zero value. So if we use 0 for the `DashMediaSource`
as well, the offset is redundant and we can remove it everywhere.
PiperOrigin-RevId: 520682026
Previously, we always used ImageReader to read the output of
DefaultVideoFrameProcessor for pixel tests. This has the limitation of not being
able to read HDR content, so we couldn't support HDR pixel tests.
Reading from a texture allows us to use glReadPixels to read the output of
DefaultVideoFrameProcessor, and to build on this to implement HDR pixel tests. We
still want tests for surface output though, because real use cases will only output
to Surfaces.
Also, add some tests for outputting to textures, since this test infrastructure is
a bit complex.
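As an illustration of the texture-readback approach (not code from this change), reading RGBA pixels back from the currently bound GL framebuffer looks roughly like this; HDR content would need a wider pixel format:

```java
// Illustrative GL readback: assumes the output texture is attached to the currently
// bound framebuffer and has the given dimensions.
int width = 1920;  // example values
int height = 1080;
ByteBuffer pixelBuffer = ByteBuffer.allocateDirect(width * height * 4);
GLES20.glReadPixels(
    /* x= */ 0, /* y= */ 0, width, height,
    GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer);
```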
PiperOrigin-RevId: 519786535
This change makes sure that live ad periods that have been played are
skipped when they are about to be added to the queue. To make this work,
the existing filter logic had to take into account the content
resume offset that live periods use.
PiperOrigin-RevId: 518068138
This change makes sure that the `AdPlaybackState` of any period can
contain an empty postroll placeholder.
The placeholder postroll should be represented in the `MediaPeriodId`
of a content period as `nextAdGroupIndex`, but should be ignored when
building the list of `MediaPeriodInfo` in the `MediaPeriodQueue`. This
is required to allow adding an ad group to the ad playback state of the
content period that is currently being played, instantly inserting an ad
period into the media period queue, and immediately transitioning playback
to the new period.
This change makes sure and tests that (a small query sketch follows this list):
- a live server-side-inserted postroll placeholder can be inserted into an
`AdPlaybackState` in a well-defined and tested way (helper method).
- a postroll placeholder is NOT ignored when
`AdPlaybackState.getAdGroupIndexAfterPositionUs` is called (this
is required when evaluating the `nextAdGroupIndex`).
- a postroll placeholder is ignored when
`AdPlaybackState.getAdGroupIndexForPositionUs` is called (this is
required to not attempt to play the ad and is analogous to ignoring the
postroll placeholder in a single-period timeline).
- `MediaPeriod.getFollowingMediaPeriodInfo()` does not include a
`MediaPeriodInfo` for the placeholder postroll when building the
queue.
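A small query sketch of the guarantees listed above. The placeholder is constructed here as a plain postroll ad group purely for illustration; the actual live placeholder is added via the helper method this change introduces, which is not named in this message:

```java
// Illustrative only: an ad group at the end of the period stands in for the placeholder.
AdPlaybackState adPlaybackState =
    new AdPlaybackState(/* adsId= */ "adsId", /* adGroupTimesUs...= */ C.TIME_END_OF_SOURCE);
long periodDurationUs = 10_000_000;

// Per this change, the placeholder is NOT ignored here, so it can become nextAdGroupIndex.
int nextAdGroupIndex =
    adPlaybackState.getAdGroupIndexAfterPositionUs(/* positionUs= */ 0, periodDurationUs);

// Per this change, the placeholder IS ignored here, so playback never tries to enter it.
int playingAdGroupIndex =
    adPlaybackState.getAdGroupIndexForPositionUs(/* positionUs= */ 9_999_999, periodDurationUs);
```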
PiperOrigin-RevId: 515079136
This timeline will be used in unit test cases of follow-up
CLs. It can be used to emulate the timeline created by a
multi-period live media source as real time advances.
PiperOrigin-RevId: 512665552
The following test cases are added:
1. Mux H264 video
2. Mux H265 video
3. Mux HDR video
Each test case performs the following actions (see the sketch after this list):
1. Extract the track and samples from the input MP4 using MediaExtractor.
2. Feed those samples into the Mp4 muxer.
3. Use the extractor to extract the samples from the muxed file and create a dump file.
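A rough sketch of step 1 using the platform `MediaExtractor`; the muxer hand-off in step 2 is only hinted at in a comment because the new muxer's API is not described in this message:

```java
// Illustrative: pull one track's samples out of the input MP4 with MediaExtractor.
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource("/path/to/input.mp4"); // hypothetical path
MediaFormat trackFormat = extractor.getTrackFormat(0); // assume track 0 is the one under test
extractor.selectTrack(0);

ByteBuffer sampleBuffer = ByteBuffer.allocate(1024 * 1024);
while (true) {
  int sampleSize = extractor.readSampleData(sampleBuffer, /* offset= */ 0);
  if (sampleSize < 0) {
    break; // end of stream
  }
  long presentationTimeUs = extractor.getSampleTime();
  int sampleFlags = extractor.getSampleFlags();
  // Step 2: hand (trackFormat, sampleBuffer, presentationTimeUs, sampleFlags) to the MP4 muxer.
  extractor.advance();
}
extractor.release();
```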
PiperOrigin-RevId: 512589069
In most cases, this means updating Subject subclasses and their assertThat methods to accept null actual values. (They should always accept null so that assertions like "assertThat(foo).isNull()" succeed instead of throwing NullPointerException.)
Occasionally, it involves other changes, like writing `isGreaterThan(1L)` instead of `isGreaterThan(1)` to resolve an ambiguity in Kotlin overload resolution.
PiperOrigin-RevId: 511776581