FallbackListener.onTransformationRequestFinalized() is called from the
AssetLoader thread for audio, and from the GL thread for video.
PiperOrigin-RevId: 542851284
By passing this class to where it's needed, implementations don't need to store it
in a field (reducing boilerplate) and it's clearer that it can't be unset at the
point where it's needed.
PiperOrigin-RevId: 542823522
`MediaControllerImplBase` has 2 methods for updating listeners about `PlayerInfo` changes: `updatePlayerInfo` (for masking the state) and `onPlayerInfoChanged` (when communicating with the session). There is a fixed set of listener callbacks related to `PlayerInfo` updates, and both methods should go through the same control flow (whether or not we know that masking will ignore most of them).
A unified method `notifyPlayerInfoListenersWithReasons` encapsulates only the logic shared by the two methods: the listener callbacks. This ensures that both methods invoke them in the same order and that none are missed.
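For illustration, a minimal sketch of the pattern (the names and callbacks here are hypothetical, not the actual Media3 code): both update paths compute the old and new state and delegate to a single method, so the callbacks always fire in one fixed order.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

final class StateNotifierSketch {

  interface Listener {
    void onTimelineChanged(String timeline);

    void onPlaybackStateChanged(int playbackState);
  }

  static final class State {
    final String timeline;
    final int playbackState;

    State(String timeline, int playbackState) {
      this.timeline = timeline;
      this.playbackState = playbackState;
    }
  }

  private final List<Listener> listeners = new CopyOnWriteArrayList<>();
  private State state = new State("EMPTY", /* playbackState= */ 1);

  void addListener(Listener listener) {
    listeners.add(listener);
  }

  // Path 1: local masking of the state.
  void updateStateMasked(State maskedState) {
    notifyListenersWithReasons(state, maskedState);
    state = maskedState;
  }

  // Path 2: state received from the session.
  void onStateChangedFromSession(State sessionState) {
    notifyListenersWithReasons(state, sessionState);
    state = sessionState;
  }

  // The single place that decides which callbacks fire, and in which order.
  private void notifyListenersWithReasons(State oldState, State newState) {
    if (!oldState.timeline.equals(newState.timeline)) {
      for (Listener listener : listeners) {
        listener.onTimelineChanged(newState.timeline);
      }
    }
    if (oldState.playbackState != newState.playbackState) {
      for (Listener listener : listeners) {
        listener.onPlaybackStateChanged(newState.playbackState);
      }
    }
  }
}
```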
PiperOrigin-RevId: 542587879
With the upcoming "handle format changes" CL, stereo -> mono audio
would add an AudioProcessor. Robolectric decodes the encoded output data,
which crashes some AudioProcessors because the number of frames may not
be an integer.
PiperOrigin-RevId: 542568875
This change uses this new method everywhere we currently `instanceof`-check
an `Extractor` directly. This allows us to introduce wrapping/delegating
`Extractor` instances, because the `instanceof` checks will continue to
operate on the underlying instance.
HLS is a slightly different case because it directly re-instantiates
`Extractor` instances, which is not compatible with an arbitrary
wrapping structure. Luckily, the only `Extractor` instances that HLS
re-instantiates do not support muxed subtitles, so they won't be wrapped
in the first place (although future changes might use the
delegating-`Extractor` pattern for other purposes, which might affect
HLS).
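The pattern looks roughly like the following sketch (hypothetical names; the commit message doesn't name the new method, which is assumed here to be an accessor such as `getUnderlyingImplementation()`):

```java
// Feature checks go through the accessor so a wrapping Extractor stays transparent.
interface Extractor {
  // Returns the instance that instanceof-style feature checks should operate on.
  default Extractor getUnderlyingImplementation() {
    return this;
  }
}

final class Mp4Extractor implements Extractor {}

final class SubtitleWrappingExtractor implements Extractor {
  private final Extractor delegate;

  SubtitleWrappingExtractor(Extractor delegate) {
    this.delegate = delegate;
  }

  @Override
  public Extractor getUnderlyingImplementation() {
    // Unwrap recursively so checks reach the real extractor.
    return delegate.getUnderlyingImplementation();
  }
}

final class FeatureCheckExample {
  static boolean isMp4(Extractor extractor) {
    // Instead of: extractor instanceof Mp4Extractor
    return extractor.getUnderlyingImplementation() instanceof Mp4Extractor;
  }

  private FeatureCheckExample() {}
}
```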
PiperOrigin-RevId: 542550928
The callbacks for `PlayerInfo` changes are currently in both `MediaControllerImplBase.updatePlayerInfo` (masking) and `MediaControllerImplBase.onPlayerInfoChanged`. But the order differed between the two, and also from `ExoPlayerImpl.updatePlaybackInfo`, which they are trying to mimic.
#minor-release
PiperOrigin-RevId: 542519070
Devices before API 33 cannot correctly handle the position reset performed by the HAL during offloaded gapless track transitions.
PiperOrigin-RevId: 542503662
The effects pipeline must receive images in the sRGB color space due to the color transfers applied in the shaders. Currently, the burden of making sure images are in the right color space falls onto apps. This CL ensures that this is no longer the case.
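For context, a minimal sketch of the kind of conversion apps previously had to perform themselves, using the platform `BitmapFactory`/`ColorSpace` APIs (API 26+); this is not the code added by this CL:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ColorSpace;

final class SrgbDecodeExample {

  // Decode a file into an sRGB bitmap so the color transfers applied in the
  // effects shaders operate on the expected color space.
  static Bitmap decodeAsSrgb(String filePath) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inPreferredColorSpace = ColorSpace.get(ColorSpace.Named.SRGB);
    return BitmapFactory.decodeFile(filePath, options);
  }

  private SrgbDecodeExample() {}
}
```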
PiperOrigin-RevId: 542323613
When an app tried to re-prepare a live stream with server-side inserted
ads after a playback exception, the player tried to find the ad group by
its index in the ad playback state of the next timeline when creating
the first period.
If a source that supports server-side ads has removed the ad playback
state when the source was removed, this causes a crash. For live
streams this is a reasonable thing to do, given that the exception could
be caused by an invalid ad playback state.
This change removes the ad metadata from the current period and the
timeline for live streams. In case the ad playback state is not reset
by the source, the first timeline refresh would add the metadata again.
PiperOrigin-RevId: 541959628
Some events may arrive after the playlist is cleared (e.g. load
cancellation). In this case, the DefaultPlaybackSessionManager may
create a new session for the already removed item.
We already have checks in place that ignore events with old
windowSequenceNumbers, but these checks only work if the current
session is set (i.e. the playlist is non-empty). The fix is to add
the same check for empty playlists by keeping note of the last
removed window sequence number.
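An illustrative sketch of the guard (field and method names are hypothetical, not the actual `DefaultPlaybackSessionManager` code):

```java
final class SessionGuardSketch {

  private long lastRemovedWindowSequenceNumber = Long.MIN_VALUE;
  private boolean playlistEmpty = true;

  // Called when the playlist is cleared, remembering the sequence number of the
  // removed item so late events for it can still be recognized.
  void onPlaylistCleared(long removedWindowSequenceNumber) {
    playlistEmpty = true;
    lastRemovedWindowSequenceNumber = removedWindowSequenceNumber;
  }

  // Returns whether a late event (e.g. a load cancellation) should be dropped
  // instead of creating a new session for an item no longer in the playlist.
  boolean shouldDropLateEvent(long eventWindowSequenceNumber) {
    return playlistEmpty && eventWindowSequenceNumber <= lastRemovedWindowSequenceNumber;
  }
}
```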
PiperOrigin-RevId: 541870812
FakeClock currently doesn't work well with Espresso and Compose UI
tests because view interactions in both frameworks intentionally idle
the main looper to handle pending UI effects. However, this also
advances playback progress even though we want to deterministically
trigger progress from the test itself.
To solve this problem, we can detect the idling Robolectric call and
postpone any further updates until we leave this state.
PiperOrigin-RevId: 541831050
Add a Wear OS-specific implementation of `Player.Listener` to help resolve playback suppression due to unsuitable output by launching the system media output switcher dialog.
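A rough sketch of the idea (not the listener added by this change); it assumes the `Player.PLAYBACK_SUPPRESSION_REASON_UNSUITABLE_AUDIO_OUTPUT` constant and leaves launching the system output switcher dialog as a placeholder:

```java
import androidx.media3.common.Player;

final class OutputSwitcherListenerSketch implements Player.Listener {

  @Override
  public void onPlaybackSuppressionReasonChanged(
      @Player.PlaybackSuppressionReason int playbackSuppressionReason) {
    if (playbackSuppressionReason
        == Player.PLAYBACK_SUPPRESSION_REASON_UNSUITABLE_AUDIO_OUTPUT) {
      // Playback is suppressed because no suitable output (e.g. a BT headset)
      // is connected: let the user pick an output device.
      launchSystemOutputSwitcher();
    }
  }

  private void launchSystemOutputSwitcher() {
    // Platform-specific; intentionally left unimplemented in this sketch.
  }
}
```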
PiperOrigin-RevId: 541698125
Audio-only tests now use RAW audio where possible, which is passed
through the Robolectric decoders/encoders and can be handled accurately
by the AudioProcessor instances.
PiperOrigin-RevId: 541648853
To more accurately describe what they do, especially as Compositor
starts to use more contexts or threads, and it's important to know what
needs to be reset/recreated/focused before which methods.
PiperOrigin-RevId: 541010135
Issue: When running the Transformer-related test cases, the tests are flaky
because the order in which audio and video samples are interleaved seems to
differ in a few instances.
Root cause: When running a transformation, the sample producer (asset loader)
and the sample consumer (sample pipeline) run on different threads, and
theoretically there is no reason for the behaviour to be deterministic, because
the number of samples produced/written depends on how fast each thread works.
So it is indeed surprising that the tests somehow worked deterministically in
the majority of instances (maybe something to do with the Robolectric environment).
Solution: Since we don't expect the order of sample interleaving to be deterministic,
make the dumping logic deterministic: all the video samples are collected and then
dumped together (and similarly for audio), as sketched below. This means we won't be
able to see the interleaving, so we need to add a separate test case verifying only
the interleaving logic.
Pending: Test case for interleaving.
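A generic sketch of the deterministic dumping idea (hypothetical types, not the actual test code): samples are collected per track while the transformation runs and are only written out, grouped by track, once it completes:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

final class DeterministicDumpSketch {

  private final Map<String, List<byte[]>> samplesByTrack = new LinkedHashMap<>();

  // Called whenever a sample is written; the audio/video arrival order may vary
  // between runs, but that order no longer affects the dump output.
  void onSampleWritten(String trackType, byte[] sampleData) {
    samplesByTrack.computeIfAbsent(trackType, unused -> new ArrayList<>()).add(sampleData);
  }

  // Dumps all samples of one track before moving to the next, in stable track order.
  void dump(StringBuilder output) {
    for (Map.Entry<String, List<byte[]>> entry : samplesByTrack.entrySet()) {
      output.append("track: ").append(entry.getKey()).append('\n');
      for (byte[] sample : entry.getValue()) {
        output.append("  sample bytes: ").append(sample.length).append('\n');
      }
    }
  }
}
```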
PiperOrigin-RevId: 540930871
Previously, TextureManagers had a method to signal the end of the current input
stream, and a method to end the **entire input**. The responsibilities of both
methods were not easy to document, understand and read.
With the new design,
- Only `TextureManager.signalEndOfCurrentInputStream()` is kept
- It's called for every MediaItem in the sequence, including the final one
- FinalWrapper now takes an explicit signal that frame processing is ending,
rather than relying on the return value of `onCurrentInputStreamProcessed()`
- On DVFP receiving EOS from the pipeline, it signals FinalWrapper that the
stream is ending, **before** signaling the input switcher, so that FinalWrapper
is able to end the stream when the onCurrentInputStreamEnded signal eventually
reaches FinalWrapper (see the sketch below)
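A simplified sketch of the signaling order (hypothetical names and interfaces, not the actual DVFP code):

```java
final class EndOfStreamSignalingSketch {

  interface FinalWrapper {
    // Explicit signal that frame processing is ending (no further MediaItems).
    void signalEndOfInput();
  }

  interface InputSwitcher {
    void signalEndOfCurrentInputStream();
  }

  private final FinalWrapper finalWrapper;
  private final InputSwitcher inputSwitcher;

  EndOfStreamSignalingSketch(FinalWrapper finalWrapper, InputSwitcher inputSwitcher) {
    this.finalWrapper = finalWrapper;
    this.inputSwitcher = inputSwitcher;
  }

  // Called when the pipeline reports EOS for the final MediaItem.
  void onPipelineEndOfStream() {
    // Tell FinalWrapper first that the whole input is ending, so that when the
    // end-of-current-input-stream signal eventually reaches it, it can end the stream.
    finalWrapper.signalEndOfInput();
    inputSwitcher.signalEndOfCurrentInputStream();
  }
}
```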
PiperOrigin-RevId: 540856680