The setter command is only used for setPlaylistMetadata and can
be named COMMAND_SET_PLAYLIST_METADATA. The getter command is
used to access both getMediaMetadata and getPlaylistMetadata and is
better named COMMAND_GET_METADATA to reflect this usage.
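For illustration, a minimal sketch of gating on the renamed commands (the surrounding class and the `player` reference are assumed):

```java
import androidx.media3.common.MediaMetadata;
import androidx.media3.common.Player;

// Sketch only: check command availability before reading or updating metadata.
static void readAndUpdateMetadata(Player player) {
  if (player.isCommandAvailable(Player.COMMAND_GET_METADATA)) {
    MediaMetadata mediaMetadata = player.getMediaMetadata();
    MediaMetadata playlistMetadata = player.getPlaylistMetadata();
  }
  if (player.isCommandAvailable(Player.COMMAND_SET_PLAYLIST_METADATA)) {
    player.setPlaylistMetadata(new MediaMetadata.Builder().setTitle("My playlist").build());
  }
}
```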
PiperOrigin-RevId: 523673286
Previously `ChannelMixingAudioProcessor` output float because it was
implemented using the audio mixer's float mixing support.
Move the implementation over to just using the `ChannelMixingMatrix` and make
it publicly visible in the common module so it can be used by apps for both
playback and export.
Also resolve a TODO that no longer had a bug attached by implementing support
for putting multiple mixing matrices to handle different input audio channel
counts, and fix some nits in the test code.
Tested via unit tests, and manually by configuring a `ChannelMixingAudioProcessor`
in the transformer demo app, playing an audio stream that identifies
channels, and verifying that they are remapped as expected.
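For illustration, a sketch of configuring the processor with matrices for two input channel counts (the common-module package locations are assumed as described above):

```java
import androidx.media3.common.audio.ChannelMixingAudioProcessor;
import androidx.media3.common.audio.ChannelMixingMatrix;

/** Sketch only: one matrix per expected input channel count, downmixing stereo and 5.1 to mono. */
static ChannelMixingAudioProcessor createMonoDownmixProcessor() {
  ChannelMixingAudioProcessor processor = new ChannelMixingAudioProcessor();
  processor.putChannelMixingMatrix(
      ChannelMixingMatrix.create(/* inputChannelCount= */ 2, /* outputChannelCount= */ 1));
  processor.putChannelMixingMatrix(
      ChannelMixingMatrix.create(/* inputChannelCount= */ 6, /* outputChannelCount= */ 1));
  return processor;
}
```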
PiperOrigin-RevId: 523653901
It's currently not even possible to subclass MediaController because
the constructor is package-private. To avoid any accidental usage or
future indirect subclassing, all methods can be marked as final.
PiperOrigin-RevId: 523648114
This change selects the best-suited media button receiver
component and pending intent when creating the legacy
session. This is important to ensure that a service can
be started with a media button event from BT headsets
after the app has been terminated.
The `MediaSessionLegacyStub` selects the best-suited
receiver to be passed to the `MediaSessionCompat`
constructor.
1. When the app has declared a broadcast receiver for
`ACTION_MEDIA_BUTTON` in the manifest, this broadcast
receiver is used.
2. When the session is housed in a service, the service
component is used as a fallback.
3. As a last resort, a receiver is created at runtime.
When the `MediaSessionLegacyStub` is released, the media
button receiver is removed unless the app has provided a
media button receiver in the manifest. In this case we
assume the app supports resuming when the BT play intent
arrives at `MediaSessionService.onStartCommand`.
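For illustration, a sketch of how step 1 above could be resolved (the helper is hypothetical, not the actual `MediaSessionLegacyStub` code):

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.pm.ResolveInfo;
import androidx.annotation.Nullable;
import java.util.List;

/** Hypothetical helper: finds a manifest-declared media button receiver in the app's package. */
@Nullable
static ComponentName queryManifestMediaButtonReceiver(Context context) {
  Intent intent = new Intent(Intent.ACTION_MEDIA_BUTTON).setPackage(context.getPackageName());
  List<ResolveInfo> receivers =
      context.getPackageManager().queryBroadcastReceivers(intent, /* flags= */ 0);
  if (receivers.size() == 1) {
    ResolveInfo resolveInfo = receivers.get(0);
    return new ComponentName(resolveInfo.activityInfo.packageName, resolveInfo.activityInfo.name);
  }
  // Otherwise fall back to the service component (step 2) or a runtime receiver (step 3).
  return null;
}
```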
#minor-release
Issue: androidx/media#167
Issue: androidx/media#27
Issue: androidx/media#314
PiperOrigin-RevId: 523638051
It is not possible to provide a safe deprecation path because
BaseTrackSelection can't easily know which of the methods is
implemented by subclasses.
PiperOrigin-RevId: 523471578
Change what format is logged from MediaCodecAudioRenderer when
AudioSink throws InitializationException. Previously we printed the
AudioSink's format, which most of the time is audio/raw (PCM)
and not the renderer's format. With this change both formats are
logged.
#minor-release
Issue: google/ExoPlayer#11066
PiperOrigin-RevId: 523456840
In addition to the changes in 3a5c4277a7, this change essentially
reverts 30e5bc9837 (merged Jul 2022).
From this CL on, `VideoFrameProcessor` takes in non-offset, monotonically
increasing timestamps. For example, with one 5s and one 10s video,
- `VideoFrameProcessor`'s input should start from 0
- On switching to the second video (10s), the timestamp of the first frame in
the second video should be at 5s.
In ExoPlayer however, `streamOffset` is managed differently and thus needs
correction before sending the frames to `VideoFrameProcessor`:
- The timestamp of the first video is offset by a large int, so the first frame
of the first media item has (say) timestamp 10000000000000000
- The last frame of the first media item has 10000005000000000
- At this point the stream offset is updated to 10000005000000000
- The pts of the first frame of the second video starts from 0 again.
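A worked sketch of the correction (method and parameter names are illustrative, not the actual ExoPlayer code):

```java
/**
 * Maps an ExoPlayer buffer timestamp (offset per stream as described above) to the
 * monotonically increasing timestamp that VideoFrameProcessor now expects.
 *
 * @param bufferPresentationTimeUs The buffer timestamp including ExoPlayer's stream offset.
 * @param streamOffsetUs The offset applied to the current stream.
 * @param elapsedStreamDurationUs The summed duration of all previously fed streams, e.g. 0 for
 *     the first video and 5 seconds' worth when switching to the second video in the example.
 */
static long toVideoFrameProcessorTimeUs(
    long bufferPresentationTimeUs, long streamOffsetUs, long elapsedStreamDurationUs) {
  return bufferPresentationTimeUs - streamOffsetUs + elapsedStreamDurationUs;
}
```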
PiperOrigin-RevId: 523444236
Simplify the audio encoder input timestamp calculation. The new calculation
avoids drifting by tracking the total number of bytes encoded rather than
tracking the timestamp and remainder separately, and also makes the timestamps
match the decoder output buffer timestamps.
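A sketch of the drift-free calculation (names are illustrative, not the actual Transformer code):

```java
/**
 * Derives the encoder input timestamp from the total number of PCM bytes queued so far,
 * instead of accumulating a per-buffer increment and remainder (which drifts).
 */
static long getEncoderInputTimestampUs(long totalBytesQueued, int pcmFrameSize, int sampleRateHz) {
  long framesQueued = totalBytesQueued / pcmFrameSize;
  return framesQueued * 1_000_000L / sampleRateHz;
}
```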
Also switch one of the export tests that was passing through AMR samples over
to using WAVE audio. The problem with using AMR is that the compressed samples
are not necessarily an integer number of audio frames and the shadow decoder
would pass them from input to output, so the audio encoder was receiving
non-integer numbers of audio frames.
Tested by logging the timestamps at the decoder output and encoder input while
forcing audio transcoding, and verifying that after this change the audio
timestamps are no longer off by one.
PiperOrigin-RevId: 523409869
The video asset loader renders decoder output to a surface texture, and if the
video sample pipeline is in the process of updating the surface texture image
at the moment when the asset loader's video decoder is released, this seems to
cause `MediaCodec.release` to get stuck.
Swap the release order so that we stop updating the texture before trying to
release the codec.
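A minimal sketch of the new order (not the actual pipeline code):

```java
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;

/** Sketch only: stop texture updates before releasing the decoder so release() can't block. */
static void releaseDecoderAndTexture(SurfaceTexture surfaceTexture, MediaCodec decoder) {
  // Release the SurfaceTexture first so no updateTexImage() call can race with codec release...
  surfaceTexture.release();
  // ...then release the codec.
  decoder.release();
}
```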
PiperOrigin-RevId: 523401619
Log at debug level immediately when MediaCodec throws. This logging will be
output closer to the time when the error actually happened, so it should make it
easier to identify the order in which components failed.
Downgrade logging of errors after export ends to warning level, as output may
still be fine if there was a problem after exporting completed (though it's
still worth logging a warning as the device may not be in a good state).
PiperOrigin-RevId: 523370457
Before this CL, SurfaceTexture.onFrameAvailable was used to tell whether a frame
was available in the VideoFrameProcessor's output texture. This was incorrect, as
it relied on the texture being written to before the
SurfaceTexture.onFrameAvailableListener is invoked, leading to null-pointer
exceptions on timeouts.
Instead of using different DefaultVideoFrameProcessor interfaces to set that we
want to output to a texture and to get that output texture, use one interface that
sets a listener and renders to a texture iff that listener is set. As this
listener is executed on the GL thread, this also allows us to no longer need to
expand visibility for the GL task executor and tasks.
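A sketch of the resulting pattern (interface and method names are hypothetical, not the actual DefaultVideoFrameProcessor API):

```java
import androidx.annotation.Nullable;

/** Hypothetical sketch: render to an output texture iff a listener is set. */
final class TextureOutputSketch {
  interface Listener {
    /** Called on the GL thread, after the output texture has been written to. */
    void onTextureRendered(int texId, long presentationTimeUs);
  }

  @Nullable private Listener listener;

  void setTextureOutputListener(@Nullable Listener listener) {
    this.listener = listener;
  }

  /** Called on the GL thread once a frame has been drawn into the output texture. */
  void onFrameDrawn(int texId, long presentationTimeUs) {
    if (listener != null) {
      listener.onTextureRendered(texId, presentationTimeUs);
    }
  }
}
```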
PiperOrigin-RevId: 522362101
There is a race between the ad period preparation completing
and `onDownstreamFormatChanged` being called when a live stream
is joined in an ad period. In this case the stream event metadata
of the period is emitted immediately, causing an ad media period
to be created and selected in `getMediaPeriodForEvent` before
it has been prepared (1 out of 4).
Using an `isPrepared` flag makes sure we don't hand out the media
period too early in `getMediaPeriodForEvent`.
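A minimal sketch of the guard (types and fields are hypothetical placeholders, not the actual IMA ad media source code):

```java
import androidx.annotation.Nullable;
import java.util.ArrayList;
import java.util.List;

/** Hypothetical sketch of the prepared-only lookup. */
final class AdMediaPeriodLookup {
  static final class AdMediaPeriod {
    boolean isPrepared; // Set once preparation has completed.
    long startTimeUs;
    long endTimeUs;
  }

  private final List<AdMediaPeriod> mediaPeriods = new ArrayList<>();

  /** Returns a media period for the event only once it is prepared. */
  @Nullable
  AdMediaPeriod getMediaPeriodForEvent(long eventTimeUs) {
    for (AdMediaPeriod period : mediaPeriods) {
      if (period.isPrepared
          && eventTimeUs >= period.startTimeUs
          && eventTimeUs < period.endTimeUs) {
        return period;
      }
    }
    return null;
  }
}
```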
PiperOrigin-RevId: 522340046
To set the chroma format and bit-depth information for the H.265 format, the csd-0 data
needs to be parsed. The previous implementation skipped parsing the
csd-0 data and hard coded values based on the "profile" field in MediaFormat.
Along with the above-mentioned changes, some of the comments were corrected
as per the spec.
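For reference, the values that need to be read from the SPS in csd-0 (field names follow the H.265 spec; the snippet is illustrative only, not the actual parser):

```java
/** Maps chroma_format_idc from the SPS to a readable description. */
static String describeChromaFormat(int chromaFormatIdc) {
  switch (chromaFormatIdc) {
    case 0:
      return "monochrome";
    case 1:
      return "4:2:0";
    case 2:
      return "4:2:2";
    case 3:
      return "4:4:4";
    default:
      throw new IllegalArgumentException("Unknown chroma_format_idc: " + chromaFormatIdc);
  }
}

// Per the spec, the bit depths follow directly from the SPS fields:
// BitDepth_Y = 8 + bit_depth_luma_minus8, BitDepth_C = 8 + bit_depth_chroma_minus8.
```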
PiperOrigin-RevId: 522335595
This change improves `ImaUtil.maybeCorrectPreviouslyUnknownAdDuration` to
handle the case where the timeline moves forward by more than a single period
while an ad group with an unknown period duration is being played.
PiperOrigin-RevId: 522292612
If the duration is reported in MediaMetadataCompat, it should
also be set in the QueueTimeline to match controller.getDuration().
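For illustration, the legacy-session side that publishes such a duration (sketch only; the session instance is assumed):

```java
import android.support.v4.media.MediaMetadataCompat;
import android.support.v4.media.session.MediaSessionCompat;

/** Sketch only: a duration published here should now be reflected via controller.getDuration(). */
static void publishDuration(MediaSessionCompat session, long durationMs) {
  session.setMetadata(
      new MediaMetadataCompat.Builder()
          .putLong(MediaMetadataCompat.METADATA_KEY_DURATION, durationMs)
          .build());
}
```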
PiperOrigin-RevId: 522022953
The NPE in toneMap_hlgFrame_matchesGoldenFile and toneMap_pqFrame_matchesGoldenFile occurred because a uEnableColorTransfer uniform was being created on the HDR path, but the HDR shader files don't have this uniform (they don't support disabling color transfers right now).
Fix: only create the uniform when the input is SDR.
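A sketch of the fix (the `glProgram` uniform setter and surrounding fields are assumed, not the actual shader program code):

```java
// Sketch only: the uniform exists only in the SDR fragment shaders, so skip it for HDR input.
if (!useHdr) {
  glProgram.setIntUniform("uEnableColorTransfer", enableColorTransfer ? 1 : 0);
}
```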
Manually tested on the failing tests.
PiperOrigin-RevId: 522002603
MediaItems are not meant to be unique in a playlist. If a legacy
session publishes multiple items that get converted to equal MediaItems,
the current code fails because we look up queue IDs in a Map (which
doesn't allow duplicate keys).
Fix this by storing a simple list of items with their additional data.
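A sketch of the new bookkeeping (types are illustrative, not the actual session code):

```java
import androidx.media3.common.MediaItem;

/** Illustrative entry type: a list of these keeps equal MediaItems as distinct entries. */
final class QueuedItem {
  final MediaItem mediaItem;
  final long queueId; // Per-entry data that a Map<MediaItem, Long> would lose for duplicates.

  QueuedItem(MediaItem mediaItem, long queueId) {
    this.mediaItem = mediaItem;
    this.queueId = queueId;
  }
}
// The timeline then stores a List<QueuedItem> instead of a Map keyed by MediaItem.
```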
#minor-release
Issue: androidx/media#290
PiperOrigin-RevId: 521993802
As most classes are used via interfaces only and people depending on them locally can always find the Javadoc in Android Studio directly, we don't plan to add Javadoc for these extension modules on developer.android.com.
PiperOrigin-RevId: 521993756
It's currently not even possible to subclass MediaSession because
the constructor is package-private. To avoid any accidental usage or
future indirect subclassing, all methods can be marked as final.
PiperOrigin-RevId: 521775373
This ensures that anybody implementing `Player` (which is relatively
unusual) must override at least one `@UnstableApi` method, and therefore
opt-in to the unstable API.
PiperOrigin-RevId: 521769675
* Add a new event `onAudioCapabilitiesChanged` to the `AudioSink.Listener` interface.
* Add an interface `RendererCapabilities.Listener`, which will listen to `onRendererCapabilitiesChanged` events from the renderer.
* Add a `getRendererCapabilitiesReceiver` method to `TrackSelector`, and register/unregister the `TrackSelector` as the `RendererCapabilitiesReceiver` (if implemented) when the `ExoPlayer` is initialized/released.
* Trigger the `AudioSink.Listener.onAudioCapabilitiesChanged` and, in turn, the `RendererCapabilities.Listener.onRendererCapabilitiesChanged` events when audio capability changes are detected in `DefaultAudioSink`.
PiperOrigin-RevId: 521427567
The check currently relies on the default value of 0 returned if the
Bundle doesn't define a pid. But in some cases, like Robolectric unit tests,
0 is a possible pid. The check can be improved by directly asserting that
the value is defined.
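For illustration (KEY_PID is a placeholder for the actual Bundle key):

```java
import android.os.Bundle;

/** Sketch only: assert the key is present instead of relying on the 0 default. */
static int getPid(Bundle bundle) {
  if (!bundle.containsKey(KEY_PID)) {
    throw new IllegalArgumentException("pid not set");
  }
  return bundle.getInt(KEY_PID);
}
```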
PiperOrigin-RevId: 521414649
This includes:
- Add an ad for each LOADED event of the SDK, taking the ad duration
from the media structure to exactly match the start position
of ads, and then use `addLiveAdBreak()`, which is already used for HLS live.
- When the refreshed content timeline arrives, possibly correct
the duration of an ad that has been inserted while the period duration was
still unknown (last period of the live timeline).
- When an ad period is removed, the ad group needs to be put into a state
that allows playback to continue.
PiperOrigin-RevId: 520919236
Handling of the stream offset and start position was unnecessarily
complex and even incorrect. It was going to be an issue for
concatenation of video and image input.
The stream offset is the offset added before decoding/encoding to
make sure decoding/encoding doesn't fail in case of negative timestamps
(which can occur, though rarely).
The start position is equal to the stream offset, plus the clipping
start time if the media is clipped.
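A sketch of the relationship described above (names are illustrative):

```java
/**
 * Illustrative only: the relationship between the stream offset, clipping start time and
 * start position defined above. clippingStartTimeUs is 0 if the media item is not clipped.
 */
static long getStartPositionUs(long streamOffsetUs, long clippingStartTimeUs) {
  return streamOffsetUs + clippingStartTimeUs;
}
```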
Before this change:
- Samples were offset by the start position before decoding, and this
offset was removed before muxing.
- The startPosition of the first MediaItem in a sequence was used for
all the MediaItems in this sequence (which is incorrect).
- The stream offset was removed before applying the GL effects and
added back before encoding so that it was not visible to the OpenGL
processing.
After this change:
- The start position is subtracted in the AssetLoader, so that the
downstream components don’t have to deal with the stream offsets and
start positions.
- Decoded samples with negative timestamps are not passed to the
SamplePipelines. The MediaMuxer doesn’t handle negative timestamps
well. If a stream is 10 secondes long and starts at timestamp -2
seconds, the output will only contain the samples corresponding to the
first 8 (10 - 2) seconds. It won’t contain the last 2 seconds of the
stream. It seems acceptable to remove the first 2 seconds instead.
PiperOrigin-RevId: 520916464
Releasing the player once a sequence has ended seems to make our
emulator tests flaky. Comment this out until we find the cause. The player
will still be released from TransformerInternal when the export ends.
PiperOrigin-RevId: 520886181
The `DashMediaSource` wrongly added an offset to the media times set
on the `MediaLoadData`. With this, the `startTimeMs` and `endTimeMs`
don't represent positions in the period but in the stream.
`DashMediaSource` was the only call site that was setting the offset
to a non-zero value. So if we use 0 for the `DashMediaSource`
as well, the offset is redundant and we can remove it everywhere.
PiperOrigin-RevId: 520682026
The media3-hosted versions of these SVGs were removed due to a change in
the way the reference docs are generated. While work is ongoing to get them
hosted on developer.android.com, this change simply links to the
(identical) exoplayer2 versions in order to fix the media3 docs.
#minor-release
PiperOrigin-RevId: 520647905