The previous code led me to misread this stack trace as a null pointer
exception, but it's really an out-of-bounds position failing an argument check:
```
Caused by: androidx.media3.exoplayer.upstream.Loader$UnexpectedLoaderException: Unexpected IllegalArgumentException: null
at androidx.media3.exoplayer.upstream.Loader$LoadTask.run(Loader.java:435)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
at java.lang.Thread.run(Thread.java:1012)
Caused by: java.lang.IllegalArgumentException
at androidx.media3.common.util.Assertions.checkArgument(Assertions.java:40)
at androidx.media3.common.util.ParsableByteArray.setPosition(ParsableByteArray.java:164)
at androidx.media3.extractor.text.cea.Cea608Parser.parse(Cea608Parser.java:440)
```
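For context, the ambiguous `null` comes from `checkArgument` being thrown without a
message. A minimal sketch of how a descriptive message would disambiguate the trace
(illustrative only, not part of this change):
```
import androidx.media3.common.util.Assertions;

final class PositionCheckSketch {
  private int limit;
  private int position;

  /** Illustrative variant of setPosition that includes a message in the argument check. */
  public void setPosition(int position) {
    // With a message, the trace would read e.g. "IllegalArgumentException: position (600)
    // outside [0, 512]" instead of the ambiguous "IllegalArgumentException: null" above.
    Assertions.checkArgument(
        position >= 0 && position <= limit,
        "position (" + position + ") outside [0, " + limit + "]");
    this.position = position;
  }
}
```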
PiperOrigin-RevId: 592876546
Instead, for input videos, use the colorInfo provided by the extractor. Similarly, for input images, use sRGB, the only color space currently in use.
Textures still need the input ColorInfo to be provided, though.
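A minimal sketch of that selection rule; the helper and its SDR fallback are
hypothetical, only the `ColorInfo` constants and `Format.colorInfo` are real media3 API:
```
import androidx.annotation.Nullable;
import androidx.media3.common.ColorInfo;
import androidx.media3.common.Format;

final class InputColorInfoSketch {
  /** Hypothetical helper illustrating where the input ColorInfo comes from per input type. */
  static ColorInfo resolve(
      boolean isImage, @Nullable Format extractedVideoFormat, @Nullable ColorInfo textureColorInfo) {
    if (isImage) {
      return ColorInfo.SRGB_BT709_FULL; // images: sRGB, the only color space currently in use
    }
    if (textureColorInfo != null) {
      return textureColorInfo; // textures: the caller must still provide the ColorInfo
    }
    // videos: take the ColorInfo reported by the extractor (SDR fallback is an assumption here)
    return extractedVideoFormat != null && extractedVideoFormat.colorInfo != null
        ? extractedVideoFormat.colorInfo
        : ColorInfo.SDR_BT709_LIMITED;
  }
}
```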
PiperOrigin-RevId: 592875967
Replace the event for notifying fallback with one covering codec initialization in
general (while keeping the list of errors).
Add a flag to control whether to try non-primary codecs, with the same
documentation as the similar flag in ExoPlayer's renderer.
Make the class final as it shouldn't be necessary to subclass it.
PiperOrigin-RevId: 592869868
This is similar to the HLS fix in 770ca66fbc
Similar to HLS, the original problem here was **not** modifying the
`Format` for caption tracks embedded in the video stream. I tried just
updating the format in both places, but that caused new failures: the new
('transcoded') format was then fed into `FragmentedMp4Extractor` as
part of `closedCaptionFormats`, which resulted in the CEA-608 data
being emitted from `FragmentedMp4Extractor` with the incorrect
`application/x-media3-cues` MIME type (while the bytes were actually
CEA-608). The transcoding wrapper therefore passed it through without
transcoding, and decoding failed (because obviously CEA-608 bytes can't
be decoded by `CueDecoder`, which expects a `Bundle` from
`CuesWithTiming.toBundle`).
To resolve this we keep track of the 'original' caption formats inside
`TrackGroupInfo`, so we can feed them into `FragmentedMp4Extractor`.
For all other usages in `DashMediaPeriod` we use the 'transcoded'
caption formats.
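A minimal sketch of the 'original' vs 'transcoded' distinction using public `Format`
API; the helper name is illustrative and the MIME string is written out literally:
```
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;

final class CaptionFormatsSketch {
  /** Returns the 'transcoded' counterpart of an embedded CEA-608 caption format. */
  static Format transcodedFormat(Format originalCea608Format) {
    // The 'original' format (sample MIME type application/cea-608) is what
    // FragmentedMp4Extractor must see so that it emits raw CEA-608 bytes.
    // The 'transcoded' format is what the rest of DashMediaPeriod should expose: it
    // declares that samples arrive as parsed cues rather than raw CEA-608 bytes.
    return originalCea608Format
        .buildUpon()
        .setSampleMimeType("application/x-media3-cues")
        .setCodecs(MimeTypes.APPLICATION_CEA608)
        .build();
  }
}
```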
PiperOrigin-RevId: 592866262
With the fMP4 implementation there will be two writers: `DefaultMp4Writer`
and `FragmentedMp4Writer`.
Changes include:
1. Make `Mp4Writer` an abstract class and keep only the common functionality
in it.
2. Create a `DefaultMp4Writer` that contains the existing logic to write MP4.
3. The fMP4 logic needs to access the pending sample buffer info in various
places, so refactor to split `List<Pair<BufferInfo, ByteBuffer>>`
into two separate lists (see the sketch after this list).
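A minimal sketch of point 3, with illustrative field names:
```
import android.media.MediaCodec;
import java.nio.ByteBuffer;
import java.util.ArrayDeque;
import java.util.Deque;

final class PendingSamplesSketch {
  // Before: a single List<Pair<MediaCodec.BufferInfo, ByteBuffer>>.
  // After: two parallel collections, so the fMP4 code can inspect buffer infos
  // (timestamps, sizes, flags) without touching the sample payloads.
  private final Deque<MediaCodec.BufferInfo> pendingSamplesBufferInfo = new ArrayDeque<>();
  private final Deque<ByteBuffer> pendingSamplesByteBuffer = new ArrayDeque<>();

  void addPendingSample(MediaCodec.BufferInfo bufferInfo, ByteBuffer data) {
    pendingSamplesBufferInfo.addLast(bufferInfo);
    pendingSamplesByteBuffer.addLast(data);
  }
}
```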
PiperOrigin-RevId: 592861398
Checking the output format's MIME type may skip tests more often than we'd like,
because we may want to use a lower-spec output MIME type than the one passed in,
based on the input's HDR MIME type.
Therefore, set this output format to null for tone-mapping tests.
PiperOrigin-RevId: 592855713
Also updated the `README` file to accurately specify the use of NDK r23c and the default setting `ANDROID_ABI=21` for NDK r26b.
PiperOrigin-RevId: 592845796
Issues with the current implementation:
1. The implementation is unnecessarily complicated and can easily be
simplified. To make all the tracks start from the same time,
only the first sample requires a timestamp adjustment,
but the current implementation shifts all the timestamps. Since the method
calculates the sample durations, shifting all the timestamps has no effect
as such.
2. The implementation always forced the first sample to start at 0. But when we
want to use the same method for fragmented MP4, this will look inaccurate,
as we will call the method for different fragments and each fragment
will not start at presentation time 0. The output will still be the same,
though, since the method returns durations and not timestamps.
3. In the previous implementation, if there was just one sample then
its duration was set equal to its presentation time, which looks incorrect.
With the new changes, if a single sample is passed then its duration will always
be 0, irrespective of the specified last-sample-duration behaviour (see the sketch
after this list).
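A minimal sketch of the duration calculation described above; names and the
last-sample-duration parameter are illustrative:
```
import java.util.ArrayList;
import java.util.List;

final class SampleDurationsSketch {
  /**
   * Returns per-sample durations from presentation timestamps. The first timestamp does
   * not need to be 0 and no timestamps are shifted; only consecutive differences matter.
   */
  static List<Long> sampleDurationsUs(List<Long> presentationTimesUs, long lastSampleDurationUs) {
    List<Long> durationsUs = new ArrayList<>();
    if (presentationTimesUs.isEmpty()) {
      return durationsUs;
    }
    for (int i = 0; i < presentationTimesUs.size() - 1; i++) {
      durationsUs.add(presentationTimesUs.get(i + 1) - presentationTimesUs.get(i));
    }
    // A single sample now always gets duration 0, regardless of the last-sample-duration
    // behaviour; previously its duration was set to its own presentation time.
    durationsUs.add(presentationTimesUs.size() == 1 ? 0L : lastSampleDurationUs);
    return durationsUs;
  }
}
```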
PiperOrigin-RevId: 592826612
getAudioCapabilities currently creates the receiver and returns
the current capabilities. This is error-prone because the
capabilities are also available as a class field.
This can be made cleaner by letting the method just create the receiver,
with all access to the capabilities going via the class field.
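A minimal sketch of the before/after shape, with illustrative names and a string
standing in for `AudioCapabilities`:
```
import androidx.annotation.Nullable;

final class AudioCapabilitiesAccessSketch {
  @Nullable private String audioCapabilities; // stand-in for the real AudioCapabilities field

  // Before: one method both registered the receiver and returned the capabilities,
  // so callers could keep using a return value that later goes stale.
  // After: registration and access are separated; all reads go through the class field.
  void registerAudioCapabilitiesReceiver() {
    audioCapabilities = "capabilities reported by the receiver callback";
  }

  @Nullable
  String currentAudioCapabilities() {
    return audioCapabilities;
  }
}
```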
PiperOrigin-RevId: 592591590
HLS distinguishes between 'subtitles' (WebVTT or TTML distributed in
separate files with their own playlist) and 'captions' (CEA-608 or 708,
distributed muxed into the video file).
The format transformation added in 7b762642db
only applies to subtitles and not captions. This change makes the same
transformation for caption formats.
This resolves an error like:
```
SampleQueueMappingException: Unable to bind a sample queue to TrackGroup with MIME type application/cea-608.
```
Also add two playback tests for HLS CEA-608, one that parses during
decoding (old way) and one during extraction (new way). Adding these
tests is what alerted me to this issue.
PiperOrigin-RevId: 592571284
This was generated by combining the existing `ts/bbb_2500ms.ts` test
asset and a temporary `.srt` file using
https://cloud.google.com/transcoder/docs/how-to/captions-and-subtitles
This doesn't directly reproduce the problem fixed by
7ca26f898d,
because the CEA-608 subs are structured differently to the stream I
discovered the problem with (from Issue: androidx/media#887). However this test
does fail if that fix is reverted after
486230fbd7.
I'm also not able to repro the character duplication reported in
Issue: androidx/media#887 by just changing the manifest in this CL. I'm not yet
sure of the exact differences between the stream provided on GitHub
and this stream.
This stream does provide some regression protection, because it
currently fails with 'new' subtitle parsing
(`DashMediaSource.Factory.experimentalParseSubtitlesDuringExtraction(true)`),
though I'm not sure of the exact reason for that yet.
PiperOrigin-RevId: 592476328
This is a refactoring that allows the `MediaPeriodQueue` to
create media period holders without all the collaborators
being passed in to `enqueueNextMediaPeriodHolder(...)`.
The factory is presumably also helpful in unit tests for
knowing whether and exactly when a holder is created in
the preloading process.
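A minimal sketch of that factory indirection with hypothetical names (the real holder
and its collaborators differ):
```
/** Hypothetical factory interface illustrating the testability point above. */
interface MediaPeriodHolderFactorySketch<T> {
  T createHolder(Object mediaPeriodInfo, long rendererPositionOffsetUs);
}

/** A fake factory that a unit test could inject to record whether/when a holder is created. */
final class RecordingHolderFactory implements MediaPeriodHolderFactorySketch<Object> {
  int createCallCount;

  @Override
  public Object createHolder(Object mediaPeriodInfo, long rendererPositionOffsetUs) {
    createCallCount++;
    return new Object();
  }
}
```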
PiperOrigin-RevId: 592301400
We deliberately left `Cea608Decoder` and `Cea708Decoder` intact when
creating the respective `SubtitleParser` implementations (in
27caeb8038
and
94e45eb4ad
respectively).
However we didn't correctly change the behaviour of
`SubtitleDecoderFactory.DEFAULT` in order to use these 'legacy'
implementations. We first left the `Cea608Decoder` instantiation
**after** the `DefaultSubtitleParserFactory.supportsFormat()` check,
meaning that code would never be reached (since `supportsFormat` now
returns true for CEA-608). Then in the second change (which was supposed
to only be about CEA-708) we removed all instantiations of **both**
`Cea608Decoder` and `Cea708Decoder`. This change puts the decoder
instantiations back and moves them above the
`DefaultSubtitleParserFactory.supportsFormat` check, meaning they will
be used in preference to `DelegatingSubtitleDecoder`.
This resolves the immediately-disappearing subtitles tracked by
Issue: androidx/media#904.
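A minimal sketch of the resulting selection order; decoder names are returned as strings
for illustration and this is not the real `SubtitleDecoderFactory` code:
```
final class SubtitleDecoderSelectionSketch {
  static String chooseDecoder(String sampleMimeType, boolean parserFactorySupportsFormat) {
    // The legacy CEA decoders must be checked before the parser-factory check:
    // supportsFormat now returns true for CEA-608, so a later check would be unreachable.
    if ("application/cea-608".equals(sampleMimeType)
        || "application/x-mp4-cea-608".equals(sampleMimeType)
        || "application/cea-708".equals(sampleMimeType)) {
      return "Cea608Decoder / Cea708Decoder";
    }
    if (parserFactorySupportsFormat) {
      return "DelegatingSubtitleDecoder";
    }
    return "unsupported";
  }
}
```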
PiperOrigin-RevId: 592217937
While investigating Issue: androidx/media#887 I naively assumed the CEA-608
captions were in a TS file, but they're actually in an MP4 (which is
possibly obvious given DASH only supports MP4). This change includes
container info in the `EventLogger` `tracks` output.
PiperOrigin-RevId: 592192752
This allows us to remove the additional thread we create
for asynchronous buffer queuing and use a synchronized
queueing approach again.
This API looks strictly beneficial on all tested devices,
but since this code path in MediaCodec has not been used
widely, we leave an opt-out flag for now.
PiperOrigin-RevId: 591867472
The parameters may change the decoding behavior of the
following samples, and it's surprising/wrong to apply them
from another thread, where we can't guarantee from which
sample they apply.
PiperOrigin-RevId: 591808760
Move `inputColorInfo` from `VideoFrameProcessor`'s `Factory.create` to `FrameInfo`,
passed in via `registerInputStream`.
Also manually tested on the ExoPlayer demo that `setVideoEffects` still works.
PiperOrigin-RevId: 591273545
When an adaptive source has been preloaded, the track selection
of the player should possibly not override the preloaded selection
to avoid discarding preloaded data.
PiperOrigin-RevId: 591256334
The number of empty image byte arrays written is one less than the
total number of tiles in the image. The empty byte arrays act as
placeholders for individual tiles inside ImageRenderer.
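A minimal sketch of that counting, with illustrative names:
```
import java.util.ArrayList;
import java.util.List;

final class TilePlaceholderSketch {
  /** For a grid of tileCount tiles, writes tileCount - 1 empty placeholder arrays. */
  static List<byte[]> placeholders(int tileCount) {
    List<byte[]> samples = new ArrayList<>();
    for (int i = 0; i < tileCount - 1; i++) {
      samples.add(new byte[0]); // placeholder for an individual tile inside ImageRenderer
    }
    return samples;
  }
}
```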
PiperOrigin-RevId: 591231432
This is because currently:
1. The player sets a surfaceView to render to.
2. The player initializes the renderer.
3. MCVR initializes the VideoSinkProvider and, by extension, the VideoGraph.
But when step 1 happens, MCVR doesn't set the surfaceView on the VideoGraph because
it's not initialized yet. Consequently, after the VideoGraph is initialized, it doesn't
have a surface to render to and thus drops the first few frames.
Also adds a first-frame test to verify the correct first frame is rendered.
PiperOrigin-RevId: 591228174
When the track is changed during playback, `playbackPositionUs` may be in the middle of a chunk while `loadPositionUs` should be the start of that chunk. In this situation `loadPositionUs` can be less than the current `playbackPositionUs`, resulting in a negative `bufferedDurationUs`. That translates to having no buffer, so we should send `0` for `bufferedDurationUs` when creating new instances of `CmcdData.Factory`.
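A minimal sketch of the clamping described above (illustrative, not the actual `CmcdData.Factory` code):
```
final class CmcdBufferedDurationSketch {
  /** Clamps to 0 when the chunk's load position is behind the playback position. */
  static long bufferedDurationUs(long playbackPositionUs, long loadPositionUs) {
    // After a track change, loadPositionUs (start of the chunk) can be behind
    // playbackPositionUs (middle of the chunk); report no buffer instead of a
    // negative buffered duration.
    return Math.max(0, loadPositionUs - playbackPositionUs);
  }
}
```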
Issue: androidx/media#888
#minor-release
PiperOrigin-RevId: 591099785
Otherwise, there's a memory leak of ~30MB, as this is never released.
This likely used to be considered released as part of what now became
`intermediateGlShaderPrograms`, but its release was missed after we split
`finalShaderProgramWrapper` out from the larger glShaderProgram list.
PiperOrigin-RevId: 590954785
To prepare to move `inputColorInfo` from `VFP.Factory.create` to
`VFP.registerInputStream`, move all usage of `inputColorInfo` to be *after*
`registerInputStream`.
To do this, defer creation of `externalShaderProgram` instances, which require
`inputColorInfo`. However, we must still initialize `InputSwitcher` and OpenGL
ES 3.0 contexts in the VFP create, to create and request the input surface from
ExternalTextureManager.
PiperOrigin-RevId: 590937251
The timestamp adjuster also estimates the number of wraparounds
of the 90 kHz TS timestamp. It does that by assuming that a new
timestamp is always close to the previous one (in either direction).
This logic doesn't always work for duration estimates, because the
timestamp at the end of the media is not close to the one at the
beginning, and it is also never less than the one at the beginning.
This can be fixed by introducing a new estimation model that assumes
the new timestamp is strictly greater than the previous one, without
assuming that it has to be close to it.
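A minimal sketch of the new model, assuming 33-bit 90 kHz MPEG-TS timestamps; names are
illustrative and this is not the actual timestamp adjuster code:
```
final class MonotonicPtsAdjusterSketch {
  private static final long WRAP_PERIOD = 1L << 33; // 90 kHz PTS wraps every 2^33 ticks

  /**
   * Unlike the "closest to previous" model, this assumes each new timestamp is greater
   * than the previous one, so it only ever adds wraparound periods and never subtracts.
   */
  static long adjust(long rawPts, long previousAdjustedPts) {
    long adjusted = rawPts + (previousAdjustedPts / WRAP_PERIOD) * WRAP_PERIOD;
    while (adjusted < previousAdjustedPts) {
      adjusted += WRAP_PERIOD;
    }
    return adjusted;
  }
}
```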
Issue: androidx/media#855
#minor-release
PiperOrigin-RevId: 590936953
When broadcasting a notifyChildrenChanged event, the task for legacy
controllers was sent to the broadcasting callback. This would
technically work, but because the subscription list is maintained
with specific controllers, the broadcast controller isn't subscribed
and hence the call wasn't executed.
This change calls the overloaded method for a specific controller
for each connected controller, making sure (only) subscribed
controllers are notified.
Issue: androidx/media#644
PiperOrigin-RevId: 590904037