If a renderer error happens while processing readahead data for the
next item in the playlist, we currently throw the error immediately
and only set the item id in the error details. This makes it harder
to associate the error with the right item. For example, the
user-facing UI is likely not yet updated to show the failing item
when the error is reported.
This can be improved slightly by force-setting the position to the
failing item. Playback still fails immediately, but this can't
be avoided because the renderer itself went into an error state.
PiperOrigin-RevId: 507808635
The AudioTrackPositionTracker needs to correct positions by
the speed set on the AudioTrack itself whenever it makes
estimations based on real time (i.e., when the real-time playout
duration is not equal to the media duration played).
This happens for the main playback path already, but not for
the mode in which the position is estimated from the playback
head position, and also not in the phase after the track has
been stopped. Neither case is very noticeable during
normal playback, but both become relevant when playing in offload
mode.
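A minimal sketch of the kind of correction involved, with illustrative names (this is not the actual AudioTrackPositionTracker code):
```java
/**
 * Illustrative sketch only: when a position is extrapolated from elapsed real time, the
 * elapsed playout duration has to be scaled by the AudioTrack speed to obtain the elapsed
 * media duration.
 */
final class SpeedCorrectionSketch {

  /** Converts an elapsed real-time playout duration to the media duration it covers. */
  static long mediaDurationForPlayoutDuration(long playoutDurationUs, float audioTrackSpeed) {
    // At 2x speed, one second of real-time playout advances the media position by two seconds.
    return (long) (playoutDurationUs * audioTrackSpeed);
  }

  /** Estimates the current media position after the track has been stopped. */
  static long estimateStoppedPositionUs(
      long positionAtStopUs, long stopTimeRealtimeUs, long nowRealtimeUs, float audioTrackSpeed) {
    long elapsedPlayoutUs = nowRealtimeUs - stopTimeRealtimeUs;
    return positionAtStopUs + mediaDurationForPlayoutDuration(elapsedPlayoutUs, audioTrackSpeed);
  }
}
```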
PiperOrigin-RevId: 507736408
This is a refactoring to separate and simplify the logic
of VOD and live streams when handling IMA ad events. An
additional listener will be required for DASH live streams
in a follow-up CL.
PiperOrigin-RevId: 507435741
It doesn't actually make sense for them to be placed in the Transformer, because the root causes of these errors lie only in the codecs. Also, a few codec errors were
repeated, so these instances are deduplicated.
PiperOrigin-RevId: 506937695
Based on [this conversation thread](https://chat.google.com/room/AAAA--f88ao/76Rem_cRCK8), I've opted to update the existing FrameProcessor.create() rather than deprecate it, as it is unlikely to be in use by apps outside google3.
PiperOrigin-RevId: 506920930
* Overload added `(cause, errorCode, isVideo, isDecoder, details)`,
where `details` is a string of values to be added to the error message
of the `TransformationException`.
* Overload with `MediaFormat` and `mediaCodecName` moved to
`DefaultCodec`, because all usages of that overload were from
`DefaultCodec`, and this allows a simplified API because the relevant
values are already stored internally.
* `mediaCodecName` removed from overload that takes a `Format`.
* Reordered `createForCodec` parameters.
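For reference, a rough sketch of the shape the new generic overload could take based on the bullet points above (parameter names and body are approximations, not the actual declaration):
```java
// Approximate shape only, reconstructed from the description above.
public static TransformationException createForCodec(
    Throwable cause, int errorCode, boolean isVideo, boolean isDecoder, String details) {
  String componentName = (isVideo ? "Video" : "Audio") + (isDecoder ? "Decoder" : "Encoder");
  // The details string is appended to the error message of the TransformationException.
  return new TransformationException(componentName + " error: " + details, cause, errorCode);
}
```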
PiperOrigin-RevId: 506895268
When encoding small, odd-numbered resolutions, like `316x61` ([this image](https://upload.wikimedia.org/wikipedia/commons/6/65/100winners.png)), the current fallback logic prefers a software encoder over hardware ones. The assumption was that the encoder factory applies the encoder size alignment, changing the resolution to `316x60` for SW encoders and `316x64` for HW ones. The SW encoder is then selected because its supported height of 60 is closer to the requested 61 than the hardware-supported 64.
This change updates the default encoder selection process to only expose hardware encoders when any are available.
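A minimal sketch of that selection rule, using the platform MediaCodecList API (illustrative only; the actual default encoder factory logic is more involved):
```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import java.util.ArrayList;
import java.util.List;

final class HardwareFirstEncoderSelection {

  /** Returns encoders for mimeType, keeping only hardware encoders when at least one exists. */
  static List<MediaCodecInfo> getPreferredEncoders(String mimeType) {
    List<MediaCodecInfo> allEncoders = new ArrayList<>();
    List<MediaCodecInfo> hardwareEncoders = new ArrayList<>();
    MediaCodecInfo[] codecInfos =
        new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
      if (!codecInfo.isEncoder() || !supportsMimeType(codecInfo, mimeType)) {
        continue;
      }
      allEncoders.add(codecInfo);
      if (codecInfo.isHardwareAccelerated()) { // Available from API 29.
        hardwareEncoders.add(codecInfo);
      }
    }
    // Only fall back to software encoders if no hardware encoder supports the MIME type.
    return hardwareEncoders.isEmpty() ? allEncoders : hardwareEncoders;
  }

  private static boolean supportsMimeType(MediaCodecInfo codecInfo, String mimeType) {
    for (String supportedType : codecInfo.getSupportedTypes()) {
      if (supportedType.equalsIgnoreCase(mimeType)) {
        return true;
      }
    }
    return false;
  }
}
```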
PiperOrigin-RevId: 506879983
With multi-asset, the sample pipelines can process more than one
MediaItem. The renaming makes it clear that the format passed to the
SamplePipeline constructors is the one corresponding to the first
MediaItem. Indeed, the first format is the one used to configure the
SamplePipelines.
PiperOrigin-RevId: 506879260
Set the correct output color on the debug SurfaceViewWrapper, so that SDR content
can have an output transfer of either GAMMA_2_2 (Gamma 2.2) or SDR (SMPTE 170M).
This fixes an issue where in-app tone-mapping would output gamma 2.2, and the
SDR value incorrectly hardcoded here would lead to an error in OpenGL, which
does not support SMPTE 170M.
PiperOrigin-RevId: 506684602
This change includes 3 things:
- when the legacy media session is created, FLAG_HANDLES_QUEUE_COMMANDS
is advertised if the player has COMMAND_CHANGE_MEDIA_ITEMS
available.
- when the player changes its available commands, a new
PlaybackStateCompat is sent to the remote media controller to
advertise the updated PlaybackStateCompat actions.
- when the player changes its available commands, the legacy media
session flags are updated accordingly: FLAG_HANDLES_QUEUE_COMMANDS is
set only if COMMAND_CHANGE_MEDIA_ITEMS is available (see the sketch below).
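A sketch of that flag mapping against the legacy MediaSessionCompat API (simplified; the real session implementation also updates the PlaybackStateCompat actions):
```java
import android.support.v4.media.session.MediaSessionCompat;
import androidx.media3.common.Player;

final class LegacySessionFlagsSketch {

  /** Re-advertises queue handling whenever the player's available commands change. */
  static void updateLegacySessionFlags(MediaSessionCompat session, Player player) {
    boolean canChangeMediaItems = player.isCommandAvailable(Player.COMMAND_CHANGE_MEDIA_ITEMS);
    // FLAG_HANDLES_QUEUE_COMMANDS is set only if the player can change its media items.
    session.setFlags(canChangeMediaItems ? MediaSessionCompat.FLAG_HANDLES_QUEUE_COMMANDS : 0);
  }
}
```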
#minor-release
PiperOrigin-RevId: 506605905
Before this CL, the `renderedLastFrame` flag was not set if the last frame was released immediately (force render) or was dropped.
PiperOrigin-RevId: 506358626
With the current ExtTexMgr, the following can happen:
- `x` frames are registered, but haven't arrived yet
- flush
- the `x` frames need to be dropped when they arrive on the SurfaceTexture
- status is reset to 0 pending, 0 available, drop `x` when frames arrive
- one frame is registered
- status is set to 1 pending, 0 available, drop `x` when frames arrive
- flush
- now the number of frames to drop is reset to `pending - available = 1`
- but it should be `x+1`
This CL solves the issue by reporting that the flush is complete (by running the afterFlushTask) only after all the frames that were pending before the flush call are accounted for.
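A toy model of the corrected bookkeeping (illustrative only, not the actual ExternalTextureManager): frames registered before a flush stay in the drop count, and the flush is only reported complete once they have all arrived.
```java
/** Toy model of the frame bookkeeping described above. */
final class FrameBookkeepingSketch {
  private int pendingFrames; // Registered but not yet arrived on the SurfaceTexture.
  private int framesToDropOnArrival; // Frames that must be discarded when they arrive.
  private Runnable afterFlushTask;

  void registerInputFrame() {
    pendingFrames++;
  }

  void flush(Runnable afterFlushTask) {
    // Every frame registered before this flush must be dropped when it arrives, in addition
    // to frames still owed from a previous flush. (In this toy model, a second flush before
    // the first completes simply replaces the completion callback.)
    framesToDropOnArrival += pendingFrames;
    pendingFrames = 0;
    this.afterFlushTask = afterFlushTask;
    maybeReportFlushComplete();
  }

  void onFrameAvailable() {
    if (framesToDropOnArrival > 0) {
      framesToDropOnArrival--; // A frame owed from before the flush: discard it.
      maybeReportFlushComplete();
      return;
    }
    pendingFrames--; // A frame registered after the flush: process it normally.
  }

  private void maybeReportFlushComplete() {
    // Only report completion once all frames owed from before the flush are accounted for.
    if (afterFlushTask != null && framesToDropOnArrival == 0) {
      afterFlushTask.run();
      afterFlushTask = null;
    }
  }
}
```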
PiperOrigin-RevId: 506310671
The prior code path (with the call to createEncodingException) could
never be reached, as select...SupportedMimeType already checks for HDR
editing support. This change ensures we throw before creating an
encoder, gives a better error code, and allows future simplifications
around createForCodec (see child CL).
PiperOrigin-RevId: 506308290
Flushing resets all the texture processors within the `FrameProcessor`. This
includes:
- At the back, the FinalMatrixTextureProcessorWrapper, and its MatrixTextureProcessor
- At the front, the ExternalTextureManager
- All the texture processors in between
- All the ChainingGlTextureProcessorListeners in between texture processors
- All the internal states in the aforementioned components
The flush process follows this order, starting from `GlEffectsFrameProcessor.flush()`:
1. Flush the `FrameProcessingTaskExecutor`, so that after it returns, all tasks queued before calling `flush()` have completed
2. Post to the `FrameProcessingTaskExecutor` to flush the `FinalMatrixTextureProcessorWrapper`
3. Flushing the `FinalMatrixTextureProcessorWrapper` propagates the flush through the chain, via the `ChainingGlTextureProcessorListener`s
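A self-contained toy model of that ordering (illustrative only; these are not the actual media3 classes):
```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/** Toy model of the flush ordering described above. */
final class FlushOrderSketch {

  interface Flushable {
    void flush();
  }

  private final ExecutorService glThreadExecutor = Executors.newSingleThreadExecutor();

  void flush(Flushable finalWrapper) throws Exception {
    // 1. Drain the task queue: after this returns, every task queued before flush() has run.
    glThreadExecutor.submit(() -> {}).get();
    // 2. Flush the final wrapper on the GL thread. In the real pipeline, the
    //    ChainingGlTextureProcessorListeners then propagate the flush from the back of the
    //    chain to the ExternalTextureManager at the front.
    glThreadExecutor.submit(finalWrapper::flush).get();
  }
}
```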
PiperOrigin-RevId: 506296469
The EOTF is needed so that overlay (image) colors are correctly interpreted and mixed with the linear video colors.
Also replaces 100winners.png with a "homemade" image file.
Added a GlEffectsFrameProcessor test to verify that the colors look correct at the end of frame processing.
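For context, a minimal sketch of why the EOTF matters, assuming the overlay bitmap is sRGB-encoded (this is the standard sRGB EOTF, not the shader code used in the library):
```java
final class OverlayLinearizationSketch {

  /** Standard sRGB EOTF: converts a non-linear sRGB component in [0, 1] to linear light. */
  static float srgbEotf(float encoded) {
    return encoded <= 0.04045f
        ? encoded / 12.92f
        : (float) Math.pow((encoded + 0.055f) / 1.055f, 2.4f);
  }

  /** Blending must happen in linear light; mixing encoded overlay values into linear video is wrong. */
  static float blendOverlayOntoLinearVideo(float overlayEncoded, float videoLinear, float alpha) {
    float overlayLinear = srgbEotf(overlayEncoded); // Linearize the overlay color first.
    return alpha * overlayLinear + (1 - alpha) * videoLinear;
  }
}
```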
PiperOrigin-RevId: 506290309
Can be used to combine multiple media items into a single timeline window.
Issue: androidx/media#247
Issue: google/ExoPlayer#4868
PiperOrigin-RevId: 506283307
For HLG input in Transformer, FinalWrapper is configured to only output HLG to the
encoder. But since DebugPreview is configured to take PQ for HDR content, the
colors will not look correct.
This CL allows overriding the MatrixTP output transfer function, so
that FinalWrapper can output
- HLG to the encoder
- PQ to the debug preview
PiperOrigin-RevId: 506022840
Implementations outside media3 should be able to throw FrameProcessingException if they encounter an error during configure().
PiperOrigin-RevId: 506020149
Transformer callbacks will take a Composition instead of a MediaItem.
Apps should be able to see what this Composition contains.
PiperOrigin-RevId: 505976561