Support parsing ID3 tags at the beginning of FLAC files, even though the FLAC
spec does not require this.
GitHub: #4055.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=192127929
A renderer configuration being null is equivalent to the
renderer being disabled. Remove the redundant state.
Issue: #3915
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=192126015
Also added an assertion to the DRM event dispatcher to cause
immediate failure when this happens. This is consistent with
the assertion in the equivalent MediaSource class.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191892735
Also refactor the tests to make them behavioral (rather than testing the method)
and inline simple assertions.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191867614
These don't seem to be needed anymore. All tests run without them in Gradle
and Blaze.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191867518
In MatroskaExtractor, TrueHD audio samples are joined into larger chunks. For
some streams the resulting chunked samples never seem to start with a syncframe.
This could cause playback of TrueHD streams to get stuck after seeking, because
we could never read a syncframe at the start of a sample to determine the
sample size.
To fix this, search for the syncframe within the sample instead of expecting to
find it at the start.
Note: this means that we may search a few thousand bytes to find the sample
size, but the cost is only incurred for the first audio sample read.
Issue: #3845
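As a rough sketch of the approach, the helper below scans a sample buffer for a
TrueHD syncframe instead of assuming one starts at offset 0. The syncword value
(0xF8726FBA) and its 4-byte offset within the syncframe are assumptions about the
TrueHD bitstream, and the class and method names are placeholders rather than the
actual extractor code.

  // Placeholder sketch: locate the first TrueHD syncframe within a sample buffer.
  final class TrueHdSyncSearchSketch {

    // Assumed TrueHD major sync word, located 4 bytes into a syncframe.
    private static final int TRUEHD_SYNCWORD = 0xF8726FBA;

    /** Returns the offset of the first syncframe in data[0..limit), or -1 if none is found. */
    static int findSyncframeOffset(byte[] data, int limit) {
      for (int i = 4; i + 4 <= limit; i++) {
        int word =
            ((data[i] & 0xFF) << 24)
                | ((data[i + 1] & 0xFF) << 16)
                | ((data[i + 2] & 0xFF) << 8)
                | (data[i + 3] & 0xFF);
        if (word == TRUEHD_SYNCWORD) {
          return i - 4; // The syncframe starts 4 bytes before the syncword.
        }
      }
      return -1;
    }

    private TrueHdSyncSearchSketch() {}
  }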
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191775779
This field (formerly "id") has been almost impossible to use so far. Having
setters in the factories makes it possible to specify custom tags for all media
sources.
Also added an ExoPlayer.getCurrentTag() method to conveniently retrieve the tag
of the currently playing media source.
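A minimal usage sketch, assuming a setTag(Object) setter on the factory (the exact
setter and factory vary per media source) and the getCurrentTag() accessor described
above; dataSourceFactory and uri are placeholders:

  // Sketch only, not prescriptive API documentation.
  MediaSource prepareWithTag(ExoPlayer player, DataSource.Factory dataSourceFactory, Uri uri) {
    MediaSource mediaSource =
        new ExtractorMediaSource.Factory(dataSourceFactory)
            .setTag("my-item-id") // Custom tag associated with this media source.
            .createMediaSource(uri);
    player.prepare(mediaSource);
    return mediaSource;
  }

  // Later, for example from a player event callback, while this source is playing:
  // Object tag = player.getCurrentTag(); // "my-item-id"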
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191738754
The data collector keeps track of active media periods to assign each event to
the correct media period and/or window. This information, together with other
information like playback position and buffered duration, is then forwarded
with the event to all registered listeners.
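As a purely hypothetical illustration of the per-event payload described above
(field names and types are placeholders, not the real listener API):

  // Hypothetical sketch of the information forwarded with each event.
  final class EventInfoSketch {
    long realtimeMs;         // Wall-clock time at which the event occurred.
    int windowIndex;         // Window the event was assigned to.
    Object mediaPeriodId;    // Identifies the media period the event was assigned to.
    long playbackPositionMs; // Playback position when the event occurred.
    long bufferedDurationMs; // Buffered duration at the time of the event.
  }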
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191726408
Previously the SonicAudioProcessor and SilenceSkippingAudioProcessor would track
their pending playback parameters and only apply them in flush(). Having the
values only take effect once flushed made the processors a bit more difficult to
use, especially because the value returned by isActive wouldn't update
immediately.
Make DefaultAudioSink only set the new speed/pitch/skip silence setting after
the audio processors have drained. This means it's no longer necessary to keep
track of pending parameter values and also fixes a bug where initial playback
parameters weren't applied because the audio processors weren't flushed while
uninitialized before DefaultAudioSink called isActive() on them.
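A simplified, self-contained sketch of the drain-then-apply idea (not the actual
DefaultAudioSink code, just the pattern of deferring new parameters until the current
processor chain has drained):

  final class DrainThenApplySketch {
    private Float pendingSpeed; // Requested speed not yet applied (null if none pending).
    private float activeSpeed = 1f;

    void setSpeed(float speed) {
      pendingSpeed = speed; // Defer: do not reconfigure the processors mid-stream.
    }

    /** Call once the processors have produced all output for the old configuration. */
    void onProcessorsDrained() {
      if (pendingSpeed != null) {
        activeSpeed = pendingSpeed; // Apply only after draining completes.
        pendingSpeed = null;
      }
    }

    float getActiveSpeed() {
      return activeSpeed;
    }
  }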
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191586727
The previous API allowed passing null to the constructors even though variants
without listeners exist, which is why these null values need to be handled.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191577891
Partial rollback of [], which caused b/77294898 by deleting a public ExoPlayer API.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191519591
Previously it was necessary to create a new Sonic instance every time the
processor was flushed. Add a flush() method to Sonic so that it can be reused
when seeking. It still needs to be recreated when parameters change.
SonicAudioProcessor and SilenceSkippingAudioProcessor have methods for setting
parameters that are documented as taking effect after a call to flush(), but
actually the value returned by isActive() was updated immediately. Track the
pending values and apply them in flush() to fix this.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191442336
The applyContentMetadataMutations and getContentMetadata methods are supposed to be
synchronized and to assert that the instance isn't released.
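A minimal sketch of the intended pattern, using placeholder class and field names
rather than the actual cache implementation:

  final class ReleasableStoreSketch {
    private boolean released;
    private String metadata = "";

    public synchronized void applyContentMetadataMutations(String mutation) {
      if (released) {
        throw new IllegalStateException("Instance already released.");
      }
      metadata += mutation;
    }

    public synchronized String getContentMetadata() {
      if (released) {
        throw new IllegalStateException("Instance already released.");
      }
      return metadata;
    }

    public synchronized void release() {
      released = true;
    }
  }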
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191419637
This is in preparation for making it possible to flush a Sonic instance so that
it's not necessary to create new ones every time the processor is flushed.
There should be no behavior changes here.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191410326
To be immediately rolled back after submission.
Submitting on behalf of cblay.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=191128111
Video renderers assume that the player position is advancing linearly while in
the started state. MediaCodecVideoRenderer schedules frames for rendering in the
future in the expectation that the player position is advancing.
This assumption is not currently true when using a position from the AudioTrack:
the player position can be fixed for (in the worst case) up to about 100 ms
before it starts increasing. This leads to an effect where the first frame is
rendered, then a few more frames are rendered, then there's a pause before
frames start being rendered smoothly.
Work around this issue by checking whether the position has started advancing
before scheduling frames to be rendered in the future.
It might be preferable to make the audio renderer only become ready when its
timestamp can start advancing, but this is likely to be complicated.
Issue: #3841
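A simplified sketch of the workaround's core check (placeholder names; the real
renderer logic is considerably more involved):

  // Only schedule frames for future rendering once the reported position has been
  // observed to advance.
  final class PositionAdvancingCheckSketch {
    private long lastPositionUs = Long.MIN_VALUE;
    private boolean positionAdvancing;

    /** Call with the current player position each time the renderer is asked to render. */
    void onPositionUs(long positionUs) {
      if (lastPositionUs != Long.MIN_VALUE && positionUs > lastPositionUs) {
        positionAdvancing = true; // The position has started moving.
      }
      lastPositionUs = positionUs;
    }

    /** Whether it is safe to schedule frames for rendering in the future. */
    boolean shouldScheduleEarlyFrames() {
      return positionAdvancing;
    }
  }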
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190937429
In audio processors, an audio frame consists of one sample (2 bytes for 16-bit
PCM) per channel. Sonic used "sample" to refer to this.
We've already diverged from the original source for Sonic quite a bit (deleting
code and making stylistic changes) and there haven't been upstream changes so
far, so it seems fine to start making more substantial changes here.
There should be no behavior changes here.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190916793
Use string concatenation for Metadata.Entry instances, and add
Util.formatInvariant for numerical formatting.
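A locale-invariant formatter along these lines can be as simple as delegating to
String.format with a fixed locale; a minimal sketch (class name is a placeholder):

  import java.util.Locale;

  final class FormatSketch {

    /** Formats using Locale.US so numeric output does not vary with the device locale. */
    public static String formatInvariant(String format, Object... args) {
      return String.format(Locale.US, format, args);
    }

    private FormatSketch() {}
  }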
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190915643
This adds two options to ClippingMediaSource that allow proper clipping of
live streams:
1. The clipping stays fixed relative to already created media periods. That
means that playback actually progresses through the clipped media and
eventually reaches the end of the clipping. The window is also marked
as non-dynamic to let playback end in this case.
2. A clipping duration can be specified relative to the default position, making
it possible to set how much of the live stream should be played.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190911049
*** Reason for rollback ***
b/76391022 was caused by a timestamp correction in StabilizableSimpleExoPlayer which will be fixed with this CL.
*** Original change description ***
Automated g4 rollback of changelist 189570277.
*** Reason for rollback ***
causes b/76391022, motion still playback in Photos is broken
*** Original change description ***
Used fixed time frame in clipping media period.
Currently, whenever the clipping is updated, we move the time frame of the
clipped period to start at 0. This causes problems when we are already playing
this period and the renderer position no longer matches the stream
positions.
This change keeps the time frame of the...
***
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190906020
- Ensure that no memory is used by audio processors that are always inactive, by
only allocating in flush() if active (see the sketch after this list). If data
was already allocated but a processor becomes inactive, we assume the allocation
may be needed in the future and do not remove it (e.g., in the case of
ResamplingAudioProcessor).
- Make SilenceSkippingAudioProcessor set up its buffers in flush(), and clarify
that it is always necessary to call flush() if configure() returns true.
- Make reset() reset all state for all processors.
- Use @Nullable state or empty arrays for inactive audio processor buffers.
- Miscellaneous style/consistency cleanup.
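A simplified sketch of the lazy-allocation point above (placeholder names, not the
actual AudioProcessor code):

  import java.nio.ByteBuffer;

  final class LazyBufferSketch {
    private static final ByteBuffer EMPTY_BUFFER = ByteBuffer.allocate(0);

    private ByteBuffer buffer = EMPTY_BUFFER;
    private boolean active;

    void configure(boolean active) {
      this.active = active;
    }

    void flush() {
      if (active && buffer == EMPTY_BUFFER) {
        buffer = ByteBuffer.allocateDirect(4096); // Allocate only once actually needed.
      }
      // If inactive, keep any previous allocation in case the processor becomes active again.
    }

    void reset() {
      buffer = EMPTY_BUFFER; // reset() releases all state.
      active = false;
    }
  }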
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190895783
This uses a simple threshold-based algorithm for classifying audio frames as
silent, and removes silences from input audio that last longer than a given
duration.
The plan is to expose this functionality via PlaybackParameters in a later
change.
Issue: #2635
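A minimal sketch of such a threshold-based classification for 16-bit little-endian
PCM frames (the threshold value and names are illustrative only, not the actual
SilenceSkippingAudioProcessor code):

  final class SilenceClassifierSketch {

    private static final int SILENCE_THRESHOLD_LEVEL = 1024; // Assumed threshold, ~3% of full scale.

    /** Returns whether the frame starting at frameOffset (in bytes) is silent on every channel. */
    static boolean isSilentFrame(byte[] pcm, int frameOffset, int channelCount) {
      for (int c = 0; c < channelCount; c++) {
        int index = frameOffset + 2 * c;
        short sample = (short) ((pcm[index] & 0xFF) | (pcm[index + 1] << 8)); // Little-endian.
        if (Math.abs(sample) >= SILENCE_THRESHOLD_LEVEL) {
          return false;
        }
      }
      return true;
    }

    private SilenceClassifierSketch() {}
  }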
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190737027
They don't really belong there; it was basically a convenience
thing where one of the arguments to the track selector was being
packaged up in the result to avoid having to hold a separate
reference to it.
This change is being made as a precursor to a subsequent change
where creating the TrackSelectorResult will move from
MappingTrackSelector to DefaultTrackSelector. DefaultTrackSelector
doesn't currently have access to the un-mapped tracks, and so is
unable to create a TrackSelectorResult. It's IMO preferable to
keep it that way rather than passing them down just so they can
be included in the result.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190640594
*** Reason for rollback ***
causes b/76391022, motion still playback in Photos is broken
*** Original change description ***
Used fixed time frame in clipping media period.
Currently, whenever the clipping is updated, we move the time frame of the
clipped period to start at 0. This causes problems when we are already playing
this period and the renderer position no longer matches the stream
positions.
This change keeps the time frame of the clipped media period as it is and
instead specifies the offset of the window in the period.
***
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190628272
Currently, MediaPeriod states that continueLoading may be called
during preparation. Some implementations would throw an error if
this happened.
Also make MediaPeriod documentation clearer.
-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=190596870