This currently causes the test to fail on a Pixel 6 Pro running a recent S
build (SQ1D.220205.004).
There is no need to test audio transcoding while we are measuring video
quality.
PiperOrigin-RevId: 435635314
ExternalCopyFrameProcessor's output dimensions match the input size, not the
output size, so the intermediate texture size should match the input size.
Also rename configureOutputDimensions to configureOutputSize.
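A minimal sketch of the sizing relationship, with illustrative names (not the
actual implementation):

```java
import android.util.Size;

/** Illustrative sketch only: the external copy step preserves the input size. */
final class ExternalCopySizingSketch {

  /**
   * The external copy only moves the decoder's external texture into a regular GL
   * texture, so its output size equals its input size; any scaling happens in later
   * frame processors.
   */
  static Size configureOutputSize(int inputWidth, int inputHeight) {
    return new Size(inputWidth, inputHeight);
  }

  /** The intermediate texture should therefore be allocated at the input size. */
  static Size intermediateTextureSize(int inputWidth, int inputHeight) {
    return configureOutputSize(inputWidth, inputHeight);
  }
}
```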
PiperOrigin-RevId: 435058789
Use android.util.Size, whose naming is much easier to understand than Pair<Integer, Integer>, in both FrameProcessor and EncoderUtil.
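For illustration (not code from the CL), the readability difference at a call
site:

```java
import android.util.Pair;
import android.util.Size;

/** Illustration only, not code from the CL. */
final class SizeVsPairExample {
  static void example() {
    // Before: it is not obvious at the call site which element is width and which is height.
    Pair<Integer, Integer> dimensions = Pair.create(1920, 1080);
    int widthFromPair = dimensions.first;

    // After: the accessors are self-describing.
    Size size = new Size(1920, 1080);
    int width = size.getWidth();
    int height = size.getHeight();
  }
}
```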
PiperOrigin-RevId: 434813986
* Move auto-adjustments for transformation matrices from the
VideoTranscodingSamplePipeline constructor to the new
ScaleToFitFrameProcessor.
* Add GlFrameProcessor#getOutputDimensions() to allow for GlFrameProcessors with
different input and output dimensions (see the sketch after this list). This is
a prerequisite for Presentation.
* Tested with unit tests (and manually just in case).
* A follow up CL will change the FrameProcessor input to be the scale and
rotate values requested by the user. This was kept out of this CL to reduce CL
review size. Presentation will also be implemented in a follow up CL.
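A rough sketch of the interface shape this enables; names and signatures are
approximations, not the exact CL:

```java
import android.util.Size;

/** Approximate sketch of a frame processor whose output size can differ from its input. */
interface GlFrameProcessorSketch {

  /** Sets up sizes and GL objects; called once before frames are processed. */
  void initialize(int inputWidth, int inputHeight);

  /**
   * Returns the output dimensions, which may differ from the input dimensions
   * (e.g. after scale-to-fit adjustments, or a Presentation resolution change).
   */
  Size getOutputDimensions();

  /** Draws the current input frame into the output. */
  void drawFrame(long presentationTimeUs);
}
```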
PiperOrigin-RevId: 434774854
If an OpenGL call blocks because the encoder's input surface is full, this will
now block the background thread, while the main thread can continue querying
encoder output and freeing up encoder capacity until the encoder accepts more
input, unblocking the background thread.
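A simplified sketch of this threading pattern, with illustrative names (the
real pipeline is more involved):

```java
import android.media.MediaCodec;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/** Illustrative sketch of keeping the potentially blocking GL call off the main thread. */
final class BlockingRenderSketch {

  private final ExecutorService glExecutor = Executors.newSingleThreadExecutor();

  /**
   * Submits the GL draw/swap call, which may block while the encoder's input surface is
   * full, to a background thread so that the caller can keep draining encoder output.
   */
  void renderFrameAsync(Runnable drawFrameAndSwapBuffers) {
    glExecutor.execute(drawFrameAndSwapBuffers);
  }

  /**
   * Run on the main thread: releasing encoder output buffers frees encoder capacity,
   * which eventually lets the encoder accept more input and unblocks the GL thread.
   */
  void drainEncoder(MediaCodec encoder, MediaCodec.BufferInfo bufferInfo) {
    int outputIndex = encoder.dequeueOutputBuffer(bufferInfo, /* timeoutUs= */ 0);
    while (outputIndex >= 0) {
      // A real implementation would write the buffer to the muxer here.
      encoder.releaseOutputBuffer(outputIndex, /* render= */ false);
      outputIndex = encoder.dequeueOutputBuffer(bufferInfo, /* timeoutUs= */ 0);
    }
    // Format/buffer change return values are ignored for brevity.
  }
}
```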
PiperOrigin-RevId: 433283287
This will allow for easier customisation of the additional tasks
performed by the test runner, such as calculating metrics like SSIM.
PiperOrigin-RevId: 432434850
This test covers the same cases as the FrameEditorDataProcessingTest, as
currently the main FrameEditor functionality is to apply a transformation
matrix using a TransformationFrameProcessor.
PiperOrigin-RevId: 431642066
We use SSIM to measure the transcoding quality between the input and output
videos. SSIM is a widely used metric that compares the luma channel of two
images and generates a score from 0 to 1 indicating "how similar" the two
images are.
In `SsimHelper`, we decode the two videos, extract matching frames and
calculate the mean SSIM (SSIM averaged over all matching frames) between the
two videos. Matching frames are referred to as "comparisonFrame"s in the CL;
they are selected based on the frame number and a user-set comparison interval.
For instance, if the interval is 7, then every seventh frame is compared.
We use MediaCodec/MediaExtractor to decode the videos, and ImageReader to
extract the decoded frames.
The SSIM calculation logic is inspired by and modified from the CTS
[MSSIMMatcher](https://cs.android.com/android/platform/superproject/+/master:cts/tests/tests/uirendering/src/android/uirendering/cts/bitmapcomparers/MSSIMComparer.java;l=1?q=mssimcom),
which has some errors and extra features we don't need (like handling RGB
images).
Adds TranscodeQualityTest to ensure high quality transcoding.
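For reference, a self-contained sketch of the luma-only SSIM logic described
above, using the standard SSIM constants; this is not the exact `SsimHelper`
code:

```java
/** Illustrative luma-only SSIM, computed over whole frames rather than small windows. */
final class SsimSketch {

  private static final double K1 = 0.01;
  private static final double K2 = 0.03;
  private static final double PIXEL_MAX = 255; // Dynamic range of 8-bit luma.
  private static final double C1 = (K1 * PIXEL_MAX) * (K1 * PIXEL_MAX);
  private static final double C2 = (K2 * PIXEL_MAX) * (K2 * PIXEL_MAX);

  /** Frames are only compared every {@code comparisonInterval} frames. */
  static boolean isComparisonFrame(int frameIndex, int comparisonInterval) {
    return frameIndex % comparisonInterval == 0;
  }

  /** Returns the SSIM of two equally sized 8-bit luma frames (1 means identical). */
  static double ssim(byte[] referenceLuma, byte[] distortedLuma) {
    double meanX = mean(referenceLuma);
    double meanY = mean(distortedLuma);
    double varianceX = 0;
    double varianceY = 0;
    double covariance = 0;
    for (int i = 0; i < referenceLuma.length; i++) {
      double dx = (referenceLuma[i] & 0xFF) - meanX;
      double dy = (distortedLuma[i] & 0xFF) - meanY;
      varianceX += dx * dx;
      varianceY += dy * dy;
      covariance += dx * dy;
    }
    varianceX /= referenceLuma.length;
    varianceY /= distortedLuma.length;
    covariance /= referenceLuma.length;
    return ((2 * meanX * meanY + C1) * (2 * covariance + C2))
        / ((meanX * meanX + meanY * meanY + C1) * (varianceX + varianceY + C2));
  }

  private static double mean(byte[] luma) {
    long sum = 0;
    for (byte b : luma) {
      sum += b & 0xFF;
    }
    return (double) sum / luma.length;
  }
}
```

The mean SSIM is then the average of `ssim()` over all comparison frames.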
PiperOrigin-RevId: 430951206
All (later customizable) GlFrameProcessors after the
ExternalCopyFrameProcessor receive their input from a normal OpenGL texture,
not an external texture, so they won't need to worry about the
textureTransformMatrix.
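A small illustration (not from the CL) of why only the first processor needs
the matrix:

```java
import android.graphics.SurfaceTexture;
import android.opengl.Matrix;

/** Illustration only: where the texture transform matrix matters. */
final class TextureTransformSketch {

  /** Only the first processor, sampling the external texture, needs the real matrix. */
  static float[] externalTextureTransform(SurfaceTexture surfaceTexture) {
    float[] textureTransformMatrix = new float[16];
    surfaceTexture.getTransformMatrix(textureTransformMatrix);
    return textureTransformMatrix;
  }

  /** Later processors sample a regular GL texture, so an identity matrix is enough. */
  static float[] identityTextureTransform() {
    float[] identity = new float[16];
    Matrix.setIdentityM(identity, 0);
    return identity;
  }
}
```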
PiperOrigin-RevId: 430165652
There are two major blockers to this test:
- H265 muxing is not available for API<24, so setting video mimeType to H265
will fail on those devices.
- AMR audio encoding is buggy on some devices and is not a widely used format.
The solution: use a video that is encoded with AVC/MP3, to ensure transcoding
to AVC/AAC.
PiperOrigin-RevId: 429648598
Re-enable tests that have no muxer support for timestamps going backwards.
Tests running on the B-frame sample will be added in a future commit.
#mse-bug-week
PiperOrigin-RevId: 429599177
Tested:
Verified that the additional information is available through
instrumentation tests, as well as via manual testing.
#mse-bug-week
PiperOrigin-RevId: 429038695
Fallback is only disabled for robolectric and instrumentation tests.
For MH tests, fallback is not disabled, as it may be needed due to
the broad range of devices available.
PiperOrigin-RevId: 426403167
Tested by confirming transformations still work and write to an output file in a
scoped-storage directory on a:
* Nexus 6P API 23 emulator
* Google Pixel 4 API 31 physical device
PiperOrigin-RevId: 425644266
- Add a checkbox in the demo app to enable experimental HDR editing.
- Add an `experimental_` method to `TransformationRequest` to enable HDR editing (see the sketch after this list).
- Add fragment/vertex shaders for the experimental HDR pipeline. The main difference compared to the existing shaders is that we sample from the decoder in YUV rather than RGB (because the YUV -> RGB conversion in the graphics driver is not precisely defined, so we need to do this to get consistent results), which requires the use of ES 3, and then do a crude YUV -> RGB conversion in the shader (ignoring the input color primaries for now).
- When HDR editing is enabled, we force using `FrameEditor` (no passthrough) to avoid the need to select another edit operation, and use the new shaders. The `EGLContext` and `EGLSurface` also need to be set up differently for this path.
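A sketch of the opt-in from the app's side; the package and the exact
`experimental_` method name are assumptions here:

```java
// Both the package and the exact experimental_ method name are assumptions; the CL
// only states that an experimental_ method exists on TransformationRequest.
import androidx.media3.transformer.TransformationRequest;

final class HdrEditingOptIn {
  static TransformationRequest buildHdrEditingRequest() {
    return new TransformationRequest.Builder()
        .experimental_setEnableHdrEditing(true) // Assumed name for the experimental toggle.
        .build();
  }
}
```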
PiperOrigin-RevId: 425570639
TransformerTest sounds like a unit test for Transformer, but these tests
cover behaviour that involves multiple stages of the pipeline.
PiperOrigin-RevId: 425378369
Previously, frames could get stuck in the FrameEditor's input Surface and
never be fed to the encoder if either
a) the end-of-stream buffer arrived with a frame rather than an empty buffer,
or
b) processDataV29() rendered several decoder output buffers to the
FrameEditor's input Surface immediately before encountering the EOS flag.
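A heavily simplified, hypothetical sketch of the required behaviour (the real
pipeline tracks pending frames differently):

```java
import android.media.MediaCodec;
import java.util.ArrayDeque;
import java.util.Queue;

/** Hypothetical sketch: drain pending editor frames before signaling end of stream. */
final class EndOfStreamSketch {

  private final Queue<Long> pendingFrameTimesUs = new ArrayDeque<>();

  /** Called for each decoder output buffer rendered to the FrameEditor's input Surface. */
  void onFrameRenderedToEditorInput(long presentationTimeUs) {
    pendingFrameTimesUs.add(presentationTimeUs);
  }

  /**
   * Processes frames still sitting on the editor's input Surface, then signals end of
   * stream to the encoder only once nothing is pending, so frames that arrive together
   * with (or just before) the EOS flag are still encoded.
   */
  void drainEditorThenMaybeSignalEndOfStream(
      MediaCodec encoder, Runnable processOneEditorFrame, boolean inputStreamEnded) {
    while (!pendingFrameTimesUs.isEmpty()) {
      processOneEditorFrame.run(); // Hypothetical: draws one pending frame to the encoder input.
      pendingFrameTimesUs.remove();
    }
    if (inputStreamEnded) {
      encoder.signalEndOfInputStream();
    }
  }
}
```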
PiperOrigin-RevId: 424898820
This test checks that all frames are processed when transcoding video to a
different sample MIME type (and that the transformation completes
successfully).
PiperOrigin-RevId: 424896014
Expected images are taken on emulators, so a larger difference from the
expected images must be tolerated on physical devices.
PiperOrigin-RevId: 421543441