RFR: 8282999: Add support for EXT-X-MEDIA tag in HTTP Live Streaming [v4]
Alexander Matveev
almatvee at openjdk.org
Wed May 8 22:02:21 UTC 2024
> - Added support for the #EXT-X-MEDIA tag to HTTP Live Streaming.
> - The following audio renditions via the #EXT-X-MEDIA tag will be supported (see the CSR for more details):
> - MP2T streams with one H.264/AVC video track and an elementary AAC audio stream via the #EXT-X-MEDIA tag.
> - fMP4 streams with one H.264/AVC or H.265/HEVC video track and an elementary AAC audio stream via the #EXT-X-MEDIA tag.
> - fMP4 streams with one H.264/AVC or H.265/HEVC video track and fMP4 streams with one AAC audio track via the #EXT-X-MEDIA tag.
> - The separate audio stream will be played back via a separate chain of GStreamer elements inside one pipeline. This means two "javasource" elements will be used inside one pipeline, each reading data independently via its own HLSConnectionHolder. GStreamer will handle audio and video synchronization based on PTS, as for other streams. Other solutions were considered, such as one "javasource" with multiple source pads, but such an implementation would be more complex and would not provide any benefits.
> - The HLSConnectionHolder that handles the video stream will also parse all information for the separate audio stream and then create a child HLSConnectionHolder, which will be responsible for downloading audio segments and for seeking in the audio stream.
> - The parser in HLSConnectionHolder was reworked to make it more readable and easier to maintain and extend.
> - The JavaDoc was updated to point to the latest HLS specification instead of the old draft, and was extended with information on the #EXT-X-MEDIA tag. Also added missing information on AAC elementary streams and fMP4 from previous fixes.
> - Fixed and improved debug output in the Linux AV plugins.
> - Added a new property to "dshowwrapper" to disable the PTS reset for each new segment, since with the #EXT-X-MEDIA tag audio and video segments are not aligned and can start at different times.
> - Fixed missing PTS on the first buffer after a seek in the MP2T demuxer in "dshowwrapper". Without this fix, audio and video synchronization breaks with two separate streams.
> - Removed dead code from MediaManager.
> - Added handling for GST_MESSAGE_LATENCY. Based on the GStreamer documentation, gst_bin_recalculate_latency() should be called when such a message is received. It is not certain that this is strictly required, but with separate video and audio streams this message is received after a seek, most likely because video and audio are not perfectly aligned at the seek point. For other streams this message is not received in most cases.
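For reference, a master playlist that declares a separate audio rendition via #EXT-X-MEDIA looks roughly like the following (the URIs, group IDs, and bandwidth values here are illustrative only, not taken from this patch or its tests):

```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aac",NAME="English",DEFAULT=YES,AUTOSELECT=YES,LANGUAGE="en",URI="audio/en/prog_index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2560000,CODECS="avc1.640028,mp4a.40.2",AUDIO="aac"
video/high/prog_index.m3u8
```

The AUDIO attribute of #EXT-X-STREAM-INF references the GROUP-ID of the #EXT-X-MEDIA line, which is what lets the player fetch video and audio segments from two independent media playlists.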
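The reworked parser itself is internal to HLSConnectionHolder and is not shown in this summary. As a standalone sketch only (the class and method names below are hypothetical, not from the patch), splitting the attribute list of an #EXT-X-MEDIA line requires tracking quote state, because quoted values such as NAME may themselves contain commas:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch, not the jfx implementation: parse the attribute
// list of an #EXT-X-MEDIA line. A naive split(",") is wrong because
// quoted-string values may contain commas, so we scan character by
// character and only split on commas outside quotes.
public class ExtXMediaParser {
    private static final String TAG = "#EXT-X-MEDIA:";

    public static Map<String, String> parse(String line) {
        if (!line.startsWith(TAG)) {
            throw new IllegalArgumentException("Not an EXT-X-MEDIA line");
        }
        Map<String, String> attrs = new LinkedHashMap<>();
        String body = line.substring(TAG.length());
        int start = 0;
        boolean inQuotes = false;
        for (int i = 0; i <= body.length(); i++) {
            if (i == body.length() || (body.charAt(i) == ',' && !inQuotes)) {
                String pair = body.substring(start, i);
                int eq = pair.indexOf('=');
                if (eq > 0) {
                    String key = pair.substring(0, eq).trim();
                    String value = pair.substring(eq + 1).trim();
                    // Strip surrounding quotes from quoted-string values.
                    if (value.length() >= 2 && value.startsWith("\"")
                            && value.endsWith("\"")) {
                        value = value.substring(1, value.length() - 1);
                    }
                    attrs.put(key, value);
                }
                start = i + 1;
            } else if (body.charAt(i) == '"') {
                inQuotes = !inQuotes;
            }
        }
        return attrs;
    }

    public static void main(String[] args) {
        String line = "#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID=\"aac\","
                + "NAME=\"English\",DEFAULT=YES,URI=\"audio/en.m3u8\"";
        Map<String, String> a = parse(line);
        System.out.println(a.get("TYPE") + " " + a.get("GROUP-ID")
                + " " + a.get("URI"));
    }
}
```

A parser along these lines would hand the URI of the selected audio rendition to the child connection holder described above; the real code in the patch may structure this very differently.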
Alexander Matveev has updated the pull request incrementally with one additional commit since the last revision:
8282999: Add support for EXT-X-MEDIA tag in HTTP Live Streaming [v3]
-------------
Changes:
- all: https://git.openjdk.org/jfx/pull/1435/files
- new: https://git.openjdk.org/jfx/pull/1435/files/84103bd4..4e2e1572
Webrevs:
- full: https://webrevs.openjdk.org/?repo=jfx&pr=1435&range=03
- incr: https://webrevs.openjdk.org/?repo=jfx&pr=1435&range=02-03
Stats: 1198 lines in 2 files changed: 284 ins; 296 del; 618 mod
Patch: https://git.openjdk.org/jfx/pull/1435.diff
Fetch: git fetch https://git.openjdk.org/jfx.git pull/1435/head:pull/1435
PR: https://git.openjdk.org/jfx/pull/1435
More information about the openjfx-dev
mailing list