Topic: Using Opus in broadcasting

Using Opus in broadcasting

Reply #1
AFAICT you would need it to appear in TS 101 154, EN 300 468, and be adopted by silicon vendors. Those first two documents are written within DVB, published as DVB Bluebooks, and then tweaked + adopted through ETSI (at which point the Bluebook versions are withdrawn).

"Processors on modern set-top-boxes are more than powerful enough to decode Opus so it’s unlikely that dedicated hardware is needed." - the first part is true, but such devices still include dedicated audio decoders/DSP because those powerful general processors have other things to do.

I have no idea whether the Opus approach of stacking pairs of channels to deliver 5.1 etc can be competitive with, e.g. Dolby AC-4.

Cheers,
David.

Using Opus in broadcasting

Reply #2
AFAICT you would need it to appear in TS 101 154, EN 300 468, and be adopted by silicon vendors. Those first two documents are written within DVB, published as DVB Bluebooks, and then tweaked + adopted through ETSI (at which point the Bluebook versions are withdrawn).

"Processors on modern set-top-boxes are more than powerful enough to decode Opus so it’s unlikely that dedicated hardware is needed." - the first part is true, but such devices still include dedicated audio decoders/DSP because those powerful general processors have other things to do.


First part is correct. However, to begin with we are using it for contribution where we control the entire chain.

They contain dedicated audio decoders/DSP because existing codec manufacturers don't want to license codecs as pure software on STBs. Opus decoding is a drop in the ocean. Android set top boxes already have many codecs built in as software.

Using Opus in broadcasting

Reply #3
Broadcasting in Opus would be awesome

At what rates is Dolby AC-4 considered transparent and where can we test out some AC-4 encoders?
2 × 96 kbps stereo + 64 kbps mono = 256 kbps. I think Opus can be competitive.
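As a sanity check on that budget (my own arithmetic, not from any spec), stacking two coupled stereo streams and one mono stream at those rates:

```python
# Hypothetical bitrate budget for a 5.1-style stack of Opus streams:
# two coupled stereo pairs at 96 kbps each, plus one mono stream at 64 kbps.
streams = [("stereo pair", 96), ("stereo pair", 96), ("mono", 64)]
total_kbps = sum(rate for _, rate in streams)
print(total_kbps)  # 256
```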

Using Opus in broadcasting

Reply #4
Quote
This is where Opus comes in – it’s a low-delay, royalty-free audio codec designed for a wide range of audio applications and is used in Skype, WebRTC and on the Playstation 4.


Used in Skype? I wasn't aware they had rolled it out yet. I only know of the old, incompatible SILK, from which part of Opus was developed. Do you have a citation for this? Wouldn't mind adding it to Wikipedia if backed by a reliable source.
Dynamic – the artist formerly known as DickD

Using Opus in broadcasting

Reply #5
Quote
This is where Opus comes in – it’s a low-delay, royalty-free audio codec designed for a wide range of audio applications and is used in Skype, WebRTC and on the Playstation 4.


Used in Skype? I wasn't aware they had rolled it out yet. I only know of the old, incompatible SILK, from which part of Opus was developed. Do you have a citation for this? Wouldn't mind adding it to Wikipedia if backed by a reliable source.


You're right I guess - I based this statement off http://blogs.skype.com/2012/09/12/skype-an...ew-audio-codec/ but it doesn't say they specifically use Opus.

Using Opus in broadcasting

Reply #6
The article you posted also linked to a recent blog post raving about the Skype implementation of Opus for voice, with a SoundCloud demo they had recorded, only for commenters to point out that it was just the original Skype SILK running at about 96 kbps mono in each direction, with a super-wideband sampling rate and 16 kHz audio bandwidth.
Dynamic – the artist formerly known as DickD

Using Opus in broadcasting

Reply #7
There’s also a lack of metadata for features such as downmixing coefficients, which would let viewers on stereo equipment hear a proper stereo downmix of a 5.1 broadcast.  However, the current specification reserves plenty of space for these future additions.

How were you intending to go about this, exactly?  There's padding where you can store arbitrary stuff, but Jean-Marc is sensitive about people using that without IETF cooperation.
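For illustration, here is the kind of stereo downmix such coefficient metadata would drive. The 1/sqrt(2) (~0.707) centre and surround weights are the common ITU-R BS.775-style defaults, assumed here only as placeholders; real values would come from the (future) stream metadata, not be hardcoded:

```python
import math

# Default attenuation for centre and surround contributions (~0.707),
# an assumed ITU-R BS.775-style value, not anything from the Opus TS spec.
ATT = 1 / math.sqrt(2)

def downmix_5_1_to_stereo(l, r, c, lfe, ls, rs, centre=ATT, surround=ATT):
    """Fold one 5.1 sample frame down to stereo (LFE is commonly dropped)."""
    left = l + centre * c + surround * ls
    right = r + centre * c + surround * rs
    return left, right

print(downmix_5_1_to_stereo(0.5, 0.5, 0.2, 0.0, 0.1, 0.1))
```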

Using Opus in broadcasting

Reply #8
There’s also a lack of metadata for features such as downmixing coefficients, which would let viewers on stereo equipment hear a proper stereo downmix of a 5.1 broadcast.  However, the current specification reserves plenty of space for these future additions.

How were you intending to go about this, exactly?  There's padding where you can store arbitrary stuff, but Jean-Marc is sensitive about people using that without IETF cooperation.


If you read the spec draft you can see that we have reserved space for future extensions within our PES:
https://wiki.xiph.org/OpusTS



Using Opus in broadcasting

Reply #10
If you read the spec draft you can see that we have reserved space for future extensions within our PES:
https://wiki.xiph.org/OpusTS

I've never understood why the channel information is ever kept in the container.  Ogg Opus does this for multistream.


Where do you think the channel map information should be kept?

Using Opus in broadcasting

Reply #11
If you read the spec draft you can see that we have reserved space for future extensions within our PES:
https://wiki.xiph.org/OpusTS

I've never understood why the channel information is ever kept in the container.  Ogg Opus does this for multistream.


Where do you think the channel map information should be kept?

Wouldn't it make more sense for the audio stream itself to know that sort of thing?

EDIT:  I am not asserting that I am right, I am saying I don't understand.

Using Opus in broadcasting

Reply #12
If you read the spec draft you can see that we have reserved space for future extensions within our PES:
https://wiki.xiph.org/OpusTS

I've never understood why the channel information is ever kept in the container.  Ogg Opus does this for multistream.


Where do you think the channel map information should be kept?

Wouldn't it make more sense for the audio stream itself to know that sort of thing?

EDIT:  I am not asserting that I am right, I am saying I don't understand.


I basically agree. That said, it was easier to put this information in the PMT to begin with, since our streams aren't expected to change channel map.
We'll use that reserved space to put a frame accurate channel map in the future.
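For comparison, in Ogg the channel map lives in the codec-level OpusHead identification header (RFC 7845) rather than in Ogg's own container structures. A minimal parser sketch; the 5.1, mapping-family-1 test header below is hand-built purely to exercise it:

```python
import struct

def parse_opus_head(data: bytes) -> dict:
    # Identification header layout per RFC 7845, section 5.1.
    if data[:8] != b"OpusHead":
        raise ValueError("not an OpusHead packet")
    version, channels = data[8], data[9]
    pre_skip, rate = struct.unpack_from("<HI", data, 10)
    gain, family = struct.unpack_from("<hB", data, 16)
    info = {"version": version, "channels": channels, "pre_skip": pre_skip,
            "input_rate": rate, "output_gain_q8": gain,
            "mapping_family": family}
    if family != 0:
        # Multistream case: stream/coupled counts plus a channel mapping table.
        streams, coupled = data[19], data[20]
        info.update(streams=streams, coupled=coupled,
                    mapping=list(data[21:21 + channels]))
    return info

# Hand-built 6-channel (5.1) header: 4 streams, 2 coupled, family 1.
head = (b"OpusHead" + bytes([1, 6]) + struct.pack("<HI", 312, 48000)
        + struct.pack("<hB", 0, 1) + bytes([4, 2])
        + bytes([0, 4, 1, 2, 3, 5]))
print(parse_opus_head(head))
```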

Using Opus in broadcasting

Reply #13
Patches that supposedly add support for Opus in MPEG-TS landed in FFmpeg in October:

https://git.videolan.org/?p=ffmpeg.git;a=co...309d2db35183635
https://git.videolan.org/?p=ffmpeg.git;a=co...070c580c4e4ece0

Muxing seems to work, but demuxing fails: libavformat can't recognize the stream.

@kierank
Can you confirm that demuxing does not work?



Using Opus in broadcasting

Reply #16
@kierank
Can you confirm that demuxing does not work?


You mean demuxing works and muxing works, right?

What streams are you using?


I mean, creating mpegts files/streams seems to work (no errors).
But when I try to demux those files/streams, Opus is not detected
and playback fails.

Here is a shortened debug output.

Code:
Opening an input file: t2.ts.
[mpegts @ 0xfbc6a0] Format mpegts probed with size=2048 and score=100
[mpegts @ 0xfbc6a0] stream=0 stream_type=6 pid=100 prog_reg_desc=
[mpegts @ 0xfbc6a0] Before avformat_find_stream_info() pos: 0 bytes read:32768 seeks:0
[mpegts @ 0xfbc6a0] parser not found for codec none, packets or times may be invalid.
[mpegts @ 0xfbc6a0] probing stream 0 pp:2500
[mpegts @ 0xfbc6a0] Probe with size=2790, packets=1 detected mp3 with score=1
[mpegts @ 0xfbc6a0] probing stream 0 pp:2499
[mpegts @ 0xfbc6a0] Probe with size=5616, packets=2 detected mp3 with score=1
[mpegts @ 0xfbc6a0] probing stream 0 pp:2498
[mpegts @ 0xfbc6a0] probing stream 0 pp:2497
[mpegts @ 0xfbc6a0] probing stream 0 pp:2496
[mpegts @ 0xfbc6a0] probing stream 0 pp:2495
[mpegts @ 0xfbc6a0] Probe with size=16689, packets=6 detected mp3 with score=1
[mpegts @ 0xfbc6a0] probing stream 0 pp:2494
[mpegts @ 0xfbc6a0] probing stream 0 pp:2493
.
.
.
[mpegts @ 0xfbc6a0] probing stream 0 pp:2478
[mpegts @ 0xfbc6a0] probing stream 0 pp:2477
[mpegts @ 0xfbc6a0] Probe with size=67017, packets=24 detected aac with score=1
[mpegts @ 0xfbc6a0] probing stream 0 pp:2476
[mpegts @ 0xfbc6a0] probing stream 0 pp:2475
[mpegts @ 0xfbc6a0] probing stream 0 pp:2474
.
.
.
[mpegts @ 0xfbc6a0] probing stream 0 pp:2113
[mpegts @ 0xfbc6a0] probing stream 0 pp:2112
[mpegts @ 0xfbc6a0] probing stream 0 pp:2111
[mpegts @ 0xfbc6a0] probing stream 0 pp:2110
[mpegts @ 0xfbc6a0] probing stream 0 pp:2109
[mpegts @ 0xfbc6a0] Probe with size=1096815, packets=2500 detected aac with score=1
[mpegts @ 0xfbc6a0] probed stream 0
[aac @ 0xfc04e0] channel element 3.14 is not allocated
[aac @ 0xfc04e0] More than one AAC RDB per ADTS frame is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
[aac @ 0xfc04e0] Sample rate index in program config element does not match the sample rate index configured by the container.
.
.


I think this line is the most relevant:
Code:
[mpegts @ 0xfbc6a0] parser not found for codec none, packets or times may be invalid.


If this is not expected behavior, I'll submit a bug report.

Quote
Here is a sample from the first test transmissions of Opus on Eutelsat:
obe.tv/Downloads/extract_b1.ts


Is that file supposed to be all nulls? I stopped after a few MiBs.


Using Opus in broadcasting

Reply #17
FFmpeg is just attempting to create a broken Opus in TS file. As usual it thinks you can map anything into anything.

No, the first 100 MB or so is nulls, IIRC.

Using Opus in broadcasting

Reply #18
FFmpeg is just attempting to create a broken Opus in TS file. As usual it thinks you can map anything into anything.

No, the first 100 MB or so is nulls, IIRC.


Aha!
Thank you for your explanation.

 

Using Opus in broadcasting

Reply #19
I thought followers of this thread might be interested in a splendid use of Opus related to broadcast and voice-over work that I recently found:

https://ipdtl.com/  and
https://now.source-elements.com/#!/ 

are both targeted at professionals as ISDN replacements, using Opus (via WebRTC in Chrome) over a broadband connection for remote contributions. It's certainly simpler than piping over oggfwd or streaming from ffmpeg for your average broadcast journalist, or anyone else really.


 