Incorrect display of DTS-HD stream format from MKV

When playing DTS-HD MA from an MKV container (with a video stream) or from MKA (audio only), the Selection Properties UI element displays Codec: DTS and Encoding: lossy.
For MKV, the displayed bitrate appears to be cumulative with the video stream.
The same stream demuxed to a .dts file is displayed correctly, i.e. Codec: DTS-HD (with profile) and Encoding: lossless.
Comparing playback of the streams from the .dts and the MKV (MKA) files with the foo_bitcompare plugin did not reveal any differences.
On my regular (working) foobar2000 configuration, the behavior is the same.

Test configuration:
- Win7 SP1 x64;
- foobar2000 1.4 beta 17 (portable);
- user-components folder:
   foo_bitcompare 2.1.1;
   foo_input_dts 0.5.4.

Components report:



Re: Incorrect display of DTS-HD stream format from MKV

Reply #1
@kode54, can you please look into this? Is it possible to make foo_input_dts display the correct info about DTS-HD when it is packed in MKV?

 

Re: Incorrect display of DTS-HD stream format from MKV

Reply #2
It's a limitation of the packet_decoder interface. I'm only allowed to detect the format from the first packet, but DTS-HD requires at least two packets, since the first packet, as far as the container is concerned, is the DTS compatibility (core) header. It doesn't help that the input packets are interleaved with DTS lossy data, which is also essential to the decoding process.
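
For illustration only, here is a minimal, hypothetical C++ sketch of why single-packet detection ends up reporting plain DTS. It is not the actual foobar2000 SDK packet_decoder API or foo_input_dts code; the helper names contains_sync and detect_dts_hd are made up for this example. The sync word values are from the DTS specification: the lossy core frame uses 0x7FFE8001, while the DTS-HD extension substream uses 0x64582025 and, as described above, only turns up once more than the first container packet has been examined.

#include <cstddef>
#include <cstdint>
#include <vector>

// Sync words from the DTS specification.
constexpr uint32_t DTS_CORE_SYNC = 0x7FFE8001; // lossy core frame
constexpr uint32_t DTS_EXSS_SYNC = 0x64582025; // DTS-HD extension substream

// Scan a packet for a 32-bit big-endian sync word.
static bool contains_sync(const uint8_t *data, size_t size, uint32_t sync) {
    for (size_t i = 0; i + 4 <= size; ++i) {
        uint32_t v = (uint32_t(data[i]) << 24) | (uint32_t(data[i + 1]) << 16) |
                     (uint32_t(data[i + 2]) << 8) | uint32_t(data[i + 3]);
        if (v == sync) return true;
    }
    return false;
}

// A detector restricted to packets.front() would only ever see the core sync
// (the "compatibility header") and report lossy DTS; the extension substream
// sync only appears once later packets are also inspected.
static bool detect_dts_hd(const std::vector<std::vector<uint8_t>> &packets) {
    for (const auto &p : packets)
        if (contains_sync(p.data(), p.size(), DTS_EXSS_SYNC)) return true;
    return false;
}

In other words, a format probe that is handed only the first packet has no way to distinguish DTS-HD MA from plain DTS in this layout, which matches the behavior reported above.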