
Recent Posts

1
In my case, I changed from EVENT mode to PUSH mode and it started working again. I'm never really sure what the difference is between these modes, but usually one of them will work  ;)
2
Support - (fb2k) / Re: need WASAPI help
Last post by greeneye48 -
Hi - new Win10 PC running the latest Creators Update, v1703.  Using the new native Windows 10 USB audio drivers and selecting WASAPI Event Style, music plays through my USB DAC and sounds fine, but the progress bar does not move.

Yes, same problem. Also the elapsed time doesn't update, and the frequency range visualization doesn't change. I reported this under another post (Playback doesn't follow cursor, and now the reverse). It's REALLY difficult to use foobar when I can't stop and reset to a certain point, etc. Still haven't seen any responses.

Hmmm... changed Event to Push mode and it works as before....
3
@ IgorC: I did that. Just finished.
I used opus-tools-buildC.zip from your link some posts ago.

I started with herding_calls using --bitrate 96. I was not able to ABX it. For comparison I used --bitrate 48, and though I succeeded, I'd call even the --bitrate 48 result pretty good.

I continued with trumpet and could ABX it using --bitrate 96. I could not ABX it using --bitrate 128.

I was not able to ABX trumpet_myPrince using --bitrate 96. Using --bitrate 48 I could, but even the --bitrate 48 result is acceptable.

I could ABX lead_voice using --bitrate 96, but not using --bitrate 128 (though I got to 6/8, so there is a chance I could ABX it with a lot more ABXing pain. But some years ago I decided not to do extreme ABXing any more: no fun and questionable significance for practical listening).

The weak point comes with harp40_1. I'd call the --bitrate 96 result pretty annoying, and also the --bitrate 128 result is easy to ABX.

With the exception of harp40_1 I'm really impressed by the quality of the --bitrate 128 and also the --bitrate 96 results.

Some remarks:
  • Opus seems to have a good sense for critical situations: the bitrates for these difficult samples were quite a bit higher than the average bitrate for the respective bitrate settings.
  • For my test set of various kinds of pop music the average bitrate was ~9 per cent higher than the --bitrate value (for --bitrate 48, 64, 96, 128).
  • For my test set of various pop music the Opus track peak value was very often something like 1.36. How can I make sure that clipping is avoided on any device I'd like to play the Opus file with? (See the sketch just below this list.)
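As a rough sketch of one way to handle the clipping question above (assuming numpy and the soundfile package; the file names are placeholders, and this shows only the arithmetic, not the only approach; players that honour the OpusHead output-gain field or R128 tags are an alternative):

Code
import numpy as np
import soundfile as sf   # assumed dependency: pip install soundfile

decoded_peak = 1.36          # track peak reported for the encoded file
headroom_db = 0.5            # extra safety margin

# attenuation needed so the post-encode peak lands just under full scale
atten_db = -(20 * np.log10(decoded_peak) + headroom_db)
gain = 10 ** (atten_db / 20)
print(f"apply {atten_db:.2f} dB ({gain:.3f}x) before encoding")

# apply the gain to the source and feed the result to opusenc as usual
audio, rate = sf.read("source.wav", dtype="float32")
sf.write("source_attenuated.wav", audio * gain, rate)

Pre-attenuating the source is the most device-independent option, since it does not rely on the player applying any gain metadata; the small headroom allows for the fact that a lossy encode can overshoot the original peak slightly.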


4
Observation: MOS scores are on par; however, build C was significantly better on two critical samples (Castanets and Linchpin): better handling of HF pre-echo on the castanets' clicking, and better on the guitar of the Linchpin sample (more detail and less distortion on both the guitars and the bass drum hits).

All in all, for me Build C is the way to go at 64 and 96 kbps.  Absolutely yes.
Thanks a lot for taking the time to test this. I just released 1.2-beta, incorporating this change. To be more precise, the change I made in build C is fully enabled below 64 kb/s and then gradually phased out between 64 kb/s and 80 kb/s. For now I didn't want to change the behaviour at high bit-rate just to be on the safe side (it's already high quality so I want to make sure I don't make anything worse).
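To illustrate the kind of ramp described here (a toy sketch of the interpolation only, not the actual libopus code or its variable names):

Code
# the tweak is fully on at or below 64 kb/s, fades linearly to zero by
# 80 kb/s, and stays off above that
def tweak_strength(bitrate_kbps: float) -> float:
    if bitrate_kbps <= 64:
        return 1.0
    if bitrate_kbps >= 80:
        return 0.0
    return (80 - bitrate_kbps) / (80 - 64)

for br in (48, 64, 72, 80, 96):
    print(br, tweak_strength(br))   # 1.0, 1.0, 0.5, 0.0, 0.0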
5
Support - (fb2k) / Re: mactype incompatibility
Last post by スラッシュ -
Ah never mind, turns out I could apply an unofficial MacType patch that seemingly fixes things! Disregard...  :))
6
Thank you for this update! The frame no longer resizes, and keeping the image window open is a great improvement!

A small suggestion: is it possible to remove the top bar from the video window? The one that has the Chimera (EVR) title and the minimize/maximize/close buttons.



Thanks again :)
7
@Lumitopia here's an x64 build of FFmpeg N-86248-gcfec0d6475 with @atomnuker's psychoacoustics patch included, along with libopus git master. It's built using media-autobuild_suite (MSYS2 + MinGW + GCC 6.3).
8
Opus / Re: Opusenc's built-in resampler
Last post by saratoga -
I think if your definition of audio quality does not relate to audibility then it is not a useful thing to quantify.
9
Opus / Re: Opusenc's built-in resampler
Last post by [JAZ] -
The 20 kHz "filter" isn't a filter. It's the effect of the MDCT when you don't code the highest frequencies. It's all on purpose. What you see is MDCT leakage, and it's part of every codec, though there's a bit more of it in Opus due to some (quite deliberate) trade-offs we made. Of course, we could have changed that MDCT to make the spectrogram nicer and the sound worse...
Ok, thanks. That clears up my doubts and shows that it's intended behaviour, given that it's precise enough for the intended usage.
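For anyone curious what "leakage" means here, a toy demonstration of window leakage in general (assuming numpy; this is an ordinary windowed FFT, not Opus's actual MDCT): a pure 15 kHz tone analysed through a single short, sine-windowed frame shows non-zero energy well above 20 kHz, purely because the window is finite.

Code
import numpy as np

rate = 48000
n = 960                                  # one 20 ms frame
t = np.arange(n) / rate
tone = np.sin(2 * np.pi * 15000 * t)

window = np.sin(np.pi * (np.arange(n) + 0.5) / n)   # sine window, as used around MDCTs
spectrum = np.abs(np.fft.rfft(tone * window))
spectrum_db = 20 * np.log10(spectrum / spectrum.max() + 1e-12)

freqs = np.fft.rfftfreq(n, 1 / rate)
above_20k = spectrum_db[freqs > 20000]
print(f"highest level above 20 kHz, relative to the tone: {above_20k.max():.1f} dB")

The energy above 20 kHz sits many tens of dB below the tone itself: visible on a spectrogram, but at a level that implies nothing about audible quality.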

I think trying to find problems in lossy codecs by looking at a power spectrum is a lot less logical. 
Sure... It is easier with JUnit.
But while you're JUnit-ing it, looking at the data (be it represented in numbers or in pixels) gives you a zoomed-out view to check that everything is in place.

Again, look at the scale bar before you decide a difference in quality exists. 
Quality is a quantifiable thing. If you have adequate tools, you can classify and order the tested subjects based on the obtained value.
What you actually meant to say is that I am talking about an INAUDIBLE quality difference, and I agree; that's why I've been asking whether it was intentional or an error.  jmvalin answered that.
10
DCC implemented an early kind of lossy or perceptually-based encoding.  As computational power per dollar increased, and more was discovered about the natural insensitivity of the human ear to certain sounds in certain circumstances, it was possible to have better sound quality using less space on the media. Of course, media density and flexibility also improved vastly.

If you haven't noticed, flash memory is eating optical media's lunch and doubling its already good price/performance every few years.

If you can't find what you need along these lines of data compression and media data density in current technology, maybe it's time to revisit your requirements.  It is highly unlikely that a student is going to meaningfully improve these areas.

This does not answer my question.

That's because you didn't ask the right question the first time. The second time, you did a little better.

Quote
My question was mostly asking how the data was modulated on the actual tape, since technically a DCC tape does not differ from a Cassette tape as much as they both differ from DAT and VHS tapes. The whole key was the modulation/demodulation process, which I can't seem to be able to replicate. I doubt that a DCC tape as a medium allowed for higher-bandwidth storage; it should be identical to a cassette tape.

The DCC tape was recorded using similar technology to what was used in mainframe and minicomputer data tapes of 10-20 years earlier, which I am familiar with because I used to work for IBM on those things.  It's been a long time since I worked with this technology, and my memory was refreshed by the relevant Wikipedia article, which it seems you never bothered to study.

Wikipedia.org Digital Compact Cassette

(1) The tape was recorded with signals that represented 1s and 0s, not analog signals that varied. This was enhanced by using a similar tape formulation to that used on computer data tapes.

(2) Magnetoresistive tape heads were able to achieve much higher longitudinal recording densities.  There were 9 parallel tracks on the tape, 8 for digital audio data and 1 for auxiliary information.  This differed from computer tapes, which used the 9th track for parity. I suspect that at this late date a dedicated parity track had been superseded by a more sophisticated recording technique for the 8 data tracks, possibly related to CD Audio technology, which had been in use for about a decade by then (1992).
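As a toy illustration of the 9-track computer-tape layout mentioned in (2) (one bit per data track plus a derived parity track; odd parity is assumed here, and this shows only the idea, not DCC's actual channel coding, since DCC used its 9th track for auxiliary data instead):

Code
def parity_bit(byte: int) -> int:
    # one bit of the byte goes to each of the 8 data tracks;
    # odd parity: the parity bit makes the total count of 1s odd
    data_bits = [(byte >> i) & 1 for i in range(8)]
    return 1 - (sum(data_bits) % 2)

for value in (0x00, 0x5A, 0xFF):
    print(f"data {value:08b} -> parity track bit {parity_bit(value)}")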


Irrelevant. If you look at where the market is going, namely downloads and streaming, there is no bankable demand for media with longer play times.  Downloads and streaming are selling partially based on the fact that they distribute music in other ways than long playing prepackaged media. I don't see any market for longer playtimes at all.

Quote
You are talking about today's standards, when I am talking about standards of over a decade ago, when record labels would go as far as cutting parts of a track, or making a remix shorter, to fit the maximum length of a Red Book disc.

I am unaware of that happening. If shorter versions of tracks were made, they were typically made for reasons of air time.

I've never seen a first-release pop CD that came close to filling a whole disc. Maybe some "Greatest Hits" compendiums did that, but I have a large number of those, and they typically don't come close to running 70 or 80 minutes.

The preferences of consumers today are, IME, not that much different from the way they have been all along. If they could have got just the track they wanted, they would have.  I have a few CDs with just a couple of standard-length tracks that came nowhere near using the available space.

IME as soon as MP3s became available, they were distributed as single tracks.

Quote
I have remix compilations that go even a step further and use overburning, with longer-than-allowed playtimes that get cut off on the last track during playback, mostly on older equipment that doesn't seem to cope well with such things. Funnily enough, one of the discs is a Sterling Sound reference disc that was sent to radio stations, containing the instrumentals of an album's tracks with the B-sides included too.

I understand that Sterling Sound has mastered over 27,000 CDs. If they approached or stretched some limits with a few albums, so what?  I am unaware of enough demand for longer CDs to justify any extensions to the format. Proof of that is the fact that it never happened.  Proof of the adequacy of the existing format is the fact that so few commercial CDs even came close to fully exploiting the space that was there.  I think I've seen some cases where an album was spread over 2 discs when the music could have fit on 1. This was apparently done to create perceived value.