
Topic: Why do people use WMA? How does it compare to other codecs? etc. (Read 38493 times)


Reply #25
Again, the OP asked a simple question: "why use WMA?" I presented a hypothetical situation where WMA Pro is most beneficial. If a user has a Windows PC, a Zune device, an Xbox or a Windows Phone, or all of those, they can theoretically benefit from the fact that WMA Pro is a competitive codec. Anti-Microsoft sentiment doesn't belong here.


Reply #26
What belongs in this discussion is not for you to decide.

As far as love for Apple goes, in my residence reside two Apple laptops, three iPods, two iPhones and an AirPort Extreme. My first computer was an Apple II. iTunes is installed on just about every machine I use, including those that run Windows. Still, it would be woefully incorrect to call me an Apple fanboy.


Reply #27
FFmpeg can decode WMA.


AFAIK FFmpeg still cannot properly decode WMA 10 Pro LBR (= low bit rate)


Reply #28
Still trying to figure out how I posted in error. I didn't accuse anyone of being an Apple fanboy; somebody else did. In fact, I'm fairly certain I pointed out the impartiality of the mods of HA, while unbiasedly outlining the uses of WMA/WMA Pro.



Reply #30
Ahem. While WMA Standard hasn't been updated in ages, WMA Pro is fully competitive with any modern codec on file size versus transparency. Though I can't provide actual results of extensive listening tests, it outperformed LAME, aoTuV Vorbis and FhG AAC at quality level 50 in my own listening. A user who tends to reside within the Microsoft ecosystem would benefit greatly from WMA Pro, as it plays on any Zune device, Xbox, Windows Phone or Windows PC. It's not a horrible codec to utilize at all. It would seem as though we've exposed a chink in your bipartisan armour?


Aside from actual listening tests, it outperforms others? Good argument! The most recent mass test I've seen (Sebastian's 64 kbps test, July 2007) showed WMA 10 Pro within the uncertainty limits between Nero HE-AAC and Vorbis aoTuV 5. So: competitive, but not outperforming.

WRT compatibility in the Microsoft ecosystem, they aren't great at maintaining it.

1) Microsoft came up with a DRM scheme called "PlaysForSure" for WMA, then dumped it when they came out with the Zune player. The best solution I saw from them for playing any music you might have bought on your new Zune was to burn a CD, then re-rip it to unprotected files.

2) I don't know all the details, but they updated the WMA Pro encoder (from 9 to 10) in a way that older WMA Pro players could only play the new files back at half the sampling rate. OK if you use WMP and just have to download an update; not so good if you have a portable that doesn't get updates.


Reply #31
FFmpeg can decode WMA.
Unless you're using Debian stable, perhaps.


I'm on Ubuntu 12.something, so Debian-based, playing in Amarok. When I tried a WMA Lossless file it put it on the playlist and popped up a window indicating it was searching for a compatible plugin (which failed). Perhaps a lossy, non-Pro, non-DRM WMA file would play.


Reply #32
Again, the OP asked a simple question 'why use wma'. I present a hypothetical situation where wma pro is most beneficial. If a user uses a windows pc, a zune device, xbox or windows phone


The least "hypothetical" situation is, I guess, "I have .wma files". As long as you can still play them out of the box, there is no reason to transcode. That even makes me a WMA user, sort of. I never encoded one, but I have the odd demo from a band who did. (I hated .ra so much that I don't have a single file of it in my library, though.)


Reply #33
Why not? Its compatibility and support are second only to MP3 (really, nearly universal), its player is embedded in every Windows-based computer, and at high bitrates (200 kbps+) it's as transparent as MP3. I use all Windows software, have a Windows Phone, and am thinking about a Surface. Why don't you try it and see? I think you might like it.

I think MP4/AAC is much more universal than WMA, so MP3, MP4/AAC and then WMA.


Reply #34
Quote
http://www.hydrogenaudio.org/forums/index....c=98841&hl=

Man, WMA 9 sounds like Blade at 128 kbps. And a bad codec at medium bitrates is usually bad on higher as well. Windows Media Player has an untuned, old encoder and even Microsoft pretty much silently abandoned the format. I gave WMA a chance and my ears responded "no!".


I'm not trying to start a war here.  You're entitled to your preference for a codec.  But Stevie Wonder could probably ABX a 128 kbps lossy file against any lossless codec as well as you did.


If I had any attachment space left, I would upload those samples. WMA Standard at 128 kbps absolutely messes up the percussion on the songs I encoded and warbles like a faulty cassette tape. It is quite obvious, not a subtle difference. WMA Pro is quite good, but not supported by most devices supporting WMA Standard. And even WMA Pro is only comparable to Vorbis or AAC, it does not surpass them.


Reply #35
Almost nobody here uses WMA, as the poll indicates.

For the love of god, let WMA die peacefully.


Reply #36
Not sure if hydrogenaudio users are representative.

And once a lossy format becomes sufficiently popular, it should be supported for the sake of the files that cannot be replaced. There's nothing like the same downside to killing a lossless format. (Yet bootleggers still trade their .shn files...)


Reply #37
WMA just got two full pages of people on HydrogenAudio bickering over it.

That's its biggest claim to relevancy in years.


Reply #38
WMA just got two full pages of people on HydrogenAudio bickering over it.

That's its biggest claim to relevancy in years.


Certainly more promotion for WMA in this thread than I've heard from MSFT in years. Are they still developing WMA?


Reply #39
WMA just got two full pages of people on HydrogenAudio bickering over it.

Which was never going to be particularly constructive, even if it were not the case that nobody ever even glancingly approached the OP’s actual questions.

Thankfully, it’s not all just ‘bah ms sux’ in here, and the discussion may well be useful to some, but it would be nice if someone who may have relevant knowledge could contribute some form of reply to TheSeven.

 


Reply #40
FFmpeg can decode WMA.
Unless you're using Debian stable, perhaps.


I'm on Ubuntu 12.something, so Debian-based, playing in Amarok. When I tried a WMA Lossless file it put it on the playlist and popped up a window indicating it was searching for a compatible plugin (which failed). Perhaps a lossy, non-Pro, non-DRM WMA file would play.


You're apparently running an old FFmpeg. Update, and those files will play.
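For anyone checking their own setup, here's a minimal sketch of driving FFmpeg from Python to decode a WMA file to WAV. The file names and helper functions are made up for illustration; it assumes a reasonably recent ffmpeg build (one with the native WMA Standard/Pro/Lossless decoders) is on the PATH.

```python
import shutil
import subprocess

def build_decode_cmd(src, dst):
    """Build an ffmpeg command line that decodes `src` to a WAV file.

    Recent ffmpeg builds ship native decoders for WMA Standard, WMA Pro
    and WMA Lossless, so no external codec pack should be needed.
    """
    return ["ffmpeg", "-y", "-i", src, dst]

def decode(src, dst):
    """Run the decode, but only if ffmpeg is actually installed."""
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH; install a recent build")
    subprocess.run(build_decode_cmd(src, dst), check=True)

# Hypothetical usage:
# decode("song.wma", "song.wav")
```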


Reply #41
Related thread here: http://www.hydrogenaudio.org/forums/index....mp;#entry863988

I should add that all my android devices have been able to play wma, no problem....

WMA is certainly not my first choice, but I have a small number of these files.


Reply #42
I used to work at Microsoft in the 2000s, during the height (and decline) of Windows Media.

Originally Microsoft shipped updates of Windows Media codecs as part of the Windows Media Format SDK runtime, both in-band (with Windows) and out-of-band (with various Windows Media apps, most commonly WMP). The WM Format SDK runtime itself was never made available as a standalone download package; instead it was only distributed as part of apps such as WMP, WME, Movie Maker, and the developer SDK. Windows Media Encoder 9, for example, was actually just a frontend app for codecs installed as part of the Windows Media Format runtime.

The WMF SDK runtime installer, fortunately, was smart enough never to install older binaries than the ones already existing on the system. The upside of this was that you could, for example, install a decade old WME9 encoder on Windows 7 and not have to worry about WME9 overwriting Win7 codecs with some ancient ones. Another advantage of it was that you could update XP or Server 2003 codecs simply by installing a newer version of WMP (which included WM Format runtime). Basically, newer codec DLLs always won out.

That was the case up until Windows Vista when Microsoft decided to stop shipping WM Format updates out-of-band. Starting with Windows Vista, Windows Media codecs were only updated with the operating system. Today, the latest versions of Windows Media codecs are the ones shipping with Windows 8.1 (and soon Win10), and there is no WM Format runtime installer anymore that would allow one to install them.

As if this wasn't complicated enough, the naming and DLL versioning were even worse. WMA9 was actually the third version of the WMA codec (hence it's sometimes referred to as WMA3), but was renumbered to version 9 because that's where the rest of the Windows Media ecosystem was at the time. WMA Standard progressed to versions 9.1 and 9.2, but the bitstream syntax actually remained unchanged - the dot versions implied improvements to the encoder, not any changes to the decoder. The same was true for WMA Lossless - there were enhancements made on the encoding side between 9.0 and 9.2, but the bitstream syntax remained unchanged.

WMA Professional progressed from v9 to v10, and the changes happened on both the encoding and decoding side. WMA 10 Pro introduced an LBR mode (similar to HEv1 and HEv2 AAC) which kicked in at bitrates <= 96 kbps. WMA 10 Pro LBR bitstreams can only be decoded at full fidelity by WMA 10 Pro decoders, while WMA 9 Pro decoders can only decode them at half fidelity (again, similar to the LC vs. HE AAC relationship).
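To summarize the versioning mess above: the registered WAVEFORMATEX wFormatTag values for the WMA family map onto fewer codecs than the marketing names suggest, precisely because 9, 9.1 and 9.2 never changed the bitstream. A quick sketch (tag values to the best of my recollection; treat them as an assumption, not gospel):

```python
# Registered wFormatTag values for the WMA family (assumed from memory).
# Note that WMA Standard 9/9.1/9.2 share a single tag because the
# bitstream syntax never changed - only the encoder improved.
WMA_FORMAT_TAGS = {
    0x0160: "WMA v1 (WMAUDIO1)",
    0x0161: "WMA Standard v2/9/9.1/9.2 (WMAUDIO2)",
    0x0162: "WMA Professional 9/10 (WMAUDIO3)",
    0x0163: "WMA Lossless",
}

def describe_wma(format_tag):
    """Return a human-readable name for a WMA format tag."""
    return WMA_FORMAT_TAGS.get(format_tag, "not a WMA format tag")
```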

It's been a while since I've been directly involved in any Windows Media codec development, but I believe most WM codec development stopped around the time of Windows 7. I'm sure there have been some bug fixes and perf improvements since 2008, so it's fair to assume that wmadmoe.dll in Windows 8.1 is "better" than the one in Windows Vista, but I doubt it's any more efficient in its compression or quality. I could be wrong (and maybe somebody more recently involved with MS codec development can correct me).

WMA Pro was a superior codec to WMA Standard - with support for multi-channel, 24-bit audio, and sample rates up to 192 kHz - but it never quite caught on, which is a shame since it's a pretty decent codec compared to AAC. WMA Lossless was never efficient enough to compete with FLAC and others, but it did have the advantage of enjoying native metadata support in Windows.




Reply #43
Very informative.

I assume that the last "out-of-band" update for the Windows Media Format SDK runtime is still capable of decoding all current versions of the codec. If not, surely we would have heard of the incompatibility?

In other words, "Windows Media Player 11 for Windows XP" should still be able to decode all versions of the codec.


Reply #44
Pardon my ignorance and please, this is a real question: why do people use WMA? I've heard of many people using it and I've never used it once.


At a guess, it's because WMP is the default audio player for many versions of Windows; if not, it's the first thing people see to download. Remember many people are not technical and don't care about their audio; they just want to listen to it on their PC. On Windows, they put a CD in, Windows rips it, so it's WMA.

You get WMA (lossy and lossless) as part of Windows standard issue, so if your DAP is compatible, one might ask why bother with anything else? Until you buy an iPod or something else that doesn't handle WMA and then, if you're not a technical person, you convert the WMP default (lossy) WMA to MP3 or AAC and you've buggered your music library a bit. If you have the curiosity to do a little reading, you'd probably consider another format, but not everybody does.

I still have quite a few albums in WMA lossless, but I use FLAC these days.

My Samsung DAP plays WMA lossy and lossless, as do Sony players (I'm pretty sure I've seen it claimed by them) and a cheapo Chinese player I tried (can't remember the name). I don't know about other brands, like the Sansa, which seems to be a popular choice on HA.


Reply #45
Very informative. I assume that the last "out-of-band" update for the Windows Media Format SDK runtime is still capable of decoding all current versions of the codec. If not, surely we would have heard of the incompatibility? In other words, "Windows Media Player 11 for Windows XP" should still be able to decode all versions of the codec.


Correct - WM Format 11 was distributed with WMP11, which was the last standalone (out-of-band) version of WMP or WM Format made available for XP and Server 2003. They were almost identical to the player and format runtime that shipped in Vista at the time (in fact, I think they were compiled from the same source). This is the version of the WM format runtime that introduced WMA Standard 9.2 and WMA Pro 10 LBR.

None of the decoder specs have changed since then, so you are correct - any XP or WS2K3 box with WMF11 installed can decode (in full fidelity) every WMV or WMA file created by later versions of Windows.

(If I recall correctly, WMF/WMP11 were not explicitly supported on Server 2003, but they could be installed anyway if the compatibility option on the installer was set to "Windows XP SP2".)

I'd actually be curious to see how WMA 10 Pro from Windows 8.1 stacks up against some modern audio codecs. My guess is that it wouldn't do too badly - maybe not quite on par with the best AAC encoders, but probably better than FAAC and some others. BTW, one correction to my previous post: I had stated WMA 10 Pro supports up to 192 kHz - that's incorrect; it "only" goes up to 96 kHz.

As far as lossy codecs go, there still aren't many other codecs out there which support 24-bit multichannel hi-res audio like WMA Pro does, so I'd say WMA Pro might still have some legitimate advantage in that category.

Finally, for those who are still hanging on to WMA Lossless over FLAC due to its native Windows support, here's some good news from Redmond: http://www.windowscentral.com/windows-10-s...ac-audio-format. Looks like Windows 10 will have native FLAC support, which hopefully includes metadata management too.


Reply #46
As far as lossy codecs go, there still aren't many other codecs out there which support 24-bit multichannel hi-res audio like WMA Pro does, so I'd say WMA Pro might still have some legitimate advantage in that category.


Lossy formats don't really have an associated number of bits per sample. Instead they just decode to some number of bits per sample, which can be quite high. For instance, 24-bit output was pretty common for MP3, Vorbis, AAC and... WMA Standard. No idea why MS decided to start marketing that feature with WMA Pro.
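To illustrate that point: the output bit depth of a lossy decoder is just a quantization choice made when converting the decoder's internal (typically floating-point) samples to integer PCM. A toy sketch, with a hypothetical `quantize` helper - the same float samples can come out as 16-bit or 24-bit PCM:

```python
def quantize(samples, bits):
    """Quantize float samples in [-1.0, 1.0] to signed integers of the
    given bit depth -- the step a decoder performs at output time."""
    full_scale = 2 ** (bits - 1) - 1
    return [max(-full_scale - 1, min(full_scale, round(s * full_scale)))
            for s in samples]

decoded = [0.5, -0.25, 1.0]   # what the codec actually reconstructs
pcm16 = quantize(decoded, 16)  # 16-bit output
pcm24 = quantize(decoded, 24)  # 24-bit output, same underlying signal
```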


Reply #47
Lossy formats don't really have an associated number of bits per sample. Instead they just decode to some number of bits per sample, which can be quite high. For instance, 24-bit output was pretty common for MP3, Vorbis, AAC and... WMA Standard. No idea why MS decided to start marketing that feature with WMA Pro.
 

That's a fair point. I'm guessing the WMA Standard encoder either required 16-bit PCM input or intentionally truncated any bits below the 16th, so when WMA Pro was being developed they decided to get rid of that limitation and capitalize on it as a marketing advantage. I'm only speculating; unfortunately I wasn't on that MSFT team at the time of WMA 9 Pro development, so I can't explain it first-hand.



Reply #48
Just to add to the discussion (I hope!)...

I think the death knell for WMA came when MSN Music and the Zune store both started selling files in MP3 format instead of WMA.

WMA may be a very good codec, but I don't trust that anything will be able to play it 10 or 20 years from now.

When a lot of those "PlaysForSure" services shut down in the late 2000s, such as BuyMusic and Walmart Music, people got screwed out of their legally purchased music when the PlaysForSure DRM scheme stopped working after the servers were taken down.

I think that whole experience kind of left a sour taste in people's mouths when it came to WMA.

As far as I know, pretty much every DAC on a chip can play WMA and has been able to for quite a long time.  I know I used to own a 20 GB iPod (2nd gen) and it had the hardware needed to play WMA files.  It was just the iPod OS that didn't support it.

I believe WMA reaches transparency at lower bitrates than MP3 does. But with AAC, MP3 and Ogg Vorbis being the three most commercially used lossy codecs for consumer music sales and streaming, I don't see any improvements being made to WMA any more.


Reply #49
I tested WMA on one particular 80s mix CD and got terrible results at 128 kbps, but I just discovered that Skyrim and Fallout 4 both encode their sounds as 48 kbps xWMA (AFAIK a simplified version of WMA customized for video game use), and I never found their sound to be bad quality. (Granted, Fallout 4 is a retro-future setting and mostly plays songs that were first recorded on shellac or early magnetic tape, but you can turn the radio off, and the ambient in-game music and game sound still sound hi-fi despite the low bitrate.)

So it depends. A 48 kbps MP3 will be almost unlistenable; 48 kbps WMA is OK for game use, where audiophile listening is not the priority. I don't believe it reaches transparency before MP3, as 128 kbps LAME was perfectly good on that particular CD. But I'd say under 64 kbps WMA is indeed a lot better. Plenty of people would swear that anything below 320 kbps sucks while they've been listening to 48 kbps xWMA game music without even knowing it - and I bet plenty of them would suddenly find the game music "unlistenable" if you told them the bitrate.
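As an aside, if anyone wants to poke at those game files: as far as I know, xWMA is just WMA payload wrapped in a RIFF container whose form type is 'XWMA', so a quick sniff test could look like this (the container layout is an assumption on my part, so take it with a grain of salt):

```python
def looks_like_xwma(header_bytes):
    """Heuristic check for an xWMA file: a RIFF container whose form
    type field (bytes 8..12) is b'XWMA'. Assumed layout, not verified
    against a spec."""
    return (len(header_bytes) >= 12
            and header_bytes[0:4] == b"RIFF"
            and header_bytes[8:12] == b"XWMA")

# Hypothetical usage:
# with open("some_game_sound.xwm", "rb") as f:
#     print(looks_like_xwma(f.read(12)))
```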

By the way, guys, does WMA 9 Pro use Parametric Stereo? I saw it referred to as using PS here https://www.hydrogenaud.io/forums/index.php?showtopic=58932 but isn't PS an AAC-only technology? It also wouldn't make sense to use PS at 64 kbps, as AAC uses it only at 32 kbps and below. So does it really use PS, or only SBR?