Public Multiformat Listening Test @ 96 kbps (July 2014)
The purpose is to test AAC, Vorbis and Opus at 96 kbps against a classic MP3 128 kbps, and find out which codec produces the best quality.
The test is finished, results are available here.
Thank You To All Who Participated.
Which codecs and settings are tested?
AAC iTunes 11.2.2 with CoreAudioToolbox 188.8.131.52 via qaac 2.41
--cvbr 96 (equivalent to "VBR enabled" in iTunes)
Opus 1.1 with opus-tools-0.1.9-win32
Ogg Vorbis aoTuV Beta6.03
MP3 LAME 3.99.5 *bitrate is around 136 kbps.
AAC FAAC v1.28 (Mid-low Anchor)
AAC FAAC v1.28 (Low Anchor) *bitrate is around 52 kbps.
Is it normal that the bitrate is very high on some samples (even 176 kbps)?
Yes, and that is the beauty of VBR encoding: the encoder simply exceeds the nominal bitrate whenever necessary, using as many bits as it needs to encode a problematic sample.
Although that raises issues of fairness, it is the best way to compare modern codecs that shine most in VBR mode. Trying to force a VBR setting to match a desired bitrate, although fairer, is far from the usual practice of audio encoding, where it's more usual that a user just sticks to a quality setting, not caring much about a specific bitrate.
The quality settings for the VBR codecs were chosen because they average out to about 96 kbps over a number of encoded albums. It would be unfair to tie the hands of VBR codecs and punish them for being smart about where to spend what turns out to be the same number of bits over the long run.
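To illustrate how a VBR setting averages out over a collection, the long-run bitrate can be estimated from total file size and total duration. A minimal sketch; the file sizes and durations below are made up for illustration, not taken from the actual test:

```python
# Estimate the average bitrate (kbps) of a set of VBR-encoded tracks.
# The (size_bytes, duration_seconds) pairs are hypothetical examples.
tracks = [
    (2_880_000, 240.0),  # ~4-minute track
    (2_160_000, 180.0),  # ~3-minute track
    (3_960_000, 330.0),  # ~5.5-minute track
]

total_bits = sum(size * 8 for size, _ in tracks)
total_seconds = sum(dur for _, dur in tracks)
avg_kbps = total_bits / total_seconds / 1000

print(f"average bitrate: {avg_kbps:.1f} kbps")  # individual tracks vary, the average lands near the target
```

Individual samples may come out far above or below the target (even 176 kbps, as noted above); only the average over many tracks is expected to sit near 96 kbps.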
Who should take the test?
Anyone interested in lossy audio quality, as well as people with no particular interest who would simply like to help make this test better, is invited. You don't need excellent hearing, but some good gear is welcome. Headphones are a must-have.
Can I take the test even if I am not running Microsoft Windows?
Any person running an operating system with Java 6 or later can participate. Run java -version to confirm it reports java version "1.6.0_30" or later.
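If you would rather check the version programmatically, the reported string can be parsed and compared against the 1.6 minimum. A small sketch; the function name and sample version strings are illustrative assumptions (note that java -version prints to stderr on most JVMs, so capture that stream if you automate this):

```python
import re

def java_major_minor(version_line):
    """Extract (major, minor) from a line like: java version "1.6.0_30"."""
    m = re.search(r'"(\d+)\.(\d+)', version_line)
    if not m:
        raise ValueError("could not parse Java version: %r" % version_line)
    return int(m.group(1)), int(m.group(2))

# Old-style and modern version strings both compare correctly as tuples.
print(java_major_minor('java version "1.6.0_30"') >= (1, 6))  # True
print(java_major_minor('java version "11.0.2"') >= (1, 6))    # True
```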
How do I take the test?
We provide sound samples, along with software that enables you to perform blind tests at home.
Download one or more sample packages from below.
You don't need to test all these samples to participate. Even one single result is already very helpful. Of course, the more you test, the better for the final results' significance.
Extract the downloaded package (XX being the number of the test you want to take). You need a 7-Zip archiver.
You may need the Sun Java Runtime Environment to run ABC/HR. In case it is not already installed, download it from Java website.
- WINDOWS/MAC USERS: Double-click abchr.jar.
- *NIX USERS: Run java -jar abchr.jar from the shell.
Once ABC/HR is open, click "Open ABC/HR Config..." and load the supplied configuration file.
You are now ready to take the test.
Notice the six groups, each with two playback buttons and a slider (twelve buttons and six sliders in total). The six groups correspond to AAC, Opus, Ogg Vorbis, MP3, the mid-low anchor, and the low anchor. They are shuffled into a randomized order, so you don't know which group is which codec.
In each group, one of the two buttons plays the encoded sound and the other plays the reference (original) sound.
The recommended procedure is to first identify the low anchor among the twelve buttons. The low anchor should sound like the reference played back through a telephone: hollow, crackly, and overall bad. Give the low anchor the lowest grade by lowering the slider just above the button you identified.
Then continue by trying to identify the mid-low anchor. The mid-low anchor sounds similar to the reference but notably washed out, blurry, warbly, or different in its stereo image. Give it a grade in the middle of the scale based on its overall quality compared to the reference.
Things get harder from here. Try the other buttons and identify the remaining four non-reference samples. When you are not exactly sure which sample is the non-reference, it is always advisable to ABX it.
Beginners may want to start with an easy ABX test: choose the reference as Sample A and the low anchor as Sample B. Answer whether X is A or B, and press Next Trial. If you are correct, the results will show a p-value: the likelihood of getting that many correct answers by chance. The lower the p-value, the less likely it is that you passed the trials by pure luck; i.e., the more certain it is that you can distinguish A and B. Try to get the p-value as low as you can. Once it is low, you can be very sure you can tell the difference between A and B.
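The p-value an ABX run reports is, presumably, the one-sided binomial probability of getting at least that many trials right by guessing. A minimal sketch of the computation (the function name is ours, and we assume the tool uses this standard formula):

```python
from math import comb

def abx_p_value(correct, trials):
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers out of `trials` by pure guessing (p = 0.5 each)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 8 correct out of 8 trials: the chance probability is 1/256.
print(f"{abx_p_value(8, 8):.4f}")  # 0.0039
```

This is why a long run of correct answers matters: each additional correct trial roughly halves the probability that you were merely guessing.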
Then, switch to ABXing the encoded sample against the reference within each group. The p-value won't be shown there, but you can check the number of completed and correct trials. If you reliably get correct answers, you can gray out the reference (5.0) slider, which is much recommended.
Sometimes it can be very difficult to tell the difference between the original and the encoded sample. We do not recommend turning the volume up in that case, since it leads to auditory fatigue. Rather, we recommend:
- Try to gray out the sliders of the easier groups first.
- Keep the room silent, if possible. Disabling unnecessary appliances may help.
- Take breaks between sessions.
- If you cannot hear any difference, leave the slider at 5.0.
Remember that you should never risk lowering the reference slider and submitting it: your result will be discarded if you do so. Gray out the reference slider only after passing the ABX test.
After you finish the test, save the test results as resultXX, using the Encrypted Result Files (*.erf) format.
Mail the .erf file to:
All results and comments we receive will be published. If you want to be associated with your results, please enter your (nick)name in the "Show name in results file" field in ABC/HR (check the checkbox next to it to enable the field). Otherwise, your results will be anonymous.
Thank you very much!
I don't hear any sound / Java outputs exception
In some cases, ABC/HR might not default to the primary audio driver. Simply change the device under ABC/HR's Options, Settings..., Playback.
When will the test finish?
The test ends on August 30th 2014. No results will be accepted after that date, unless the test is extended. Possible extensions will be announced on this page.
The test finished on 23:59 UTC-12:00, September 12th 2014.
If you have any questions, feel free to contact us via e-mail: email@example.com.