==Usage justification==
μ-law encoding is used because [[speech]] has a wide [[dynamic range]]. In analog signal transmission, in the presence of relatively constant background noise, the finer detail is lost. Given that the precision of the detail is compromised anyway, and assuming that the signal is to be perceived as audio by a human, one can take advantage of the fact that the perceived [[acoustic intensity level]] or [[loudness]] is logarithmic by compressing the signal using a logarithmic-response operational amplifier ([[Weber–Fechner law]]). In telecommunications circuits, most of the noise is injected on the lines, so after the compressor the intended signal is perceived as significantly louder than the static, compared with an uncompressed source. This became a common solution, and so, before digital transmission was widespread, the μ-law specification was developed to define an interoperable standard.

This pre-existing algorithm had the effect of significantly lowering the number of bits required to encode a recognizable human voice in digital systems. A sample could be effectively encoded using μ-law in as few as 8 bits, which conveniently matched the symbol size of most common computers.

μ-law encoding effectively reduced the dynamic range of the signal, thereby increasing the [[Channel coding|coding]] efficiency while biasing the signal in a way that results in a signal-to-[[distortion]] ratio greater than that obtained by linear encoding for a given number of bits.

[[File:Ulaw.JPG|thumb|400px|right|μ-law decoding as generated with the Sun Microsystems C-language routine g711.c commonly available on the Internet]]
The μ-law algorithm is also used in the [[Au file format|.au format]], which dates back at least to the [[SPARCstation 1]] by Sun Microsystems as the native method used by the /dev/audio interface, widely used as a de facto standard for sound on Unix systems. The .au format is also used in various common audio [[API]]s such as the classes in the sun.audio [[Java package]] in [[Java platform|Java]] 1.1 and in some [[C Sharp (programming language)|C#]] methods.

This plot illustrates how μ-law concentrates sampling in the smaller (softer) values. The horizontal axis represents the byte values 0–255 and the vertical axis is the 16-bit linear decoded value of μ-law encoding.
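The shape of that curve can be reproduced with a minimal C sketch of μ-law expansion based on the continuous companding formula with μ = 255. This is an illustrative approximation only: it is not the table-driven, segment-based g711.c routine mentioned in the figure caption, and the plain sign-magnitude byte layout used here is a simplifying assumption rather than the actual G.711 code layout.

<syntaxhighlight lang="c">
/* Illustrative approximation of mu-law expansion (decoding) using the
 * continuous mu-law formula with mu = 255.  Not the segment/table-based
 * g711.c implementation; the plain sign-magnitude byte layout below is
 * a simplifying assumption made for this sketch. */
#include <math.h>
#include <stdio.h>
#include <stdint.h>

/* Map an 8-bit mu-law-style code (0..255) to a 16-bit linear sample. */
static int16_t mulaw_expand(uint8_t code)
{
    const double mu = 255.0;
    int sign = (code & 0x80) ? -1 : 1;          /* top bit taken as sign     */
    double y = (code & 0x7F) / 127.0;           /* normalized magnitude 0..1 */
    double x = (pow(1.0 + mu, y) - 1.0) / mu;   /* inverse of log compression */
    return (int16_t)(sign * x * 32767.0);
}

int main(void)
{
    /* Print a few points of the decoded curve: codes near zero map to
     * closely spaced linear values, larger codes to much wider steps. */
    for (int c = 0; c < 256; c += 32)
        printf("%3d -> %6d\n", c, mulaw_expand((uint8_t)c));
    return 0;
}
</syntaxhighlight>

Running the sketch shows the same behavior as the plot: small code values stay clustered near zero in the 16-bit output, while the spacing between successive decoded values grows roughly exponentially toward the loud end of the range.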