Recent Posts
Scientific Discussion / Re: Help me understand why sound is one dimensional
Last post by polemon -
The initial conversation started with my friend pointing out that "it's interesting how we are so much better at frequency separation using our ears than through vision. Even though sound is one-dimensional, we are still better at separating two different sounds (like two different notes or instruments) than we are at separating two different frequencies (or wavelengths; I don't know if the terms are comparable and can be used interchangeably) through vision, as those merge to form a separate color." My position was that we are indeed good at separating wavelengths into individual colors, but he disagreed. But that's for a different topic, I suppose. Also, this was after a few beers, so I might be quoting it incorrectly. Sounds like a fun conversation to have at a music festival, right?

So to sum up how I understand it... Sound is considered one-dimensional since, at any given point in time, it can only have one value. Sorry if this became an ELI5-type situation, but I am glad to see that this has started a discussion among others.
It is considered one-dimensional because it is a one-dimensional function: the variable determining the value (amplitude or level) is indexed by a single scalar, in our case time. And as we all know, time is one-dimensional, in our perception of the world, anyhow.
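To make that concrete, here is a minimal sketch in Python of what "a one-dimensional function of time" means for a sampled signal. The sample rate, function names, and tone parameters are my own illustrative choices, not anything from the thread:

```python
import math

# A sampled sound is a one-dimensional function: one scalar index (time)
# maps to exactly one scalar value (amplitude).
SAMPLE_RATE = 8000  # samples per second (assumed for this sketch)

def sample_tone(freq_hz, duration_s):
    """Return a list of amplitudes, one per time index."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

tone = sample_tone(440.0, 0.01)
# Each sample is addressed by a single number: the time index.
print(len(tone), tone[0])
```

The point is only that one scalar (the index) selects one scalar (the amplitude); nothing about the signal's content changes that.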

What your friend says is true, however, when it comes to frequency separation between vision and hearing. But we have to consider the spectrum: frequency differences should be judged as a fraction of our whole aural or visual range. Also, human hearing and vision aren't linear. A change in pitch from 50 Hz to 60 Hz is quite noticeable, while a change from 3000 Hz to 3010 Hz isn't.

It's also important to note that it's quite difficult to discern mixed signals aurally. Add 1 kHz, 1.1 kHz, and 1.2 kHz, and people will have a hard time separating those three frequencies out of a single chunk of sound. In vision, however, our brain has only three frequency bands to work with, and mixes them rather nicely into a color gamut. So much so that it's easier for us to describe a color by its hue, brightness, and saturation than by its red, green, and blue values.

Our vision isn't linear either. Our perception of blue is much weaker than that of green and red, and resolution doesn't line up nicely either: most of our sharpness is in the green. The spectral width of, and the distance between, the cone responses aren't equal either: the frequency responses of the blue and green cone cells lie much closer to each other than those of the green and red cone cells. And to make matters worse, our perception changes yet again in low-light conditions, because rod cells respond more strongly to blue-ish light than to light further down the spectrum. In low light we see "better" with green-blue light, while in bright light we see better with green/red light.
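The mixing step above can be sketched in a few lines of Python. The sample rate and frequencies are just the values mentioned in the post; the names are mine. The key observation is that after summation only one pressure value per instant survives:

```python
import math

SAMPLE_RATE = 8000  # assumed sample rate for this sketch

def mix(freqs_hz, duration_s):
    """Sum of equal-amplitude sines: once added, the components are no
    longer separate values; only one summed value per instant remains."""
    n = int(SAMPLE_RATE * duration_s)
    return [sum(math.sin(2 * math.pi * f * t / SAMPLE_RATE)
                for f in freqs_hz)
            for t in range(n)]

chunk = mix([1000.0, 1100.0, 1200.0], 0.05)
# The ear receives only this single summed waveform; recovering the three
# components requires a frequency analysis (e.g. a Fourier transform).
print(len(chunk))
```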
Scientific Discussion / Re: Help me understand why sound is one dimensional
Last post by sizetwo -
Thanks for all the feedback. I really appreciate people taking the time to try to make sense of this. Unfortunately, I don't have the necessary mathematical background to understand a lot of the math equations, even though I have tried my best in this case. Also, it's good to see that it's not just me who finds this confusing. Thanks for the lengthy explanation, polemon. As I clearly am not very math-inclined, I think both your explanation in plain English and Rotareneg's (and Rumbah's) make sense to me here.

Support - (fb2k) / Re: DLNA
Last post by abax2000 -
Currently, only foo_out_upnp is installed (with which I had the short lucky spell).
Still no honey.

Probably success or failure depends on the devices involved (and how every brand implements DLNA).
Any Samsung experience around?
Scientific Discussion / Re: Help me understand why sound is one dimensional
Last post by Rotareneg -
Or put very simply:

A linear array of data has one dimension, as it requires only a single number to specify which piece of data is being considered. The data contained within the array is irrelevant to the dimensionality of the array itself: it could be a sequence of air pressure measurements (aka sound), 3D models for a game, video files, forum posts, or a mix of any and all of that. All that matters is that each piece of data is referenced by a single number.
Scientific Discussion / Re: Help me understand why sound is one dimensional
Last post by polemon -
If that is your idea of an image, then I would say that a point sound source - or a "point" as a model for an eardrum - would be 0D rather than 1D ...
But rather than claiming "0D", I would say that your model of an "image" might be wrong, or at least not in line with your model of sound.
Each point in the image carries a compound of (time-) frequencies. So: if you insist on "time" in a sound waveform, why don't you insist on time in the light waveform?

There are at least two answers to that latter question. 1: in how the human eye projects colour down to a triplet; but that is how humans work, not what is emitted. 2: in that you think of sound as changing over time; "music", not just a "chord". But then the analogy should be a motion picture rather than a still image.
I'm not sure this is helping, but that aside: "0D" is a somewhat conflated term in mathematics. If something has no dimension, it is simply non-dimensional, or scalar. A point has no dimensional attributes. A point might be addressed by coordinates and return a scalar, or a higher-dimensional value.

For instance, you can map one 2D space into another; a common example is the conversion between polar and Cartesian coordinates. Cartesian coordinates are points defined by x and y, while polar coordinates are defined by r and θ (where θ is an angle).

To convert from polar to Cartesian, you'd do: f(r, θ) = {r · cos(θ), r · sin(θ)} → {x, y}
To convert from Cartesian to polar, you'd do: g(x, y) = {√(x² + y²), atan2(y, x)} → {r, θ}
I.e. both functions take two values and return two values: one 2D point in, one 2D value out.
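The two conversions above translate directly into Python; the function names are mine. `atan2` is used rather than `atan(y/x)` so the correct quadrant comes out for any x and y:

```python
import math

def polar_to_cartesian(r, theta):
    """f(r, θ) = (r·cos θ, r·sin θ): a 2D point in, a 2D point out."""
    return (r * math.cos(theta), r * math.sin(theta))

def cartesian_to_polar(x, y):
    """g(x, y) = (√(x² + y²), atan2(y, x)): the inverse mapping."""
    return (math.hypot(x, y), math.atan2(y, x))

# Round trip: polar -> Cartesian -> polar recovers the original point
# (up to floating-point error).
x, y = polar_to_cartesian(2.0, math.pi / 2)
r, theta = cartesian_to_polar(x, y)
```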

In terms of an RGB color bitmap image, you could say that each (x, y) pixel coordinate returns three values: r, g, b. Of course we can map such a triple onto a linear scale (fixed color spaces are finite), but in principle the color plane is continuous and cannot be linearized the way a fixed gamut like 24-bit color does it. So in these terms, the pixel coordinates of a color picture return a three-dimensional value. In the case of a grayscale image, where each pixel is just one number, each pixel coordinate returns a scalar value.
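A small Python sketch of that distinction, using plain dicts as stand-in "images" (the pixel values are arbitrary examples): the same kind of 2D coordinate returns a 3-component value in one case and a scalar in the other:

```python
# Pixel coordinates (x, y) -> a 3-component value for RGB,
# but a single scalar for grayscale.
rgb_image = {
    (0, 0): (255, 0, 0),   # a red pixel
    (1, 0): (0, 128, 0),   # a green pixel
}
gray_image = {
    (0, 0): 200,           # one scalar per pixel
    (1, 0): 17,
}

r, g, b = rgb_image[(0, 0)]   # a 2D coordinate returns a 3D value
level = gray_image[(1, 0)]    # the same coordinate type returns a scalar
```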

Each higher-order value can be composed of an arbitrary number of dimensions, including scalar components. In the case of an RGB color image, each two-dimensional pixel coordinate, of which each component is a scalar, maps to a three-dimensional value whose components are scalars as well.

A point has no length, area, or volume; a single point defines only itself. A line is defined by at least two points in n-dimensional space; it may have a length, but no area or volume. A plane is defined by at least three points that are not collinear. Planes may have an area, but no volume. And finally, a space needs at least four points (not all in one plane), etc.
Higher-order objects also exist, such as hypercubes in 4D space. Anything of a higher order than a point is a set of points.

I believe this is kind of where the OP's confusion comes from. Plotting a waveform is essentially a function that maps all values of a 1-dimensional discrete function onto a 2-dimensional discrete plane, where each valid point of the mapped function is assigned one color, and each invalid point no color (background).
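That mapping can be sketched in Python: take a 1D list of samples and mark, for each sample, one cell of a 2D grid. The grid size, function name, and amplitude-to-row mapping are all my own illustrative choices:

```python
import math

# Sketch of "plotting": map a 1D discrete function onto a 2D grid,
# marking the cell nearest each sample. Everything else is background.
WIDTH, HEIGHT = 16, 9

def plot(samples):
    """Return the set of (x, y) grid cells covered by the waveform."""
    cells = set()
    for x, v in enumerate(samples[:WIDTH]):
        # map an amplitude in [-1, 1] to a row index in [0, HEIGHT - 1]
        y = round((v + 1.0) / 2.0 * (HEIGHT - 1))
        cells.add((x, y))
    return cells

wave = [math.sin(2 * math.pi * t / 16) for t in range(16)]
image = plot(wave)
# Each marked cell would get one color; every other cell stays background.
```

The 1D function is fully described by its sample list; the 2D "image" is just one possible rendering of it.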

Having said that, the statement "Sound is one-dimensional" is incredibly ambiguous. In terms of signal definitions it is, but in terms of propagation in space it isn't. So, yeah...
General Audio / Beats Per Minute
Last post by triumphtrident -
How can I add a heading for BPM? The data is readily available from Mixmeister or other sites. I just don't see a method for doing it.