atomic_fungus (atomic_fungus) wrote,

#304: Stupid, stupid UPS! (and random bits)

The computer was not delivered on Thursday.

Right now it's 4 AM and it still shows "out for delivery". I'm not sure what that means.

Argh etc.

* * *

While watching an episode of Joan of Arcadia I noticed that Mageina Tovah has got HUGE EYES. I mean HUGE. It's like she's almost a real-live bishoujo anime character--especially the way she's so skinny and tall.

* * *

Big surprise #4,918: The Kyoto Accord is causing economic malaise in Europe. China and India, which are not even remotely beholden to the Accord, are poised to overtake Europe in terms of GDP.

* * *

Interesting point from the linked article: the atmospheric temperature forcing due to CO2 diminishes logarithmically with concentration. In other words, each additional increment of CO2 produces less warming than the one before it--doubling the concentration does not come anywhere near doubling the warming.
If we consider the warming effect of the pre-Industrial Revolution atmospheric carbon dioxide (about 280 parts per million by volume or ppmv) as 1, then the first half of that heating was delivered by about 20ppmv (0.002% of atmosphere) while the second half required an additional 260ppmv (0.026%). To double the pre-Industrial Revolution warming from CO2 alone would require about 90,000ppmv (9%) but we'd never see it - CO2 becomes toxic at around 6,000ppmv (0.6%, although humans have absolutely no prospect of achieving such concentrations).
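To illustrate the diminishing-returns behavior (this is my own sketch, not from the quoted article), the standard simplified expression for CO2 radiative forcing, ΔF = 5.35·ln(C/C0) W/m², makes the logarithmic shape obvious: every doubling adds the same fixed increment, so each added molecule matters less than the last.

```python
import math

def co2_forcing(c_ppmv, c0_ppmv=280.0):
    """Simplified radiative forcing for CO2 (Myhre et al. 1998):
    delta-F = 5.35 * ln(C / C0), in watts per square meter,
    relative to a pre-industrial baseline of 280 ppmv."""
    return 5.35 * math.log(c_ppmv / c0_ppmv)

# Each doubling adds the same ~3.7 W/m^2; the forcing grows
# logarithmically, not linearly, with concentration.
for c in (280, 560, 1120, 2240):
    print(f"{c:5d} ppmv -> {co2_forcing(c):5.2f} W/m^2")
```

Each line of output steps up by the same ~3.7 W/m², which is the whole point: adding CO2 to an atmosphere that already has plenty buys you less and less warming per unit added.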

So: even if we did double the atmospheric concentration of CO2 we're still not going to kill the planet.

BTW that link is to a seriously technical discussion of why anthropogenic CO2 is nothing to worry about.

* * *


Kind of makes me wonder. Remember entry #170 on the ozone hole, and the image I posted with it?

I was wondering how NASA could justify posting such an image when observed ozone hole data disagreed with it; now we know, I guess.

Just remember: if you destroy or alter inconvenient data, you are not doing science.

* * *

Concentrated stupidity. To say that there is no scientific evidence--and I mean real science, not "global warming" science--to support any of this would be understating the matter rather severely.

It's really the same kind of thing as with "Monster Cables". Monster Cables are hideously expensive speaker, interconnect, and power cord wires which are supposedly better than the cheap stuff. Somehow, using a $50 power cord will make your $300 receiver sound better.

If you run a scientific test of Monster Cables and compare them to cheap Radio Shack cables, using an oscilloscope, you might see some difference in performance at the high end of the cable's particular bandwidth.

But that is at a frequency measured in megahertz.

For analog audio, the critical frequency bandwidth is a smidge under 20 kilohertz--20 Hz under 20 kHz, in fact, since the human ear's low frequency response rolls off somewhere around 20 cycles per second. And in electronic terms 20 kHz is nothing. The bandwidth of 18 gauge lamp cord is in the tens of megahertz at least and--as we'll see in the next paragraph--a typical audio interconnect cable with RCA plugs has a bandwidth greater than 6 MHz.

For analog video (assuming NTSC video) the critical bandwidth is somewhere around 6 MHz. You can reliably transmit analog video using an audio interconnect cable, and on most TVs or monitors you won't be able to tell the difference without an oscilloscope. But better-quality audio-video interconnect cables include a 75-ohm coaxial cable for the video signal, which is the standard (and recommended) interconnect when dealing with NTSC video. It works fine, and even on high-end displays/TVs/monitors you won't see a difference.

For most people--even people with expensive plasma screen TVs--the cheap $10 set of AV interconnect cables will work just fine. For the others, there are better-quality interconnects which are still relatively inexpensive and probably won't cost more than $20 at most stores.

But if you want Monster Cables, expect to pay $50 for a basic set of AV interconnect cable. And they won't do any better than the $20 cables; in most cases they won't provide any improvement over the $10 set.

The issue gets even more ludicrous when you start talking about digital signals.

The advantage of using a digital signal is noise rejection: the receiver expects to see either a 1 or a 0, and these are defined as ranges rather than absolutes. In TTL digital circuits, for example, a "0" (or "low") is defined as any voltage between 0.0 V and 0.8 V, measured against the ground terminal. A "1" (or "high") is any voltage from 2.0 V up to 5 V. The advantage here is that the signal can be fairly noisy, yet the receiving device won't see any ambiguity: as long as the signal stays above 2 V it is considered a "high", and as long as it stays below 0.8 V it's considered a "low".
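Those TTL input ranges can be sketched as a little classifier (the function name and return convention are mine, purely for illustration):

```python
def ttl_level(volts):
    """Classify a voltage against standard TTL input thresholds:
    0.0-0.8 V reads as logic 0, 2.0-5.0 V reads as logic 1,
    and anything in between is undefined."""
    if 0.0 <= volts <= 0.8:
        return 0
    if 2.0 <= volts <= 5.0:
        return 1
    return None  # forbidden zone: the receiver's behavior is unpredictable

# A noisy signal still decodes cleanly as long as it stays in range:
print(ttl_level(0.3))   # 0  -- sagging, but still a clean "low"
print(ttl_level(3.1))   # 1  -- well short of 5 V, but still a clean "high"
print(ttl_level(1.4))   # None -- the ambiguous middle
```

Note that 0.3 V and 3.1 V are nowhere near ideal 0 V and 5 V rails, yet they decode without ambiguity--that's the noise rejection.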

But what, you ask, if it's in the middle? What if the line is around 1.4 V?

Most engineers won't build an interface without buffering it. A buffer serves a couple of purposes, but the most notable one is signal conditioning, particularly when it comes to interconnecting circuits. A good engineer will use a comparator--often built from an operational amplifier--as just such a buffer, forcing any voltage below 2 V to "low" (zero volts) and any voltage above 2 V to "high" (5 V).
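That conditioning step amounts to snapping every input back to a clean rail. A minimal sketch (the 2 V threshold matches the text; the function name is my own):

```python
def condition(volts, threshold=2.0, v_high=5.0):
    """Idealized comparator used as a buffer: anything below the
    threshold is forced to 0 V, anything at or above it to 5 V."""
    return v_high if volts >= threshold else 0.0

# A noisy, sagging signal comes out as crisp logic rails again.
noisy = [0.4, 1.1, 2.6, 4.8, 1.9, 3.3]
print([condition(v) for v in noisy])  # [0.0, 0.0, 5.0, 5.0, 0.0, 5.0]
```

Even the ambiguous 1.9 V sample gets decided one way or the other--a real comparator would add some hysteresis to keep a value hovering near the threshold from chattering, but the principle is the same.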

The important principle here is that, as long as the receiving circuit can tell the 1's from the 0's, it doesn't matter what the actual waveform looks like.

Over a run of three feet or one meter--a typical length for home theater interconnects--there will not be enough signal degradation in an inexpensive digital cable to make any audible or visual difference. Consider that CD-quality audio has a bit rate of 1411.2 kilobits per second; if your digital audio cable is dropping enough of those 1,411,200 bits per second to audibly degrade the signal, you have a bad interconnect cable!
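That 1,411,200 figure falls straight out of the CD format's parameters--sample rate times bit depth times channel count:

```python
# CD audio ("Red Book"): 44,100 samples/s, 16 bits/sample, 2 channels (stereo)
sample_rate_hz = 44_100
bits_per_sample = 16
channels = 2

bit_rate = sample_rate_hz * bits_per_sample * channels
print(bit_rate)  # 1411200 bits per second, i.e. 1411.2 kbit/s
```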

Digital video can require data transfer rates of up to 4.17 megabytes per second. (More or less, depending on compression.) If your digital video cable is introducing enough noise to make an obvious glitch, you have a bad cable.

The point is, spending three times as much for a "high performance" digital cable is wasting money. If the information carried by the digital signal (the 1's and 0's) is being reliably transported from transmitter to receiver, it doesn't matter how the actual waveform degrades.

So what about this "Frybaby" thing?

The idea that cables need "burn in" is patently ridiculous. There is no way that any effect of "chaotic manufacturing processes" could cause any audible difference in any signal.

Even better, this thing "conditions" digital cables, too.


I guess there is no shortage of stupid audiophiles out there--stupid audiophiles with lots and lots of money to burn.

* * *

Al Gore uses 221 megawatt-hours of electricity per year. He uses literal hundreds of megawatt-hours of energy--an average draw of about 25 kilowatts, around the clock. Damn. Maybe he needs "Monster Cables".
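The averaging arithmetic, for the record--annual energy spread over the hours in a year gives the continuous power draw:

```python
# 221 MWh per year, averaged over the hours in a year,
# gives the equivalent continuous power draw.
mwh_per_year = 221
hours_per_year = 365 * 24            # 8,760 h (ignoring leap years)
avg_kw = mwh_per_year * 1000 / hours_per_year
print(round(avg_kw, 1))  # ~25.2 kW average, around the clock
```

For scale, a typical household averages somewhere on the order of 1 kW.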
