04blackmaxx
Well-Known Member
I think it's a way for electronics manufacturers to reduce returns; a bunch of BS.
> People should learn not to comment on topics where they have no clue. Everything needs some kind of break-in, more or less; CD players are no exception.

Rather than dogmatically and somewhat harshly (e.g., "have no clue") insisting that this is unquestionably true, it would be reasonable to acknowledge that this is a controversial issue with possible alternative explanations.
Why sound "better" versus "different"?
That would imply everything was out of tolerance when new, then magically came into tolerance after burning in.
> You would not have asked if you knew a bit about how things work. First, there will be a lot of static loads in every new component. And second, most components perform best after they've reached a certain temperature, not only tubes.
> That's one of the reasons to give any system something like an hour of warm-up. The other is the human ear: the same way our eyes need time to adjust to darkness, our ears (or, more exactly, the thing we have between our ears) need some time to clear the stress out of our short-term memory before we can relax, open up, and start listening to the music.

For one thing, you know nothing about me, so don't assume I do not understand the concept. I guess you cannot open your mind to what I said. Try to put some thought into what I wrote, as it was in the spirit of the original post. The original post talks about electronics sounding better, and I said: why not just different? Why not in some way worse? So all the circuits get magically better? Maybe a resistor or two got worse after some heat cycles? Different, yes; everything magically better is ridiculous.
Soak testing, however, is very valid for weeding out early-onset failures and the like.
> A lot of folks don't believe in burn in or break in, so I'll just say it this way: my Rega Apollo CD player sounded much better after 100 hrs playing time, and now my Rega Saturn is following a similar pattern, improving greatly after about 65 hrs.

How can you tell it sounds better? You would need to compare a brand new player to one that is "broken in". I believe that most of break-in is just getting used to the new sound and equipment.
> "Burn in" in CD players, amplifiers, preamplifiers is utter BS IMO.

Not just BS, but there are no facts anywhere to prove otherwise. There is only personal testimony, and that ain't science.
> And second, most components perform best after they've reached a certain temperature, not only tubes.
> That's one of the reasons to give any system something like an hour of warm-up. The other is the human ear: the same way our eyes need time to adjust to darkness, our ears (or, more exactly, the thing we have between our ears) need some time to clear the stress out of our short-term memory before we can relax, open up, and start listening to the music.
Components warming up and reaching their design parameters and performance at a given operating temperature, and returning to their "cold" performance levels after cooling down, is well-known in electronics.
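The warm-up effect is quantifiable: every resistor, for instance, has a specified temperature coefficient, so its value shifts slightly as it heats up and returns when it cools. A minimal sketch of that calculation (the part values and temperature rise below are typical illustrative numbers, not measurements from any equipment in this thread):

```python
# Illustrative only: how a resistor's value drifts as it warms up.
# The temperature coefficient (tempco) is specified in ppm/°C.

def resistance_at_temp(r_nominal_ohms, tempco_ppm_per_c, delta_t_c):
    """Resistance after a temperature rise of delta_t_c degrees C."""
    return r_nominal_ohms * (1 + tempco_ppm_per_c * 1e-6 * delta_t_c)

# A 10 kΩ resistor with a ±100 ppm/°C tempco, warming 30 °C above ambient:
r_warm = resistance_at_temp(10_000, 100, 30)
print(round(r_warm, 1))  # 10030.0 -> a 0.3% shift: small, repeatable, and
                         # fully reversed once the part cools down again
```

The point of the sketch is the last comment: warm-up drift is a small, well-characterized, *reversible* change, which is exactly why it is a different claim from permanent "burn-in".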
That's very different from the usual descriptions of "burn-in", in which a component supposedly changes permanently in some unspecified fashion after running for a while -- particularly when it's speaker wires and interconnects. That is not a known effect in electronics, aside from long un-used electrolytic capacitors that "re-form" (increase in capacity from low capacitance to their nominal value) as they're used.
> Oh come on Voorhis, aren't you able to see that this is just about that: the "warm up" procedure that comes on top of the "burn in" process. Did you read my posting with the intention to understand what I said, or just to find something to pick on?

This thread is about hypothetical and controversial burn-in, not well-known and well-established warm-up. Making the distinction clear was, and is, warranted.
> That is not a known effect in electronics, aside from long un-used electrolytic capacitors that "re-form" (increase in capacity from low capacitance to their nominal value) as they're used.

At least not in the generic electronics world.
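For context on the re-forming mechanism: the usual bench procedure is to bring an old electrolytic up to its rated voltage through a current-limiting series resistor, and watch the voltage across the capacitor climb as its leakage current falls while the oxide layer rebuilds. A toy model of that procedure (every component value and the exponential leakage-decay law are invented for illustration; real re-forming behavior is messier):

```python
# Toy model of electrolytic capacitor re-forming (illustrative numbers only).
# Assumption: leakage current decays exponentially as the oxide rebuilds.

import math

V_SUPPLY = 50.0      # volts, applied through the series resistor
R_SERIES = 10_000.0  # ohms, limits the current into the capacitor
I_LEAK_0 = 0.004     # amps, initial leakage of the "tired" capacitor
TAU_MIN  = 20.0      # minutes, assumed leakage decay time constant

def cap_voltage(minutes):
    """Voltage across the capacitor: supply minus the drop over R_SERIES."""
    i_leak = I_LEAK_0 * math.exp(-minutes / TAU_MIN)
    return V_SUPPLY - i_leak * R_SERIES

print(round(cap_voltage(0), 1))    # 10.0 -> most of the supply drops on R
print(round(cap_voltage(120), 1))  # 49.9 -> leakage gone, cap holds voltage
```

Note what the model describes: a one-time recovery of a degraded part back to its nominal spec, not ordinary equipment "getting better" with play time, which is why it is the lone exception mentioned above.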
> Some of the flat-earthers here might want to try arguing with Nelson Pass, who will tell you that his Class A designs sound better (though maybe not by a lot) after being left on for an hour or so and reaching operating temperature.
> It's a similar thing with break-in. I can tell you that from experience with Morrow cables and e-stat speakers, for example (where an e-stat diaphragm is basically a capacitor), among many other components over the last few decades.
> There was a time when we believed the Sun revolved around the Earth, and witches were burned at the stake. Thankfully some of us have moved on from there.

We "moved on from there" because we had scientific evidence to do so. I assume you have the same for "universal break-in theory"?