I have one and a half B&K ST-140 amps, and I'd like to convert them into a dual-mono amp similar in design (but not identical) to the older versions of the B&K EX-442. One of the amps is the regular stereo model, which I've already done some parts replacements and modifications to; it sounds good. The other was converted to a monoblock by B&K before I bought it, about 8 years ago according to the seller. To do the conversion, B&K simply removed one of the two amp boards, kept that board's two Mosfet output devices in place, and jumpered them over to the pair on the remaining stereo board.

Here's my question. I'm hoping I can keep the output devices in the monoblock paired as they are, and likewise keep the four in the stereo amp paired as they are: do the same conversion to the stereo amp (remove one board and jumper its orphaned output devices to the proper partners on the board that stays), then put both mono channels back into the stereo case with the two improved power supplies.

What I want to be sure of is that the output devices won't end up mismatched, and I'm not certain that's a safe assumption. I've put some nice parts into the stereo amp, and I don't want that effort wasted, or to end up with sound and reliability problems caused by mismatched output devices.

I've heard that most manufacturers just use devices from the same batch (presumably there's some way to tell this now?) and are satisfied with that. I've also read recommendations and descriptions of equipment where devices were matched to the millivolt level out of a big batch of candidates, and that this makes all the difference. That was in the context of input stages on preamps, though. How much precision is needed when matching output devices, specifically Mosfets, in power amp output stages? Thanks in advance for any advice.
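For what it's worth, my understanding of the millivolt-level matching I've read about is that each candidate device gets its Vgs measured at the same fixed drain current and temperature, and devices whose readings fall within some tolerance are grouped together. Here's a minimal sketch of just the sorting/binning step; the device labels, the readings, and the 10 mV tolerance are all made up for illustration, not measurements from my amps:

```python
# Group candidate Mosfets into matched sets by measured Vgs.
# The readings below are hypothetical; in practice each device would be
# measured at the same drain current and case temperature.
def match_devices(measurements, tolerance_v=0.010):
    """Greedily bin devices whose Vgs readings lie within tolerance_v."""
    # Sort by Vgs so close readings end up adjacent.
    ordered = sorted(measurements.items(), key=lambda kv: kv[1])
    groups = []
    current = []
    for label, vgs in ordered:
        # If this reading is too far from the first one in the bin,
        # close the bin and start a new one.
        if current and vgs - current[0][1] > tolerance_v:
            groups.append(current)
            current = []
        current.append((label, vgs))
    if current:
        groups.append(current)
    return groups

# Hypothetical Vgs readings (volts) from eight candidate devices:
readings = {
    "Q1": 4.012, "Q2": 4.031, "Q3": 4.015, "Q4": 4.055,
    "Q5": 4.017, "Q6": 4.049, "Q7": 4.102, "Q8": 4.020,
}
for group in match_devices(readings):
    print([label for label, _ in group])
```

With those made-up numbers, Q1/Q3/Q5/Q8 land in one matched set and Q6/Q4 in another, while the outliers end up alone. Whether a 10 mV window is tight enough for output devices (as opposed to preamp input pairs) is exactly the part I'm unsure about.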