2010-11-18, 08:29 AM
So the other day, I thought I'd take a foray into MythTV. I did a little reading and found that I could forget about getting any real use out of my ATI video card. Right-O, I buy an nVidia for just over $100. OK, now I'm sure my garden variety Azurewave (Twinhan) AD-SP200 DVB-S cards will be supported, after all, they're common as mud - WRONG! OK, how about my two USB DVB-S units - NOPE, in fact only two or three of all the USB DVB interfaces ever made are supported in Linux... By the time I realised I'd need to drop another couple of hundred bucks on a pair of Linux-supported DVB cards, I gave up.
I thought after all this time, the driver support in Linux would be pretty good. The truth is there is a mountain of hardware out there either not supported, or only partially supported.
Sure, I guess if I had started out with the intention of building a MythTV box before I had bought any hardware, I could have done my research and chosen hardware carefully. The real "Myth" is that you can use any old hardware to build a Linux box and use it for MythTV. As an example, none of the hardware video enhancements are available to me if I choose to use my recent ATI card, and it's not even clear if it will use all the GPU horsepower to off-load video rendering.
Every couple of years I have a crack at Linux, find it absolutely useless for anything more than web browsing, and uninstall. Sure, it can be useful as a back-end server, but as a desktop OS, it's still a waste of perfectly good PC hardware.