I have to say, being new to Linux and trying to administer my own system, I have had loads of trouble. Initially with eth0, which seems to have been fixed in newer kernels, and now I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card; I wouldn't have imagined that it should be difficult. I will persist, and continue my studies in how this all works. I am sure it is only my ignorance holding me back.
I can see that a good option would be for the initial installation to have two GRUB entries for the first kernel.
- Normal boot.
- Reduced level boot.
It might make things easier the first time around.
For fresh installs, where the system is rebooted after install and before firstboot, such an option would be handy. At least GRUB does make it easier to change runlevels or add parameters to the boot command when problems arise.
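As a sketch of what that might look like, here is a pair of grub.conf entries of the usual Fedora shape (the kernel version and root device below are purely illustrative, not taken from any actual install):

```
title Fedora Core (normal boot)
	root (hd0,0)
	kernel /vmlinuz-2.6.x-FCn ro root=/dev/VolGroup00/LogVol00 rhgb quiet
	initrd /initrd-2.6.x-FCn.img
title Fedora Core (reduced boot, runlevel 3)
	root (hd0,0)
	kernel /vmlinuz-2.6.x-FCn ro root=/dev/VolGroup00/LogVol00 3
	initrd /initrd-2.6.x-FCn.img
```

The trailing "3" on the second kernel line is the same parameter you can append by hand from the GRUB menu (press "a" to edit the boot command) to come up in text mode when X is broken.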
With the GCC4 problem that really took its toll on X for the FC4 release, it would have spared users from turning away from Fedora; some stated they were detoured to other distros temporarily. A new set of FC4 ISOs with all improvements to date should be released once FC3 becomes unsupported, or sooner if the developers or community provided an updated set to lessen the frustrations new FC4 installers experienced. FC5 should be a much cleaner distro snapshot (pure guess), given the longer time period between the two snapshots and the things learned from the FC3-to-FC4 transition, which had the shorter time period to smooth out the installation ISOs.
Let's hope for a better FC5, and that we can help weed out the bugs through the upcoming beta cycle.
Jim
Hi all!
I intend to buy a RHEL 4 WS. For that reason I have some questions:
- Are Fedora packages compatible with RHEL?
- RHEL ships with yum! Are there repositories out there for it like LIVNA?
Thanks for any answer,
Claus
___________________________________________________________ Sent from Yahoo! Mail - now with 1GB of free storage - sign up here: http://mail.yahoo.de
Claus Reheis wrote:
Hi all!
I intend to buy a RHEL 4 WS.
Then you should be on the RHEL 4 list.
https://www.redhat.com/mailman/listinfo/nahant-list
For that reason I have some questions:
- Are Fedora packages compatible with RHEL?
There is no direct correlation between Fedora Core releases and RHEL releases. Some Fedora packages may work on RHEL, some may not.
- RHEL ships with yum! Are there repositories out there for it like LIVNA?
RHEL doesn't ship with yum. There are third-party package repositories out there for RHEL (Dag comes to mind).
http://dag.wieers.com/home-made/apt/FAQ.php#B
On Sun, 2005-10-30 at 22:23 -0500, William Hooper wrote:
Claus Reheis wrote:
Hi all!
I intend to buy a RHEL 4 WS.
Then you should be on the RHEL 4 list.
https://www.redhat.com/mailman/listinfo/nahant-list
For that reason I have some questions:
- Are Fedora packages compatible with RHEL?
There is no direct correlation between Fedora Core releases and RHEL releases. Some Fedora packages may work on RHEL, some may not.
- RHEL ships with yum! Are there repositories out there for it like LIVNA?
RHEL doesn't ship with yum. There are third-party package repositories out there for RHEL (Dag comes to mind).
http://dag.wieers.com/home-made/apt/FAQ.php#B
-- William Hooper
Thanks
On Sat, Oct 29, 2005 at 09:56:16PM -0700, David Abbott wrote:
I have to say, being new to Linux and trying to administer my own system, I have had loads of trouble.
That's pretty normal, when you're new to something, generally.
I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card; I wouldn't have imagined that it should be difficult.
This you can blame on nvidia, for refusing to release the specs of their hardware so that a proper driver could be integrated into the kernel. Instead, we're dependent upon them to try to support a binary-only kernel module on dozens of different versions of the Linux kernel (both official releases, and also vendor-modified kernels).
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
Another issue is that Fedora Core is intended to be a cutting edge development platform, and as such there will always be bugs. In some ways I think it is really not a very good choice for someone venturing into the world of Linux for the first time... particularly for people who are not already somewhat adept with computers. Except that you will learn a lot by figuring out how to fix all the bugs and make your system work the way you want it to. So in that regard, it's a good thing. ;-)
I will persist, and continue my studies in how this all works.
Good; if you do, you will be rewarded with a much greater understanding of how your machine works, and why it works that way. Enjoy!
On Sun, 2005-10-30 at 02:58 -0500, Derek Martin wrote:
On Sat, Oct 29, 2005 at 09:56:16PM -0700, David Abbott wrote:
I have to say, being new to Linux and trying to administer my own system, I have had loads of trouble.
That's pretty normal, when you're new to something, generally.
I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card; I wouldn't have imagined that it should be difficult.
This you can blame on nvidia, for refusing to release the specs of their hardware so that a proper driver could be integrated into the kernel. Instead, we're dependent upon them to try to support a binary-only kernel module on dozens of different versions of the Linux kernel (both official releases, and also vendor-modified kernels).
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
Another issue is that Fedora Core is intended to be a cutting edge development platform, and as such there will always be bugs. In some ways I think it is really not a very good choice for someone venturing into the world of Linux for the first time... particularly for people who are not already somewhat adept with computers. Except that you will learn a lot by figuring out how to fix all the bugs and make your system work the way you want it to. So in that regard, it's a good thing. ;-)
I will persist, and continue my studies in how this all works.
Good; if you do, you will be rewarded with a much greater understanding of how your machine works, and why it works that way. Enjoy!
-- Derek D. Martin http://www.pizzashack.org/ GPG Key ID: 0x81CFE75D
My first foray into Linux was stopping at Barnes and Noble on the way home and getting a Slackware 3.1 book. I got a basic install without any X server the first weekend (never did so many FTPs in one weekend, hehe). Two weekends later I finally had an X server going; granted, 99.99% of what I was doing was beyond me at that time, but the measly 0.01% helped my career take off.
As far as issues with Nvidia, you should be able to get a display going. You may have to manually edit your config (/etc/X11/xorg.conf) and lower the settings for color depth and resolution. I can't fathom that generic VGA is broken, but with Nvidia nothing surprises me. Since I stopped using their cards my productivity has increased dramatically.
If you post your xorg.conf I am sure you will get some suggestions to try. If you have an old non-Nvidia PCI card kicking around you can throw that in; it is likely to work. That is one of the best things about Linux: generally, once a piece of hardware works, it will be supported for a very long time.
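To make that suggestion concrete, the sort of thing to try in /etc/X11/xorg.conf looks roughly like this (the driver name, depth, and mode list are illustrative; substitute whatever your hardware actually supports):

```
Section "Device"
	Identifier "Videocard0"
	Driver     "vesa"        # generic fallback instead of "nv"/"nvidia"
EndSection

Section "Screen"
	Identifier   "Screen0"
	Device       "Videocard0"
	DefaultDepth 16          # lowered from 24
	SubSection "Display"
		Depth 16
		Modes "1024x768" "800x600"
	EndSubSection
EndSection
```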
Regards, Ted
On Sun, 30 Oct 2005, Derek Martin wrote:
On Sat, Oct 29, 2005 at 09:56:16PM -0700, David Abbott wrote:
I have to say, being new to Linux and trying to administer my own system, I have had loads of trouble.
That's pretty normal, when you're new to something, generally.
I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card; I wouldn't have imagined that it should be difficult.
This you can blame on nvidia, for refusing to release the specs of their hardware so that a proper driver could be integrated into the kernel. Instead, we're dependent upon them to try to support a binary-only kernel module on dozens of different versions of the Linux kernel (both official releases, and also vendor-modified kernels).
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
But (rightly or wrongly) video card makers consider these two pieces of information to be their competitive edge. So you are not likely to see them releasing either, at least for their latest cards. Also, Linux users still aren't a very large fraction of their customer base.
It also would be a small task if there were standards that allowed them to write a single installer that would work with any distribution, without having to deal with loads of special cases.
Nevertheless, I've found that the NVIDIA drivers work reasonably well. I use the RPM packages from rpm.livna.org, but there are also packages from atrpms.net. These are updated rapidly when new official kernels come out, and they aren't hard to rebuild if necessary for other cases.
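For anyone who wants to try that route, the repository is added with a file in /etc/yum.repos.d/ along these lines (the exact baseurl layout here is a guess from memory; check rpm.livna.org for the current one), after which the driver packages install through yum like anything else:

```
# /etc/yum.repos.d/livna.repo -- sketch only; verify the baseurl
[livna]
name=Livna for Fedora Core $releasever - $basearch
baseurl=http://rpm.livna.org/fedora/$releasever/$basearch/RPMS.lvn/
enabled=1
gpgcheck=1
```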
You could also "blame" Red Hat and Fedora for their policy of not including proprietary/binary-only packages in their distributions. I admire them for sticking to their guns on this issue, but it does affect usability, particularly with respect to certain hardware drivers and multimedia codecs.
Another issue is that Fedora Core is intended to be a cutting edge development platform, and as such there will always be bugs. In some ways I think it is really not a very good choice for someone venturing into the world of Linux for the first time... particularly for people who are not already somewhat adept with computers. Except that you will learn a lot by figuring out how to fix all the bugs and make your system work the way you want it to. So in that regard, it's a good thing. ;-)
I will persist, and continue my studies in how this all works.
Good; if you do, you will be rewarded with a much greater understanding of how your machine works, and why it works that way. Enjoy!
On Sun, Oct 30, 2005 at 04:53:38AM -0500, Matthew Saltzman wrote:
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
But (rightly or wrongly) video card makers consider these two pieces of information to be their competitive edge. So you are not likely to see them releasing either, at least for their latest cards. Also, Linux users still aren't a very large fraction of their customer base.
I'm well aware of the reasons... But support for their hardware (or rather usability of it) suffers because of them.
As for what fraction of their customer base uses Linux, those are some statistics I'd like to see... But I think it raises some interesting questions, like what percentage of their users actually use both? What percentage of Linux users who DON'T use their hardware would switch if there were native drivers?
Also don't forget that there are other free OSes, such as the *BSD family (and still others as well). All of these would benefit from such a release as well... When you add up all users of all free OSes, how does that compare to, say, users of Mac OS X? I'm not sure, but I believe I have read that it is much larger...
It also would be a small task if there were standards that allowed them to write a single installer that would work with any distribution, without having to deal with loads of special cases.
This is largely impossible. In this case we're dealing with the kernel, so the reasons are slightly different (but very similar) than for application programs.
The kernel is very actively developed. The distribution vendors often make their own custom modifications, to enhance functionality or performance, or just to fix bugs. For a hardware vendor to maintain its own binary-only driver which is compatible with all of these varying kernels is a task which is, practically speaking, essentially impossible. So, from time to time, with various kernels, their driver will crash your system.
I found this to be the case with a recent release of their driver and glx libs in combination with a recent release of the kernel. After updating from Livna maybe a month ago, my system locked up hard with the latest kernel and glx stuff when X tried to start, even after rebooting several times. Reverting to an older glx library fixed the problem for me.
I guess I do not "blame" them for wanting to keep their trade secrets, but when the newest release of their drivers crashes your system, the above is the direct cause, even if you can argue that the "fault" isn't theirs... The only practical solution is for them to release the code, whether or not they are willing to do so.
Nevertheless, I've found that the NVIDIA drivers work reasonably well.
Usually. Unless they crash your system, which happens from time to time, i.e. this was not the first time. Also with certain earlier versions of their driver, my system would experience random lock-ups while I was using it (which for my money is a lot more annoying than crashing at boot time). Both the last version I used and the version I am using now don't seem to suffer from that problem...
You could also "blame" Red Hat and Fedora for their policy of not including proprietary/binary-only packages in their distributions.
No, I think you really can't; in many cases (if not all) it is illegal for them to do so, which is a large part of why they don't do it. Even if it is not illegal in your country, it might be in many others... Their hands are tied. But, this is why we have the extras and Livna RPM repositories...
I admire them for sticking to their guns on this issue, but it does affect usability, particularly with respect to certain hardware drivers and multimedia codecs.
Which are illegal for them to redistribute, because they are proprietary. Or, they would have to pay licensing fees in some cases, which means they would have to charge you (more) for their products. I agree this is inconvenient, but I'd much rather suffer some inconvenience than see Red Hat sued out of existence... Don't you agree? :-)
On Sun, 2005-10-30 at 10:10, Derek Martin wrote:
But (rightly or wrongly) video card makers consider these two pieces of information to be their competitive edge. So you are not likely to see them releasing either, at least for their latest cards. Also, Linux users still aren't a very large fraction of their customer base.
I'm well aware of the reasons... But support for their hardware (or rather usability of it) suffers because of them.
If you are going to assign blame here, consider that there are several parties involved in creating this problem and that other popular OS's do not cause end-user problems by refusing to include vendor-supplied binaries. Nvidia can't legally change their position no matter how much the GPL crowd wants to pretend that it is their fault.
As for what fraction of their customer base uses Linux, those are some statistics I'd like to see... But I think it raises some interesting questions, like what percentage of their users actually use both? What percentage of Linux users who DON'T use their hardware would switch if there were native drivers?
But they do make an effort to provide native drivers.
It also would be a small task if there were standards that allowed them to write a single installer that would work with any distribution, without having to deal with loads of special cases.
This is largely impossible. In this case we're dealing with the kernel, so the reasons are slightly different (but very similar) than for application programs.
The reason here is not technical.
The kernel is very actively developed.
Is that an excuse for not freezing a driver api so vendors can supply optimized drivers that don't have to be re-written every few months? Other OS's have managed to solve this problem.
The distribution vendors often make their own custom modifications, to enhance functionality or performance, or just to fix bugs. For a hardware vendor to maintain its own binary-only driver which is compatible with all of these varying kernels is a task which is, practically speaking, essentially impossible. So, from time to time, with various kernels, their driver will crash your system.
If you follow the fedora list you'd be well aware that crashes happen as a result of a lot of the other drivers as well. You can't blame that all on vendor binaries.
I guess I do not "blame" them for wanting to keep their trade secrets, but when the newest release of their drivers crashes your system, the above is the direct cause, even if you can argue that the "fault" isn't theirs... The only practical solution is for them to release the code, whether or not they are willing to do so.
Or use an OS with a more stable design...
Nevertheless, I've found that the NVIDIA drivers work reasonably well.
Usually. Unless they crash your system, which happens from time to time, i.e. this was not the first time.
But you can say that about a lot of other drivers too. I crash regularly if I try to run raid with a firewire drive included and that's all built from source, no one else to blame.
You could also "blame" Red Hat and Fedora for their policy of not including proprietary/binary-only packages in their distributions.
No, I think you really can't; in many cases (if not all) it is illegal for them to do so, which is a large part of why they don't do it.
I don't think that has anything to do with their choice not to do it.
Even if it is not illegal in your country, it might be in many others... Their hands are tied.
How would other OS's be able to include drivers if it were illegal to distribute vendor supplied binaries?
I admire them for sticking to their guns on this issue, but it does affect usability, particularly with respect to certain hardware drivers and multimedia codecs.
Which are illegal for them to redistribute, because they are proprietary. Or, they would have to pay licensing fees in some cases, which means they would have to charge you (more) for their products. I agree this is inconvenient, but I'd much rather suffer some inconvenience than see Red Hat sued out of existence... Don't you agree? :-)
Being proprietary doesn't automatically make it illegal to redistribute. That depends on the license supplied by the copyright holder, and you need to separate the cases in your argument above. I can understand Red Hat not including outside binaries, even those that are legal to redistribute, in their supported products, since it might add problems that they would be unable to solve (at least if they have the hubris to think they would be better than the vendor at this...). It doesn't make any sense other than as some kind of political statement in an unsupported product like Fedora, though. And for the things that are freely available but can't be redistributed, the distribution could include push-button scripts to pull the required files from their home sites, letting the end user respond to any needed click-through agreements.
On Sun, Oct 30, 2005 at 12:24:27PM -0600, Les Mikesell wrote:
I'm well aware of the reasons... But support for their hardware (or rather usability of it) suffers because of them.
If you are going to assign blame here, consider that there are several parties involved in creating this problem and that
If you are referring to the kernel developers, their decision is based on not allowing a fixed API to stand in the way of making the OS better. This is the right decision; failure to adhere to such a policy is one of the things that has made, for example, Microsoft Windows so unstable over the years. Their initial design was flawed, but they could not fix it without breaking everyone. So it stayed broken. And it remains broken today... Yay!
other popular OS's do not cause end-user problems by refusing to include vendor-supplied binaries.
I'm assuming that here, you mean other free OSes. Vendors which only sell their software for profit must be excluded, because they do not give away the source code and binaries for their OS, and therefore need not be concerned about paying licensing fees or signing NDAs. They have that luxury -- they can afford it. Linux is free software, and as such does not have the same luxury. Why can't Linux vendors afford to pay license fees? If most of their users aren't paying for the software, how can they collect them? It's a fundamental necessity of free software.
Now, it's certainly possible that Red Hat could enter into some sort of agreement with NVidia to distribute their drivers with the OS. It may even be the case that NVidia would not charge them license fees for distributing the drivers with Fedora, though I have my doubts. However, in absence of such an agreement, Red Hat MUST NOT distribute the drivers:
2.1.1 Rights. Customer may install and use one copy of the SOFTWARE on a single computer, and except for making one back-up copy of the Software, may not otherwise copy the SOFTWARE. This LICENSE of SOFTWARE may not be shared or used concurrently on different computers.
[From the license agreement for NVidia's Linux drivers]
Never mind that this clause is completely preposterous, and that NVidia is being obtuse. The end-user is free to download the software as many times as they like, agreeing to this agreement each time, and use it on as many systems as they have such hardware... Nevertheless the terms of the EULA prohibit copying and distributing the drivers.
[Driver software does one thing only: drive hardware. A license to use it should be automatic for anyone owning that piece of hardware, e.g. 1 piece of hardware = 1 license to use the corresponding driver, including any and all updates.]
Nvidia can't legally change their position no matter how much the GPL crowd wants to pretend that it is their fault.
Why not?
As for what fraction of their customer base uses Linux, those are some statistics I'd like to see... But I think it raises some interesting questions, like what percentage of their users actually use both? What percentage of Linux users who DON'T use their hardware would switch if there were native drivers?
But they do make an effort to provide native drivers.
Well, we're using different definitions of "native" here. I mean native in the sense of code which is included and properly integrated into the kernel source tree.
It also would be a small task if there were standards that allowed them to write a single installer that would work with any distribution, without having to deal with loads of special cases.
This is largely impossible. In this case we're dealing with the kernel, so the reasons are slightly different (but very similar) than for application programs.
The reason here is not technical.
Yes, it is. It is a technical decision on the part of the kernel developers which causes this to be (essentially) impossible for the vendor. The decision is to make the kernel better, at the cost of stability.
The kernel is very actively developed.
Is that an excuse for not freezing a driver api so vendors can supply optimized drivers that don't have to be re-written every few months?
Yes it is. If the API is wrong, it needs to be fixed. It can't be kept just because it will break 3rd party code. Integrating the driver into the kernel source solves the problem, because when the API changes, all driver code that uses it will get fixed at the same time. That said, I do agree to some extent; see below.
Other OS's have managed to solve this problem.
You mean like Windows? It's a lot better than it used to be, but I've seen far too many blue screens for that to hold water with me... If you're talking about other OSes, I can't really speak to that. I have little familiarity with the other free Unixes or with Mac OS X.
Other commercial OSes may be more stable, but most of them also support a much smaller selection of hardware (I'm thinking commercial Unix here, mostly).
The distribution vendors often make their own custom modifications, to enhance functionality or performance, or just to fix bugs. For a hardware vendor to maintain its own binary-only driver which is compatible with all of these varying kernels is a task which is, practically speaking, essentially impossible. So, from time to time, with various kernels, their driver will crash your system.
If you follow the fedora list you'd be well aware that crashes happen as a result of a lot of the other drivers as well. You can't blame that all on vendor binaries.
Indeed. But as I said, Fedora is meant to be a cutting-edge development platform. The drivers are bound to be broken.
FWIW, I do disagree with the kernel development model to this extent: if the kernel is supposed to be stable, it ought to be. It remains true, as it has been historically, that early releases of the stable kernel really are development kernels in disguise. ;-) I think the kernel developers ought to adopt a 3-tier approach to testing the kernel, much like Debian does with their distribution. [I'm not saying that Debian's development model is inherently better than any other -- I happen to think that their model makes stable releases of the distro far too infrequent -- but I think it would be well suited to kernel development.] The trouble is, of course, getting enough users to use testing kernels to make bug identification and fixing feasible. And I believe this is the argument for the model we are stuck with now.
But an in-depth discussion of these issues is really not on-topic for this list, I think. :)
I guess I do not "blame" them for wanting to keep their trade secrets, but when the newest release of their drivers crashes your system, the above is the direct cause, even if you can argue that the "fault" isn't theirs... The only practical solution is for them to release the code, whether or not they are willing to do so.
Or use an OS with a more stable design...
Indeed. Fedora is not meant to be that, and isn't. It's important to remember context when having such discussions... That's why I said in a different message that I don't think Fedora is necessarily the right choice for someone who's just breaking into Linux. You WILL have problems... it's not stable. Get over that, or use something else.
Nevertheless, I've found that the NVIDIA drivers work reasonably well.
Usually. Unless they crash your system, which happens from time to time, i.e. this was not the first time.
But you can say that about a lot of other drivers too. I crash regularly if I try to run raid with a firewire drive included and that's all built from source, no one else to blame.
There is a big difference though. With the mainstream kernel, stability is generally a function of how many people are using a particular device in a particular configuration. Popular devices are generally extremely stable. NVidia graphics cards SHOULD be such a device, since they are quite popular, but they are not.
You could also "blame" Red Hat and Fedora for their policy of not including proprietary/binary-only packages in their distributions.
No, I think you really can't; in many cases (if not all) it is illegal for them to do so, which is a large part of why they don't do it.
I don't think that has anything to do with their choice not to do it.
Well, you mean they could obtain license agreements to distribute the drivers. While true, this is financially infeasible. OSS OS vendors give away a lot more software than they sell; they'd still be liable for license fees for each copy downloaded, for which they aren't receiving any money. That's a big problem.
There's also the problem of supporting someone else's binary only code. That's a nightmare.
It doesn't make any sense other than some kind of political statement in an unsupported product like fedora, though.
See above about money. Some vendors probably are happy to give away such license agreements... OTOH those vendors' hardware probably already has a mainstream kernel driver, for exactly that reason. Of course, I do not do business with these companies, so I can't really say what sort of compensation they would expect.
What I can tell you is this: business is about making money. It's true that some companies, mostly privately owned ones, do take political stands. But in the end it's about making money. If it were in Red Hat's best interest to distribute the drivers, I'm sure they would. For reasons that you and I will never completely understand, it apparently isn't. I for one don't find that very surprising, given the respective natures of free software and business.
On Sun, 2005-10-30 at 13:34, Derek Martin wrote:
On Sun, Oct 30, 2005 at 12:24:27PM -0600, Les Mikesell wrote:
I'm well aware of the reasons... But support for their hardware (or rather usability of it) suffers because of them.
If you are going to assign blame here, consider that there are several parties involved in creating this problem and that
If you are refering to the kernel developers, their decision is based on not allowing a fixed API to stand in the way of making the OS better.
Oh, so you are complaining because it is so much better?
This is the right decision; failure to adhere to such a policy is one of the things that has made, for example, Microsoft Windows so unstable over the years. Their initial design was flawed, but they could not fix it without breaking everyone. So it stayed broken. And it remains broken today... Yay!
Yes, you should be able to get it wrong a few times. We've lived through that already. How many years has Linux been around now? If it isn't right yet, it probably isn't ever going to be. Freeze the thing and give up. And don't make that argument about Microsoft unless you are prepared to demonstrate how much better your video works under Linux. MS has done plenty of things wrong, but dealing with vendors' video drivers isn't one of them.
other popular OS's do not cause end-user problems by refusing to include vendor-supplied binaries.
I'm assuming that here, you mean other free OSes.
No.
Vendors which only sell their software for profit must be excluded, because they do not give away the source code and binaries for their OS, and therefore need not be concerned about paying licensing fees or signing NDAs.
You've been drinking too much of that GPL kool-aid. Source is not necessary for something to be free. It would be entirely possible to link in vendor-supplied binaries if both sides were reasonable about it.
They have that luxury -- they can afford it. Linux is free software, and as such does not have the same luxury. Why can't Linux vendors afford to pay license fees? If most of their users aren't paying for the software, how can they collect them? It's a fundamental necessity of free software.
That would be a good argument if there were some reason to think license fees would be involved.
[From the license agreement for NVidia's Linux drivers]
Never mind that this clause is completely preposterous, and that NVidia is being obtuse. The end-user is free to download the software as many times as they like, agreeing to this agreement each time, and use it on as many systems as they have such hardware... Nevertheless the terms of the EULA prohibit copying and distributing the drivers.
Even given that, it could be fully-automatic when the hardware was detected and the internet was available.
Nvidia can't legally change their position no matter how much the GPL crowd wants to pretend that it is their fault.
Why not?
Nvidia assembles components manufactured by others and has to abide by the NDA's that come with them. They can't release source even if they wanted.
But they do make an effort to provide native drivers.
Well, we're using different definitions of "native" here. I mean native in the sense of code which is included and properly integrated into the kernel source tree.
I think you are overestimating the value of access to the source code. Compare the video on the Mac and Windows even on hardware where the Linux folks have all the specs and their own sources. See if you can find one where you can identify what you've gained from that source code.
This is largely impossible. In this case we're dealing with the kernel, so the reasons are slightly different (but very similar) than for application programs.
The reason here is not technical.
Yes, it is. It is a technical decision on the part of the kernel developers which causes this to be (essentially) impossible for the vendor. The decision is to make the kernel better, at the cost of stability.
If it were better we wouldn't have anything to talk about here. We are talking about problems, remember?
Other OS's have managed to solve this problem.
You mean like Windows? It's a lot better than it used to be, but I've seen far too many blue screens for that to hold water with me...
I blame most of those blue screens on Microsoft's own coding, but since Windows NT SP6a it has been possible to keep a Windows box running long enough to call it stable.
If you're talking about other OSes, I can't really speak to that. I have little familiarity with the other free Unixes or with Mac OS X.
If you are going to make a claim about the Linux approach being 'better' you need to be able to point out some other things that are worse. OS X would be a good one to compare since they use a lot of nVidia video.
Other commercial OSes may be more stable, but most of them also support a much smaller selection of hardware (I'm thinking commercial Unix here, mostly).
The x86 version of Solaris might be interesting too. I have no idea what they include but they are distributing it for free. From what is on line, it looks like they include 3d support for nvidia under Xorg.
If you follow the fedora list you'd be well aware that crashes happen as a result of a lot of the other drivers as well. You can't blame that all on vendor binaries.
Indeed. But as I said, Fedora is meant to be a cutting-edge development platform. The drivers are bound to be broken.
The drivers only break when the API changes, which only has to happen when it was wrong before...
Indeed. Fedora is not meant to be that, and isn't. It's important to remember context when having such discussions... That's why I said in a different message that I don't think Fedora is necessarily the right choice for someone who's just breaking into Linux. You WILL have problems... it's not stable. Get over that, or use something else.
My experience is that around X.X.20, Linux kernels stop giving me unpleasant surprises. The bulk of my servers are still running 2.4.x based distributions and probably will for a while longer.
With the mainstream kernel, stability is generally a function of how many people are using a particular device in a particular configuration. Popular devices are generally extremely stable. NVidia graphics cards SHOULD be such a device, since they are quite popular, but they are not.
How popular can something be when it needs a 3rd party driver and there is no coordination from the OS distribution to make sure it is available before making changes that break the previous one?
There's also the problem of supporting someone else's binary only code. That's a nightmare.
It wouldn't be a nightmare with a stable API.
Remember, we're talking about ONE driver. One driver which does not fit the development model of Linux.
On Sun, Oct 30, 2005 at 10:18:17PM -0600, Les Mikesell wrote: [Re: Kernel API]
Freeze the thing and give up.
No thanks. For all but ONE driver that I use, which the Linux kernel developers did not write and don't support, the Linux kernel has proven far more stable than Windows for the last 10 years on every system I've used them both on. I'm quite satisfied with the development model. For this one specific case (the nvidia driver), I would prefer that the vendors cave, or else for things to remain as they are. I'm quite happy that Linux has the development model it does, because for the last ten years, my experience has been that while all my colleagues and friends and family members who stick steadfastly to their Windows OSes curse their machines for crashing in the middle of important work they were doing, I keep right on going, doing whatever it was I was doing, with nary a complaint... ;-) If it means that I need to be careful about which version of the damned proprietary NVidia drivers I install, because version X crashes my system, but Y doesn't (at least only sometimes, and not until I try to shut the system down), I can live with that. It's just one driver. If I eventually get really annoyed, I can just use the Xorg driver (which I have used extensively in the past, and which has never crashed my system)... I don't really NEED the extra performance, though I admit it's nice to have...
Open-source code (in general) IS better than proprietary, not because of ideology, but simply because a lot more people are looking at the code, and those people are doing it not just for pay (or not for pay at all), but because they just love doing it. It's a labor of love... Commercial vendors just simply can't compete with that. Free code is better in practice, because it just can't help but be. My 10 years of managing both Windows systems and Linux systems professionally have proven that out; Windows crashes a lot (less so today, but even still, it does) and Linux doesn't [with certain well-understood exceptions involving drivers that are highly experimental or still under heavy active development -- or proprietary drivers like this one, which break the model]. You can say I've drunk too much GPL kool-aid all you like; I've watched the development of both Linux and Windows very closely -- it's just a fact that Linux is generally more stable than Windows, and has been since the 0.something days. It's funny that, even with constant development, and constantly changing kernel APIs, Linux has been much more stable than Windows since forever. While Windows HAS improved a lot in the last 5 years or so, from the standpoint of stability (and other ways as well) it's been playing catch-up with Linux since there was Linux. You have to ask yourself why that is, given that Microsoft has been around a lot longer... My conclusion is that their development model, with their fixed APIs, which you claim is better, simply isn't.
I actually had a lot more to say to rebut your specific points, but I've decided that it's a waste of time to pursue this; you're entitled to your opinion, even if you are wrong... ;-)
People never believe me when I tell them I was once a big fan of Windows... I was though. Thankfully, I've learned a lot since then.
That said, should Fedora provide some automatic means of getting the latest NVidia driver which is compatible with your system? Yeah, they probably should. But don't blame that shortcoming on Linux; blame it on the Fedora developers. Even still, it would be better to have the driver natively included in the kernel, and I don't really see how you can argue otherwise.
On Mon, 2005-10-31 at 12:23, Derek Martin wrote:
Remember, we're talking about ONE driver. One driver which does not fit the development model of Linux.
Actually we are talking about the one vendor that is really making an effort in spite of the obstacles thrown up by the distribution. There are plenty of drivers that could be done better by the vendors, and that have many more features on the OSes that accept the vendors' drivers.
Open-source code (in general) IS better than proprietary, not because of ideology, but simply because a lot more people are looking at the code, and those people are doing it not just for pay (or not for pay at all), but because they just love doing it.
That might be true for some programs. I don't think it is true for device drivers.
My 10 years of managing both Windows systems and Linux systems professionally have proven that out; Windows crashes a lot (less so today, but even still, it does)
Windows is not the only system that ships vendor-written drivers.
and Linux doesn't [with certain well-understood exceptions involving drivers that are highly experimental or still under heavy active development -- or proprietary drivers like this one, which break the model]. You can say I've drank too much GPL kool-aid all you like; I've watched the development of both Linux and Windows very closely -- it's just a fact that Linux is generally more stable than Windows, and has been since the 0.something days.
You can't generalize from a single bad example. Show how Linux is more stable than Solaris, or IBM's mainframe OS's, or OS X and how the drivers are better.
I actually had a lot more to say to rebut your specific points, but I've decided that it's a waste of time to pursue this; you're entitled to your opinion, even if you are wrong... ;-)
Rebut with some examples other than Windows. Windows problems stem from MS-written code. But, I'd accept some specific cases where Linux device drivers are measurably better or more feature rich than the Windows counterparts. I'd be particularly interested in your experiences with SATA cards and 802.11g devices over the years they have been available.
On Sun, 30 Oct 2005, Derek Martin wrote:
On Sun, Oct 30, 2005 at 04:53:38AM -0500, Matthew Saltzman wrote:
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
But (rightly or wrongly) video card makers consider these two pieces of information to be their competitive edge. So you are not likely to see them releasing either, at least for their latest cards. Also, Linux users still aren't a very large fraction of their customer base.
I'm well aware of the reasons... But support for their hardware (or rather usability of it) suffers because of them.
I never claimed it didn't, just recapped the reasons.
As for what fraction of their customer base uses Linux, those are some statistics I'd like to see... But I think it raises some interesting questions, like what percentage of their users actually use both? What percentage of Linux users who DON'T use their hardware would switch if there were native drivers?
Also don't forget that there are other free OSes, such as the *BSD family (and still others as well). All of these would benefit from such a release as well... When you add up all users of all free OSes, how does that compare to say, users of Mac OS X? I'm not sure, but I believe I have read that it is much larger...
And yet vendors need to make a judgment about whether the return is worth the effort or the perceived sacrifice of a competitive edge.
It also would be a small task if there were standards that allowed them to write a single installer that would work with any distribution, without having to deal with loads of special cases.
This is largely impossible. In this case we're dealing with the kernel, so the reasons are slightly different (but very similar) than for application programs.
The kernel is very actively developed. The distribution vendors often make their own custom modifications, to enhance functionality or performance, or just to fix bugs. For a hardware vendor to maintain its own binary-only driver which is compatible with all of these varying kernels is a task which is, practically speaking, essentially impossible. So, from time to time, with various kernels, their driver will crash your system.
A clean, stable, nonrestrictive module interface would help. I don't know a lot about kernel internals, but I've seen references to discussions about changes that break old ABIs and about deliberately making life difficult for proprietary module vendors.
I found this to be the case with a recent release of their driver and glx libs in combination with a recent release of the kernel. After updating from Livna maybe a month ago, my system locked up hard with the latest kernel and glx stuff when X tried to start, even after rebooting several times. Reverting to an older glx library fixed the problem for me.
At least NVIDIA is reasonably responsive to these issues when they come up.
I guess I do not "blame" them for wanting to keep their trade secrets, but when the newest release of their drivers crashes your system, the above is the direct cause, even if you can argue that the "fault" isn't theirs... The only practical solution is for them to release the code, whether or not they are willing to do so.
Then I suspect we are kind of stuck with the impractical alternatives. The X.org NVIDIA driver is fine if you don't need 3D. The X.org ATI driver is fine if you don't mind older hardware. The NVIDIA 3D driver is probably the best among proprietary ones. ATI doesn't seem to be so responsive with their proprietary drivers. I don't know much about other cards, but there don't seem to be many other high-performance 3D options.
Nevertheless, I've found that the NVIDIA drivers work reasonably well.
Usually. Unless they crash your system, which happens from time to time, i.e. this was not the first time. Also with certain earlier versions of their driver, my system would experience random lock-ups while I was using it (which for my money is a lot more annoying than crashing at boot time). Both the last version I used and the version I am using now don't seem to suffer from that problem...
You could also "blame" Red Hat and Fedora for their policy of not including proprietary/binary-only packages in their distributions.
No, I think you really can't; in many cases (if not all) it is illegal for them to do so, which is a large part of why they don't do it. Even if it is not illegal in your country, it might be in many others... Their hands are tied. But, this is why we have the extras and Livna RPM repositories...
Sure, but of course it does affect usability. But there are other distros that do include these features. Either they (or their users) are on risky legal ground or they have made the appropriate arrangements and paid the fees. Or they are offshore, in which case, I don't know what the legal implications are for them or their US users.
I admire them for sticking to their guns on this issue, but it does affect usability, particularly with respect to certain hardware drivers and multimedia codecs.
Which are illegal for them to redistribute, because they are proprietary. Or, they would have to pay licensing fees in some cases, which means they would have to charge you (more) for their products. I agree this is inconvenient, but I'd much rather suffer some inconvenience than see Red Hat sued out of existence... Don't you agree? :-)
Of course.
On Sun, 2005-10-30 at 11:10 -0500, Derek Martin wrote:
I guess I do not "blame" them for wanting to keep their trade secrets, but when the newest release of their drivers crashes your system, the above is the direct cause, even if you can argue that the "fault" isn't theirs.
I can understand it, but any rival worth their salt will be able to reverse engineer the software, and the hardware, if they want to. So holding back the information doesn't really achieve that aim.
I've got the complete circuit diagrams for all sorts of equipment, though that's only of some help if I really wanted to duplicate it. But it does mean that I can service it. There isn't really a good argument for withholding technical specifications needed for writing drivers.
Thanks for the responses.
We have Core 3 installed on all the workstations at my office, which is about 70 or so, and they all have nVidia FX 1300s and they don't seem to have too many problems. The reason I am working at getting Core 4 rolling on my laptop is that I love using it and want to improve my knowledge of how to get things working on it, i.e. there is no USB Wacom support (no absolute mode or pressure sensitivity), which hurts me, so I really want to be able to do my own experimentation with things, as I obviously (and thankfully so) don't have administrative privileges at work.
It's not a necessity but would be way cool if I can get over these hurdles. I've had the main Linux tech guy give my machine a quick workout with the various Nvidia installers (pretty much trying all the stuff I'd already tried) and he had no joy. I can get the computer to boot OK, but only using the default driver.
So am I right in thinking that compiling a kernel for my machine is the way to go? I was under the impression that this is the guaranteed way to get it to work.
I am going to try a fresh install to make sure I didn't screw anything up.
I notice that the vanilla kernel is up a version from the 2.6.11-1.1369_FC4 that I am using.. there seem to be a few nvidia fixes in there.. should I be aiming to get that into my build? I'll have to work out exactly what I am doing before I throw any serious questions out there. Are there others out there who have installed FC4 on their Dell Inspiron 9300s with the Nvidia Go6800? A pointer in the right direction would be cool. Sorry if I seem a real dumbass; I know I am in a bit over my head with this. But I am not giving up until I get this thing rolling. As it is a laptop I don't really have the option of swapping the graphics card for something else.
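[For reference, and hedged accordingly: compiling a custom kernel is usually not required for this. The out-of-tree module just has to match the running kernel exactly, so the common FC4 routes are either the prebuilt Livna module for your kernel or NVidia's installer built against matching kernel headers. The package names and installer filename below are illustrative, not exact:]

```shell
# A sketch only, assuming FC4 with the Livna repository configured in
# yum.conf; package and installer names below are hypothetical.
KVER=$(uname -r)
echo "Running kernel: ${KVER}"

# Option 1: prebuilt module matching this exact kernel (run as root):
#   yum install kernel-module-nvidia-${KVER} nvidia-glx

# Option 2: build NVidia's own installer against this kernel; the
# matching kernel headers must be installed first:
#   yum install kernel-devel
#   sh NVIDIA-Linux-x86-<version>.run
```

[Either way, the version coupling is the point: a kernel update means the module has to be rebuilt or reinstalled to match.]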
Thanks
On 10/30/05, Derek Martin code@pizzashack.org wrote:
On Sat, Oct 29, 2005 at 09:56:16PM -0700, David Abbott wrote:
I have to say. Being new to Linux and trying to administer my own system I have had loads of trouble.
That's pretty normal, when you're new to something, generally.
I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card. I wouldn't have imagined that it should be difficult.
This you can blame on nvidia, for refusing to release the specs of their hardware so that a proper driver could be integrated into the Kernel. Instead, we're dependent upon them to try to support a binary-only kernel module on dozens of different versions of the Linux kernel (both official releases, and also vendor-modified kernels).
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
Another issue is that Fedora Core is intended to be a cutting edge development platform, and as such there will always be bugs. In some ways I think it is really not a very good choice for someone venturing into the world of Linux for the first time... particularly for people who are not already somewhat adept with computers. Except that you will learn a lot by figuring out how to fix all the bugs and make your system work the way you want it to. So in that regard, it's a good thing. ;-)
I will persist, and continue my studies in how this all works.
Good; if you do, you will be rewarded with a much greater understanding of how your machine works, and why it works that way. Enjoy!
-- Derek D. Martin http://www.pizzashack.org/ GPG Key ID: 0x81CFE75D
-- fedora-list mailing list fedora-list@redhat.com To unsubscribe: https://www.redhat.com/mailman/listinfo/fedora-list
On Sun, Oct 30, 2005 at 11:09:50PM +1030, David Abbott wrote:
It's not a necessity but would be way cool if I can get over these hurdles. I've had the main Linux tech guy give my machine a quick workout with the various Nvidia installers (pretty much trying all the stuff I'd already tried) and he had no joy. I can get the computer to boot OK, but only using the default driver.
What is the problem you're having when you install the driver? If we knew that, we might be able to help more, but I don't recall seeing any mention of it in your previous e-mail (at least I don't think I did)... :)
As I just mentioned in another mail, one specific combination of the driver and Fedora kernel was crashing my system. It may well be the most recent one, and you might be running into the same problem... I was able to solve it by using an older driver.
Livna has some "legacy" drivers, e.g.:
kernel-module-nvidia-legacy-2.6.12-1.1380_FC3-1.0.7174-0.lvn.1.3 nvidia-glx-legacy-1.0.7174-0.lvn.1.3
You might try using yum to install one of these specifically, rather than installing the latest (make sure to use a driver for FC4 if that is what you're using)... You can find out what versions are available from Livna (if you have their repositories set up in your yum.conf) using this:
# yum search kernel-module-nvidia-legacy nvidia-glx-legacy
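[To pin one of these specifically rather than taking the latest, the install step looks roughly like the following sketch; the exact version strings have to be whatever Livna actually offers for your kernel:]

```shell
# Sketch, not a definitive recipe. Livna's kernel-module packages embed
# the exact kernel release in their name, which is why a kernel update
# can leave you without a matching module until Livna rebuilds it.
KVER=$(uname -r)
PKG="kernel-module-nvidia-legacy-${KVER}"
echo "Expecting a package named like: ${PKG}"

# Then, as root, with the Livna repository enabled in yum.conf:
#   yum install "${PKG}" nvidia-glx-legacy
```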
HTH
Maurie
-----Original Message----- From: Derek Martin code@pizzashack.org To: For users of Fedora Core releases fedora-list@redhat.com Sent: Sun, 30 Oct 2005 11:35:08 -0500 Subject: Re: FC4 does not work, "out of the box" for me; GUI/X11 fails
Derek Martin wrote:
On Sat, Oct 29, 2005 at 09:56:16PM -0700, David Abbott wrote:
I have to say. Being new to Linux and trying to administer my own system I have had loads of trouble.
That's pretty normal, when you're new to something, generally.
I am trying to work out how to compile my own kernel because the basic setup to install the Nvidia drivers doesn't work on my Dell Inspiron 9300. The Go6800 is a pretty standard card. I wouldn't have imagined that it should be difficult.
This you can blame on nvidia, for refusing to release the specs of their hardware so that a proper driver could be integrated into the Kernel. Instead, we're dependent upon them to try to support a binary-only kernel module on dozens of different versions of the Linux kernel (both official releases, and also vendor-modified kernels).
They are clearly not up to the task. Not that we should expect they would be; it's a pretty big task... But it becomes a small task if nvidia either releases source code for their drivers, or releases complete specs to their hardware, so that a driver can be written and maintained as part of the Linux kernel proper.
Another issue is that Fedora Core is intended to be a cutting edge development platform, and as such there will always be bugs. In some ways I think it is really not a very good choice for someone venturing into the world of Linux for the first time... particularly for people who are not already somewhat adept with computers. Except that you will learn a lot by figuring out how to fix all the bugs and make your system work the way you want it to. So in that regard, it's a good thing. ;-)
I actually read an interview with one of the nVidia head honchos and this issue was raised. Due to licensing restrictions placed on them by chip manufacturers and other coders, they cannot release any code. They also develop their drivers in a neutral way, so in theory the Linux and Windows drivers should be at the same level. They also said that any distro can include the binary driver.
I will say that it would be nice if the driver was part of the total package but you also have to remember that RH has decided not to include any code that is of questionable nature. This I feel is more of the issue.
Litigation is the biggest headache for all of us who would like a complete distro. Until copyright and patent laws are changed to allow easier distribution, we will have to learn to live with it.
At least nVidia drivers are complete and work much better than any ATI drivers that I have tried.
On Mon, 2005-10-31 at 08:26 -0700, Robin Laing wrote:
I actually read an interview with one of the nVidia head honchos and this issue was raised. Due to licensing restrictions placed on them by chip manufacturers and other coders, they cannot release any code.
This sounds like a cop out. They're the ones in a position to dictate terms to their coders and chip suppliers.
Who else but nVidia can a chipset manufacturer sell custom nVidia chips to? And they'll be selling generic chips to any manufacturer, so what's to hide in that regard?
Write code for nVidia, and you obey their directives. Be difficult, go and write for someone else.
Tim wrote:
On Mon, 2005-10-31 at 08:26 -0700, Robin Laing wrote:
I actually read an interview with one of the nVidia head honchos and this issue was raised. Due to licensing restrictions placed on them by chip manufacturers and other coders, they cannot release any code.
This sounds like a cop out. They're the ones in a position to dictate terms to their coders and chip suppliers.
Who else but nVidia can a chipset manufacturer sell custom nVidia chips to? And they'll be selling generic chips to any manufacturer, so what's to hide in that regard?
Write code for nVidia, and you obey their directives. Be difficult, go and write for someone else.
Remember that just because X company makes their own chips does not mean that they did all of the development to get that chip to market. Some of the development may have been purchased from outside sources and thus put under restriction.
Heck, I wanted to open a file from one of our pieces of equipment on my Linux box. The hoops that I have to jump through make this almost impossible to work into the software that I am working on. I would not be able to integrate both of them together. I refuse to jump through the restrictive hoops.
I took the time and did a search on one of the interviews. This is not the most recent one that I read. In hindsight, I think the one I read was on Slashdot.
http://www.linuxquestions.org/questions/t253027.html [quote]NV) We have lots of IP in our supported closed source Linux driver some of which is licensed and cannot be open sourced. While we did our best to ensure that there was open source driver (nv) for our chips available, we got lots of feedback from our professional partners as well as end users that wanted a driver that had the same quality and performance characteristics of our supported drivers for platforms such as Windows and Apple. [/quote]
A single individual with a patent can force Microsoft into court, all the way to the Supreme Court (which refused to hear the case). This is my point on litigation. Microsoft, with all their might, cannot win all the time.
Supremes shun Microsoft's Eolas appeal http://www.channelregister.co.uk/2005/10/31/microsoft_eolas/
Is it a cop out? If you were in the same situation, would you take the risk? How about the risk of losing the IP rights altogether?
From some of the different articles I scanned, I see that this issue keeps coming up.
On Mon, 2005-10-31 at 23:56, Tim wrote:
On Mon, 2005-10-31 at 08:26 -0700, Robin Laing wrote:
I actually read an interview with one of the nVidia head honchos and this issue was raised. Due to licensing restrictions placed on them by chip manufacturers and other coders, they cannot release any code.
This sounds like a cop out. They're the ones in a position to dictate terms to their coders and chip suppliers.
Who else but nVidia can a chipset manufacturer sell custom nVidia chips to?
Ummm, the cheap generic competitor that can undercut nVidia's pricing with a knock-off, since they don't have to develop anything.
And they'll be selling generic chips to any manufacturer, so what's to hide in that regard?
If they give away their trade secrets it will be some other manufacturer selling those chips too.
Write code for nVidia, and you obey their directives. Be difficult, go and write for someone else.
Give away your secrets, go out of business.
I like open source, but it should be a matter of choice whether you give your work away or not. We need a way to let them keep that freedom of choice while still being able to use their product. They are going to great effort trying to give away free binaries, yet the OS distribution continues to make it difficult for them.
On Tue, 2005-11-01 at 10:31 -0600, Les Mikesell wrote:
I like open source, but it should be a matter of choice whether you give your work away or not. We need a way to let them keep that freedom of choice while still being able to use their product. They are going to great effort trying to give away free binaries, yet the OS distribution continues to make it difficult for them.
---- I thought that commitment to open source, and to not distributing software with restrictive licensing, was a virtue. That puts pressure on those who want to come to the dance to dress according to the rules.
I don't want to dis on Adobe/Real Networks/nVidia/ATI/Sun et al. They have every right to hold on to their source and only distribute binaries for free as they wish, they just don't get included with source only distributions.
If third party efforts exist to script the download and install of these binary only distributions to make life easier for users, then I am all for it but the distribution itself isn't going to bother with it. I sort of like the idea of having users take the extra efforts to get the binary only software installed so they at least recognize that there is a distinction.
Craig
On Tue, 2005-11-01 at 11:53, Craig White wrote:
I like open source, but it should be a matter of choice whether you give your work away or not. We need a way to let them keep that freedom of choice while still being able to use their product. They are going to great effort trying to give away free binaries, yet the OS distribution continues to make it difficult for them.
I thought that commitment to open source, and to not distributing software with restrictive licensing, was a virtue.
It's a religion. Attempting to force others to give away their work doesn't agree with mine. Choosing to give away your work is fine, but if it isn't your choice it can't be much of a virtue. You can pick your own religion, but if you are going to justify it to others, pick some real examples and follow them through.
That puts pressure on those who want to come to the dance to dress according to the rules.
People pushing their religion on others has often caused problems... The big problem here is that the GPL concept makes it next to impossible to fairly spread the development cost of something new over the appropriate set of users.
I don't want to diss Adobe/Real Networks/nVidia/ATI/Sun et al. They have every right to hold on to their source and distribute only binaries, free of charge, as they wish; they just don't get included with source-only distributions.
Source makes sense for things of general interest where a lot of people will work to improve it. Device drivers should be written once by someone who understands the hardware and never touched again. If you poke around, I think you'll see lots of examples of source-available drivers that were done by one, and only one person. In the unfortunate case of that person's demise or change of interests they were abandoned or languished a long time before anyone else picked them up.
If third party efforts exist to script the download and install of these binary only distributions to make life easier for users, then I am all for it but the distribution itself isn't going to bother with it.
It's one thing to have to make an effort. It's something else to have the distribution arbitrarily break driver APIs in its updates without arranging for the replacement to already be available. I don't see how anyone can consider or recommend such a distribution for anything more than a testbed. It is fun to play with, though...
I sort of like the idea of having users take the extra efforts to get the binary only software installed so they at least recognize that there is a distinction.
Yes, especially if they notice that the distinction is that the people who build the hardware do a better job of writing the drivers.
On Tue, 2005-11-01 at 13:21 -0600, Les Mikesell wrote:
On Tue, 2005-11-01 at 11:53, Craig White wrote:
I like open source, but it should be a matter of choice whether you give your work away or not. We need a way to let them keep that freedom of choice while still being able to use their product. They are going to great effort trying to give away free binaries, yet the OS distribution continues to make it difficult for them.
I thought that commitment to open source and not distributing software with restrictive licensing was a virtue.
It's a religion. Attempting to force others to give away their work doesn't agree with mine. Choosing to give away your work is fine, but if it isn't your choice it can't be much of a virtue. You can pick your own religion, but if you are going to justify it to others, pick some real examples and follow them through.
---- I don't think anyone is forcing anybody to do anything. Open source and non-restrictive licensing seem to offer a pretty large opening for those who wish to be invited to the dance. ----
That puts pressure on those who want to come to the dance to dress according to the rules.
People pushing their religion on others has often caused problems... The big problem here is that the GPL concept makes it next to impossible to fairly spread the development cost of something new over the appropriate set of users.
---- Assuming, of course, that the body of work that is already open source, and that got you as far as it has, doesn't already have a value far in excess of the development cost of something new (as you put it), I would agree with you; but of course, I am not willing to assume that. I suspect that people who contribute to an open source (GPL-type license) project assume a quid pro quo of others doing the same on other projects, but of course there is no guarantee.
Where you are discussing a corporate product whose owners are unwilling to release the source under a reasonably non-restrictive license, which is every bit their right (if not their corporate responsibility to their shareholders), they can of course make their code available in binary form to be installed after the Linux distribution is installed. That's hardly a problem, except that they also accept the burden of making it happen, whereas the burden shifts to the distribution developers if it is released with source under a non-restrictive license. The choice, of course, is theirs to make.
One would surmise from your comments that your complaint is with the restrictions of the GPL license itself, and I'm thinking that this is hardly the place to debate that. ----
I don't want to diss Adobe/Real Networks/nVidia/ATI/Sun et al. They have every right to hold on to their source and distribute only binaries, free of charge, as they wish; they just don't get included with source-only distributions.
Source makes sense for things of general interest where a lot of people will work to improve it. Device drivers should be written once by someone who understands the hardware and never touched again.
---- Until you have a device that is dropped from the distribution but still in the generic kernel - I know this first hand...I have a Perc 2/DC in my server running on CentOS 4 ----
If you poke around, I think you'll see lots of examples of source-available drivers that were done by one, and only one person. In the unfortunate case of that person's demise or change of interests they were abandoned or languished a long time before anyone else picked them up.
---- and if the source is available, at least someone can pick it up. What's the point?
Craig
On Tue, Nov 01, 2005 at 01:21:15PM -0600, Les Mikesell wrote:
I sort of like the idea of having users take the extra efforts to get the binary only software installed so they at least recognize that there is a distinction.
Yes, especially if they notice that the distinction is that the people who build the hardware do a better job of writing the drivers.
They often don't, though. That's the problem, and it's been borne out throughout the history of Linux.
Intel took over the maintenance of the ether pro 100 driver. What happened? Transceiver lock-ups when the card got busy. Adaptec took over maintenance of the AIC7xxx drivers for Linux. What happened? CRASH. And the proprietary NVidia drivers are widely known to crash systems... even with kernels that they used to develop and test the driver on. Granted, the XFree/Xorg drivers lack the performance and some of the features of the proprietary driver, but they also don't crash my system. AFAIK, the same is true of the other OSS drivers, including the DRI ones.
The only reason the OSS NVidia driver isn't better than the proprietary one is that the vendor won't release the specs to code the thing.
Derek Martin wrote:
On Tue, Nov 01, 2005 at 01:21:15PM -0600, Les Mikesell wrote:
I sort of like the idea of having users take the extra efforts to get the binary only software installed so they at least recognize that there is a distinction.
Yes, especially if they notice that the distinction is that the people who build the hardware do a better job of writing the drivers.
They often don't, though. That's the problem, and it's been borne out throughout the history of Linux.
Intel took over the maintenance of the ether pro 100 driver. What happened? Transceiver lock-ups when the card got busy. Adaptec took over maintenance of the AIC7xxx drivers for Linux. What happened? CRASH. And the proprietary NVidia drivers are widely known to crash systems... even with kernels that they used to develop and test the driver on. Granted, the XFree/Xorg drivers lack the performance and some of the features of the proprietary driver, but they also don't crash my system. AFAIK, the same is true of the other OSS drivers, including the DRI ones.
The only reason the OSS NVidia driver isn't better than the proprietary one is that the vendor won't release the specs to code the thing.
I don't have any crashes with the nVidia drivers. My uptimes are based on kernel update releases plus a few weeks (I am slow to reboot).
I do agree that if more people have access to the code, things should only get better.
On Wed, Nov 02, 2005 at 09:43:44AM -0700, Robin Laing wrote:
I don't have any crashes with the nVidia drivers. My uptimes are based on kernel update releases plus a few weeks (I am slow to reboot).
That's nice to hear. ;-) But just because one person does not experience problems, or 10 people, or 10,000 people, does not mean the thing is not broken. Lots of people DO have crashes.
I have a driver that works for me, mostly. Occasionally when I boot, the display locks up -- the machine doesn't crash, but I still can't really use it. Often, when I shut the machine down, the display changes to something that reminds me of Yars' Revenge, and the machine fails to shut down cleanly. It has behaved this way with every version of the kernel I've installed. As for the latest version of the drivers, they just don't work for me at all. The display never initializes properly, leaving the machine (my laptop) essentially unusable.
On Wed, 2005-11-02 at 09:44, Derek Martin wrote:
On Tue, Nov 01, 2005 at 01:21:15PM -0600, Les Mikesell wrote:
I sort of like the idea of having users take the extra efforts to get the binary only software installed so they at least recognize that there is a distinction.
Yes, especially if they notice that the distinction is that the people who build the hardware do a better job of writing the drivers.
They often don't, though. That's the problem, and it's been borne out throughout the history of Linux.
Intel took over the maintenance of the ether pro 100 driver. What happened? Transceiver lock-ups when the card got busy. Adaptec took over maintenance of the AIC7xxx drivers for Linux. What happened? CRASH. And the proprietary NVidia drivers are widely known to crash systems... even with kernels that they used to develop and test the driver on. Granted, the XFree/Xorg drivers lack the performance and some of the features of the proprietary driver, but they also don't crash my system. AFAIK, the same is true of the other OSS drivers, including the DRI ones.
New code gets new bugs... If you want to try to claim that the source-available drivers have never had bugs, crashed, or been abandoned, you won't get far, since a search of this mailing list will easily disprove it. Have those bugs been fixed? Were they triggered by changes in kernel APIs?
The only reason the OSS NVidia driver isn't better than the proprietary one is that the vendor won't release the specs to code the thing.
Which of the OSS drivers have measurably better performance than a vendor-written driver on some other OS?
On Wed, 2005-11-02 at 10:44 -0500, Derek Martin wrote:
Intel took over the maintenance of the ether pro 100 driver. What happened? Transceiver lock-ups when the card got busy. Adaptec took over maintenance of the AIC7xxx drivers for Linux. What happened? CRASH. And the proprietary NVidia drivers are widely known to crash systems... even with kernels that they used to develop and test the driver on.
Makes you wonder if that's because they're used to systems where users are accustomed to crashes and don't complain about them, because the users *mistakenly* think that's normal, so the vendors see no need to produce better code.
e.g. Windows.
News flash. Windows doesn't crash.
On Sun, 2005-11-06 at 17:23, David Abbott wrote:
News flash. Windows doesn't crash.
I'm not sure why this is posted to the Fedora list, but you've obviously never installed Windows yourself on unsupported hardware. Try even the simplest thing, like a SCSI controller that isn't included in the base install, on a server that doesn't have a floppy drive.
It's not an insult. I would prefer to use Fedora; it just seems a bit lame to keep dropping in how Windows crashes. It's like a flashback to the '80s or something. I haven't had to install a SCSI controller, and from what you have said I won't try. What Windows does or doesn't do shouldn't be an issue. Who cares? This is the Fedora list; that is why I posted my comment.
On Mon, Nov 07, 2005 at 10:19:35AM +1030, David Abbott wrote:
It's not an insult. I would prefer to use Fedora; it just seems a bit lame to keep dropping in how Windows crashes. It's like a flashback to the '80s or something. I haven't had to install a SCSI controller, and from what you have said I won't try. What Windows does or doesn't do shouldn't be an issue. Who cares? This is the Fedora list; that is why I posted my comment.
I agree with the sentiment that Windows behavior is irrelevant, but mine crashed yesterday, and I did not have to install a SCSI controller, which I have done before in Windows.
Les Mikesell wrote:
On Sun, 2005-11-06 at 17:23, David Abbott wrote:
News flash. Windows doesn't crash.
I'm not sure why this is posted to the Fedora list, but you've obviously never installed Windows yourself on unsupported hardware. Try even the simplest thing, like a SCSI controller that isn't included in the base install, on a server that doesn't have a floppy drive.
My new Dell that came with Windows XP Pro used to crash two or three times a day. It was a pain. My FC4 at home has crashed once in two months.
On Mon, 2005-11-07 at 09:53 +1030, David Abbott wrote:
News flash. Windows doesn't crash.
Yeah right...
Oh, that's right, it doesn't crash anymore, now it reboots. :-p
Anyway, that's a load of waffle. Windows does crash. If it can't handle something, it crashes, rather than just aborting *that* thing.
All computers crash, just some do it better than others. ;-\
---- For users of Fedora Core releases fedora-list@redhat.com wrote:
On Tue, 2005-11-01 at 13:21 -0600, Les Mikesell wrote:
On Tue, 2005-11-01 at 11:53, Craig White wrote:
I like open source, but it should be a matter of choice whether you give your work away or not. We need a way to let them keep that freedom of choice while still being able to use their product. They are going to great effort trying to give away free binaries, yet the OS distribution continues to make it difficult for them.
I thought that commitment to open source and not distributing software with restrictive licensing was a virtue.
It's a religion. Attempting to force others to give away their work doesn't agree with mine. Choosing to give away your work is fine, but if it isn't your choice it can't be much of a virtue. You can pick your own religion, but if you are going to justify it to others, pick some real examples and follow them through.
I don't think anyone is forcing anybody to do anything. Open source and non-restrictive licensing seem to offer a pretty large opening for those who wish to be invited to the dance.
That puts pressure on those who want to come to the dance to dress according to the rules.
People pushing their religion on others has often caused problems... The big problem here is that the GPL concept makes it next to impossible to fairly spread the development cost of something new over the appropriate set of users.
Assuming, of course, that the body of work that is already open source, and that got you as far as it has, doesn't already have a value far in excess of the development cost of something new (as you put it), I would agree with you; but of course, I am not willing to assume that. I suspect that people who contribute to an open source (GPL-type license) project assume a quid pro quo of others doing the same on other projects, but of course there is no guarantee.
Where you are discussing a corporate product whose owners are unwilling to release the source under a reasonably non-restrictive license, which is every bit their right (if not their corporate responsibility to their shareholders), they can of course make their code available in binary form to be installed after the Linux distribution is installed. That's hardly a problem, except that they also accept the burden of making it happen, whereas the burden shifts to the distribution developers if it is released with source under a non-restrictive license. The choice, of course, is theirs to make.
One would surmise from your comments that your complaint is with the restrictions of the GPL license itself, and I'm thinking that this is hardly the place to debate that.
I don't want to diss Adobe/Real Networks/nVidia/ATI/Sun et al. They have every right to hold on to their source and distribute only binaries, free of charge, as they wish; they just don't get included with source-only distributions.
Source makes sense for things of general interest where a lot of people will work to improve it. Device drivers should be written once by someone who understands the hardware and never touched again.
Until you have a device that is dropped from the distribution but still in the generic kernel - I know this first hand...I have a Perc 2/DC in my server running on CentOS 4
If you poke around, I think you'll see lots of examples of source-available drivers that were done by one, and only one person. In the unfortunate case of that person's demise or change of interests they were abandoned or languished a long time before anyone else picked them up.
and if the source is available, at least someone can pick it up. What's the point?
Craig
-- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean.
-- fedora-list mailing list fedora-list@redhat.com To unsubscribe: https://www.redhat.com/mailman/listinfo/fedora-list
Your comments about forcing people to give away their work indicate that you don't understand the concepts and philosophy of open source development and open source business models. Poke around on the web for examples of open source business models and, if you can, pick up a copy of the book "The Cathedral and the Bazaar". This may help...
On Tue, 2005-11-01 at 15:42 -0500, Kevin Kempter wrote:
Your comments about forcing people to give away their work indicate that you don't understand the concepts and philosophy of open source development and open source business models. Poke around on the web for examples of open source business models and, if you can, pick up a copy of the book "The Cathedral and the Bazaar". This may help...
---- I have indeed read it and I don't see how that applies to my comments.
Craig
---- For users of Fedora Core releases fedora-list@redhat.com wrote:
On Tue, 2005-11-01 at 15:42 -0500, Kevin Kempter wrote:
Your comments about forcing people to give away their work indicate that you don't understand the concepts and philosophy of open source development and open source business models. Poke around on the web for examples of open source business models and, if you can, pick up a copy of the book "The Cathedral and the Bazaar". This may help...
I have indeed read it and I don't see how that applies to my comments.
Craig
The point is you don't get to "own" the source code - ever. The source code is owned by the community. This is the basis for open source software. No one ever gets to move ownership from the community to their own domain. This allows the community to make great strides in software development and stability. It also means that any commercial venture must find revenue streams outside of the "rent my binaries" model, so the cash-for-services model seems the most realistic.
If a company could take open source code and declare that it's now closed source due to the most recent additions the company made, then the entire model would break down, since the community loses any incentive to write better source code if that same code could at any time be declared closed source based on work the community started. Which seems reasonable to me.
Just my $.02...
On Tue, 2005-11-01 at 16:06 -0500, Kevin Kempter wrote:
---- For users of Fedora Core releases fedora-list@redhat.com wrote:
On Tue, 2005-11-01 at 15:42 -0500, Kevin Kempter wrote:
Your comments about forcing people to give away their work indicate that you don't understand the concepts and philosophy of open source development and open source business models. Poke around on the web for examples of open source business models and, if you can, pick up a copy of the book "The Cathedral and the Bazaar". This may help...
I have indeed read it and I don't see how that applies to my comments.
Craig
The point is you don't get to "own" the source code - ever. The source code is owned by the community. This is the basis for open source software. No one ever gets to move ownership from the community to their own domain. This allows the community to make great strides in software development and stability. It also means that any commercial venture must find revenue streams outside of the "rent my binaries" model, so the cash-for-services model seems the most realistic.
---- I'm not sure that I'd agree with that, either semantically or in realistic terms. I gather that you are speaking of the GPL license and being very imprecise. There are other open source licenses. Companies can release under dual licenses, so your use of the term "owned" for source code is imprecise; there are such things as copyrights.
I have vaguely made reference to things such as "you own it" in terms of open source, in that you can change it to make it fit your scenario, but that is somewhat flippant, because copyright and licensing are what truly represent ownership. Licenses such as the GPL define the terms of usage, participation and continuity. ----
If a company could take open source code and declare that it's now closed source due to the most recent additions the company made, then the entire model would break down, since the community loses any incentive to write better source code if that same code could at any time be declared closed source based on work the community started. Which seems reasonable to me.
---- That, of course, is your opinion, and that is not always how it's done. There are open source releases that some projects have asked contributors to sign before putting their changes into the tree. ----
Just my $.02...
---- I still don't see what any of this has to do with my previous comments, or how it relates to this thread.
My $.02
Craig