Draft Product Description for Fedora Workstation
misc at zarb.org
Wed Nov 6 18:10:40 UTC 2013
On Tue, Nov 05, 2013 at 01:23:01PM -0800, Adam Williamson wrote:
> On Mon, 2013-11-04 at 23:50 +0100, Michael Scherer wrote:
> > > On Monday 04 November 2013 at 21:02 +0100, Reindl Harald wrote:
> > >
> > > > On 04.11.2013 20:56, drago01 wrote:
> > > > On Mon, Nov 4, 2013 at 8:49 PM, Reindl Harald <h.reindl at thelounge.net> wrote:
> > > >> that's all true but you can be pretty sure if an "app-store" with
> > > >> bundled applications exists *nobody* would package and maintain
> > > >> them as RPM -> everybody would point their finger at the app
> > > >
> > > > No, because RPM packaged apps *do* have benefits .. otherwise we
> > > > wouldn't be having this discussion.
> > > >
> > > >> if it goes in that direction, and it starts faster than anybody likes,
> > > >> you do dramatic harm to the userbase which likes the consistent
> > > >> package management and *really used* concept of shared libraries
> > > >
> > > > Again those are NOT MUTUALLY EXCLUSIVE. You can have sandboxed *and*
> > > > rpm packaged apps at the same time.
> > >
> > > the most important word in your answer is *CAN*
> > >
> > > but you will not; nobody will package any application
> > > as RPM if they are fine with the app-store, so you *could*
> > > have both, but I doubt that at the end of the day it will happen
> > If no one thinks that using an rpm brings any value, then indeed, no
> > one will do the job. Now, if someone thinks this is better for whatever
> > reason, then this someone will do the job.
> > It seems that your fear is that if people are not forced to make rpms,
> > they will not see the value of doing so, and so will not do it.
> > So if that's the problem, then the solution is to demonstrate the value
> > of packaging and rpm rather than restricting all the alternatives.
> So to me this is the nub of the debate, and it's both fantastically
> interesting and fantastically difficult to work out in advance.
> In an ideal world things would work the way Michael describes, and also,
> the stock market would behave precisely as neat theories based on
> rational actors predict, and no-one would have any difficulty solving
> the three door problem, and healthcare.gov would never have been
> launched in a state in which it could not possibly work...
> And in the real world, well, it's the real world. :)
Excuse me for trimming your nice piece of anticipation, and excuse me for not
answering in the same style, as I do not have a good one for now.
While what you describe is likely true, there is just one problem: this is not the
future. This is the present.
Currently, if you do not have the big button of your story, you still have a few
choices as an ISV or upstream (free or non-free, the problems are the same for both,
except that in the case of free software, people will complain to you about things
that you did not decide to do):
1) you distribute nothing besides a tarball with the source code, or maybe just a gem
or similar. This seems to perfectly satisfy some people, who are quite happy with the
sisyphean task that is the integration of an ever-growing pile of code, but strangely
enough, it doesn't suit some end users that well.
And the choice of the version shipped by the distribution is a choice that the
distribution makes. So far this model has served us quite well, and frankly it
satisfies most of my needs, because I have several years of experience.
But besides some end users, that's also not satisfying for some upstreams, since they
want their code to reach users fast, for various reasons: having feedback on new
features (the faster the better, if possible the day of the release), distributing
bugfixes or proposed bugfixes, or sometimes offering a supported, potentially older,
version.
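As a concrete sketch of option 1, this is roughly all the automation a tarball-only
upstream needs; "myapp" and its single Makefile are hypothetical placeholders:

```shell
# Option 1 in practice: ship a source tarball plus a checksum and
# leave all integration work to distributions and end users.
# "myapp" and its contents are invented for illustration.
mkdir -p myapp-1.0
printf 'all:\n\techo building myapp\n' > myapp-1.0/Makefile
tar czf myapp-1.0.tar.gz myapp-1.0
sha256sum myapp-1.0.tar.gz > myapp-1.0.tar.gz.sha256

# what a downloader would do to verify and inspect the release
sha256sum -c myapp-1.0.tar.gz.sha256
tar tzf myapp-1.0.tar.gz
```

Everything past this point, build flags, file layout, and the version actually
shipped, is decided by whoever unpacks it.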
For every policy decided by a distro, you can find a reason to do things differently.
In the end, it all boils down to who decides the version: the upstream, the end
users, or the distribution. And while, of course, with enough resources everybody
could choose what they want, no one here has an unlimited amount of resources.
So in the end, that's just distributions using their power to decide for others,
because the others either do not have the resources (users) or spend their resources
on making software (upstreams).
In order to take back, or give back, some of that choice to upstreams and/or end
users, people have tried various systems:
2) compile things statically.
That's ugly. And yet, it has worked well enough for distributing games on Linux for
a few years now (the earliest personal example I can find would be Unreal
Tournament), for distributing software used in industry, or even plugins. Some
upstreams do this today, despite us saying "don't do that, it kills kittens, Ulrich
Drepper said so".
2bis) use some "magical" system, like autoinstall, zeroinstall, etc. They didn't work
that well, potentially due to a chicken-and-egg problem, potentially because some
problems were left unsolved. However, since companies like Microsoft and Apple (or
Google) managed to make this work well enough, the technical issues are clearly not
that hard to solve.
3) compile on every distribution, in its native format. That's what the Open Build
Service, from SUSE, does. While I wouldn't call it a big success, it works more than
fine. Of course, the maintenance cost is high for complex packages, because
distributions prefer not to standardize on anything, so you end up with ugly spec
files.
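For readers who haven't met option 3's pain point: a minimal spec file looks roughly
like the sketch below ("myapp" and the URLs are invented for illustration). The
ugliness arrives once each target distribution needs its own %if blocks for differing
macros and dependency names:

```spec
# Minimal sketch of an RPM spec file; package name and URLs are
# hypothetical. Real cross-distro specs accumulate conditionals
# per target distribution on top of this skeleton.
Name:           myapp
Version:        1.0
Release:        1%{?dist}
Summary:        Hypothetical example application
License:        MIT
URL:            http://example.com/myapp
Source0:        http://example.com/myapp-%{version}.tar.gz
BuildRequires:  gcc

%description
Hypothetical example application.

%prep
%setup -q

%build
make %{?_smp_mflags}

%install
make install DESTDIR=%{buildroot}

%files
%{_bindir}/myapp

%changelog
* Wed Nov 06 2013 Example Packager <packager@example.com> - 1.0-1
- Initial package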
4) compile only for the distributions that matter to you, or just the one with the
most perceived market share. Currently, that basically means Ubuntu, or for
specialized products RHEL, sometimes SLES. Some people view that as a subtle form of
lock-in, and people on lesser-known distributions usually complain about it. It also
brings lots of subtle issues; PPAs are full of them: repository incompatibilities,
distribution problems, conflicts, etc.
5) offer a magic script that installs everything. Node.js does that. Salt does that.
That's a bit crappy from a security point of view, but so far the main problem is
mostly one of user education (i.e., training people not to do dumb things), because
running a shell script on an Amazon EC2 instance is not such a big deal. And if you
do not trust the script, a package from the same upstream wouldn't be any more
trustworthy.
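The mechanics of option 5 can be simulated locally; the installer below is a
hypothetical stand-in for what a vendor would serve over HTTPS (in the wild the
pattern is something like `curl -fsSL https://example.com/install.sh | sh`):

```shell
# Hypothetical installer script, standing in for a vendor-hosted one.
cat > install.sh <<'EOF'
#!/bin/sh
# pretend install step: drop a marker file in the current directory
echo installed > myapp.state
EOF

# The pattern people worry about: execute whatever comes down the pipe.
cat install.sh | sh

# The marginally safer habit to teach: fetch to a file, read it, run it.
head -n 20 install.sh   # stand-in for actually reviewing the script
sh install.sh
cat myapp.state         # -> installed
```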
So we have all of those systems, already in use, each with a different rate of
success. From there, we can see a few points:
- the whole question of who distributes, upstream or distribution, is mostly about
who decides what users see and get. Of course, as with any change, the people who
currently hold the "power" see initiatives to give that power to others as bad,
mostly due to human nature: everybody deeply believes they are doing the best job and
know what is best. No one believes they are doing a crappy job and yet invests time
in it. And of course, every group says it is the best suited for the task. But this
power play is divisive, and division doesn't benefit free software.
- posting angry mail on mailing lists and refusing alternatives didn't prevent those
alternatives from emerging, or from having moderate success.
PPAs are popular, and so is OBS. They are not perfect, but they work well enough for
people (and it seems well enough for us to replicate, despite PPAs being a time
bomb, breaking Ubuntu upgrades in various ways).
- compiling things statically is a perfectly acceptable system for most OEMs. It's
unclean, likely insecure, and it consumes memory, but since we as a community do not
offer anything better, OEMs usually choose it, and as long as our only answer does
not solve their issues, they will continue.
And despite it being problematic, most regular users do not care, OEMs do not care,
only packagers do. And no one besides packagers cares about what packagers care
about, most of the time. The resource issue will just become moot as time passes, if
it hasn't already: my laptop has 8G of RAM, which is more than enough to waste on
static binaries (the same goes for my disk, and CPU). And from a security point of
view, well, that's also a lost cause when people advocate trading security for
speed, ease of use, or anything else (like "disable selinux or apparmor, it is
useless and slow"), so not caring about the security of libraries is just another
step in that direction.
- despite the alternatives, rpm/deb packaging didn't stop being used. In fact, even
more interestingly, having both didn't prevent other packaging systems from being
created. From emerge to macports, pip, pkgng, new systems appear on a regular basis.
And people still make rpms and debs, despite them being more complex, gothic, and
cumbersome to use than the others. It was historically easier to write a gentoo
initscript than a sysv initscript, and yet people didn't move. A Debian package uses
several files in different formats, is not at all unified, is not parsable by
regular tools, and yet Debian has a gigantic community making packages.
So something has to be really, really better to disrupt the current model of rpm/deb
and centralized repositories. If none of the past attempts managed it, I cannot see
why something suddenly will.
And finally, having a sandboxing system is not the future. It is the present.
People already distribute complete VMs on the enterprise side of life.
Even in Fedora, we already have docker in rawhide, which seems quite similar to what
people speculate sandboxed applications will be:
- some kind of appstore ( aka a central repo )
- working on various distributions, by design
- you get a full OS in the bundle, also by design
- isolated from the main system using lxc and likely selinux
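A minimal Dockerfile shows the "full OS in the bundle" point; the base image and
package names here are illustrative, not a real application:

```dockerfile
# The application ships on top of a complete distribution userland,
# independent of whatever the host runs. Names are illustrative.
FROM fedora:20

# all dependencies live inside the image, not on the host system
RUN yum install -y somelib && yum clean all

# the application binary itself, added from the build context
ADD myapp /usr/bin/myapp

# at runtime the container is isolated via kernel namespaces (the
# lxc driver at the time) and, on Fedora, SELinux labeling
CMD ["/usr/bin/myapp"]
```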
Yet I haven't seen people saying "this shouldn't be offered", or anyone thinking
this is unacceptable for Fedora, or something that will bring doom to the current
model.
I too would be against having that as the primary way of deploying applications. But
I am fine with offering the possibility, and I am perfectly fine knowing that some
people prefer this, and that it solves some people's issues.