Ideas for separation of "base system"

Postby digthemdeep » 2008-08-05 01:02

Here is a suggestion for Debian development, if you'll forgive my audacity:

I love the stability and reliability of Debian's stable branch, but there is a well-known trade-off: age. For me the oldness is mostly fine, but there are a handful of applications I feel I must upgrade manually (OO.o, Sauerbraten, Ardour...).

Perhaps Debian could borrow conceptually from FreeBSD in this regard, by separating the "base system" (kernel plus userland) from the "applications". Ideally the "applications" would get cutting-edge updates, while the "base system" would stick to the stable-branch policy, receiving only security and bug-fix updates, thereby retaining (much of) the stability and security that Debian is famed for.

Stop me if I'm wrong, but implementing such a scheme might be done by changing the repo categories from "main, contrib" to "base, main, contrib" and changing the branches from "stable, testing, unstable" to "stable, stablenhalf, testing, unstable", where stablenhalf would not have a base repo category. The traditional system-wide stable config could also be preserved as an option. Stablenhalf would thus receive package updates from unstable just like testing, but they would be compiled against the stable base system. This assumes that most packages retain compile-time compatibility going back at least as far as the latest stable base, but it's my impression that this would hold the vast majority of the time.
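To make the idea concrete, here is a purely illustrative sources.list sketch; the "base" category and the "stablenhalf" branch are hypothetical names from this proposal, not anything apt actually provides:

```
# Hypothetical /etc/apt/sources.list under the proposed scheme.
# Base system: frozen at stable, security and bug fixes only.
deb http://ftp.debian.org/debian stable base

# Applications: tracking the proposed "stablenhalf" branch,
# built against the stable base system.
deb http://ftp.debian.org/debian stablenhalf main contrib
```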

Another way to implement this might be to increase the granularity of apt, such that different update policies can be applied to different package categories. This seems non-ideal to me, however, because it radically increases the number of possible end-user configurations, tends towards package incompatibility, and the resulting bugs may become unmanageable.
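For what it's worth, apt's pinning mechanism already allows per-package policies along these lines. A minimal sketch, assuming a system tracking stable with one application (openoffice.org, chosen purely as an example) allowed to follow testing:

```
# /etc/apt/preferences -- per-package update policy via apt pinning.
# Keep everything on stable by default...
Package: *
Pin: release a=stable
Pin-Priority: 900

# ...but let a selected application track testing instead.
Package: openoffice.org
Pin: release a=testing
Pin-Priority: 950
```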

It looks like this mindset has already been adopted for hardware drivers, which I am excited to see, but in my humble opinion it would benefit many (the majority?) of users to have the same done for applications.

Thanks for your consideration!
Chris

Postby rickh » 2008-08-05 01:40

That's backports.org you're talking about.
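For reference, enabling backports in that era looked roughly like the following (openoffice.org is just an example package):

```
# Addition to /etc/apt/sources.list for Etch backports:
deb http://www.backports.org/debian etch-backports main contrib non-free

# Backported packages are then installed explicitly, e.g.:
#   apt-get -t etch-backports install openoffice.org
```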
Debian-Lenny/Sid 32/64
Desktop: Generic Core 2 Duo, EVGA 680i, Nvidia
Laptop: Generic Intel SIS/AC97

Postby industrialpunk » 2008-08-05 04:39

You can also mix releases and only pull in the updated software you want.
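A minimal sketch of such a mixed setup, assuming stable is kept as the default release and ardour stands in as an arbitrary package you want newer:

```
# /etc/apt/sources.list: make both stable and testing available.
deb http://ftp.debian.org/debian stable main
deb http://ftp.debian.org/debian testing main

# /etc/apt/apt.conf: stay on stable unless explicitly asked otherwise.
APT::Default-Release "stable";

# Pull a single newer package from testing on demand:
#   apt-get -t testing install ardour
```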
-Josh Willingham

Postby Pick2 » 2008-08-05 13:42

Don't get me wrong, it's a great idea...
But you are trying to spoon-feed the wrong people. Try a fork instead! :lol:

Postby saulgoode » 2008-08-05 14:51

I suspect the greatest problem would be in defining what a "base system" consists of. Should it include all the X and graphics libraries (GTK, Qt, Cairo, Pango, etc.)? If not, then you are left with pretty much just the core basics, which don't typically change rapidly enough to be a problem.

If those libraries are to be "locked in" by a distro, then you will find that upstream projects basically ignore that distro. Developers aren't going to forgo a library's features just because some distro has decided not to upgrade to a newer version of that library.

If you decide to backport such libraries, you are basically replicating the efforts of the distro's testing branch within its stable branch. If you have the manpower then this would be fine; but it seems to me that taking away development efforts from Testing to maintain Stable would tend to increase the bugginess of both Stable and Testing.

Also, keep in mind that with Unix-type systems the distinction between an application and a library can be very thin. Programs such as FFmpeg and MPlayer (not to mention all the GNU tools) are quite commonly employed as "library subroutines" within other programs.
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian Kernighan

Postby ciol » 2008-08-06 18:21

I really think the developers should change something.
For instance, instead of backports.org being an additional project, it could be -stable.
Not everything needs to be that stable. Who *really* wants stability? And who can afford Red Hat?

saulgoode wrote:If those libraries are to be "locked in" by a distro, then you will find upstream projects will basically ignore that distro.


That's exactly what happens with -stable.
Yes, I think you can freeze only the libraries. When a piece of software can no longer be compiled against your old libraries, then you begin to maintain and freeze that software exactly as Debian does in Etch or Sarge.

saulgoode wrote:I suspect the greatest problem would be in defining what a "base system" consists of. Should it include all the X and graphics libraries (GTK, Qt, Cairo, Pango, etc.)?


What is an operating system? An OS is kernel + libc + X + a window/desktop manager.
I don't know what Debian is. Debian is the kernel just as much as it is a little game like Tetris.
If Debian had focused on fewer packages, maybe the OpenSSL problem would not have happened.

Postby FolkTheory » 2008-08-06 18:48

Dude, you don't even know what you're talking about. You mention 50 different things (wtf, Tetris?), then your solution to all security problems is to have fewer programs... well, no shit, Sherlock. If we had no OpenSSL we wouldn't have had an OpenSSL vulnerability, but guess what? We'd have no users either!

Then you say nobody wants stability... where the hell did you get this idea? The people who run Stable are very much interested in stability!

Postby ciol » 2008-08-06 19:04

FolkTheory wrote:If we had no OpenSSL we wouldn't have had an OpenSSL vulnerability


I did not say that. What I said was: if you only backport security fixes for a base system (and OpenSSL can be part of the base system), you have more developers checking fewer packages.
For the non-essential applications, you can upgrade to the latest upstream version (when possible) instead of backporting.

FolkTheory wrote:Then you say nobody wants stability... where the hell did you get this idea? The people who run Stable are very much interested in stability!


I think Debian does -stable but does not know who its target is.
The people who need stability (stability, for Debian, meaning that software doesn't change its behavior) are server administrators for very critical sites, i.e. very few people; not everyone administrates the NASA servers.
I don't think Apache 2.2.9 is less stable than Debian's Apache 2.2.3-4+etch5.

I think it doesn't matter much if non-essential applications crash during an update, because:
1) The rest of your system still runs.
2) You know where the problem is (since the libraries stay frozen).
3) You can simply downgrade.
4) It will be upstream's fault. Users can understand that.
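Point 3 relies on apt's ability to install a specific version explicitly; a sketch of how that works (the package name fits the era, but the version strings are made up for illustration):

```
# List the versions apt knows about:
#   apt-cache policy iceweasel

# Downgrade by requesting an explicit older version:
#   apt-get install iceweasel=2.0.0.3-1
```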

Postby BioTube » 2008-08-06 20:03

So you think we should abandon a TRIED AND TRUE development model to ape BSD? Most of the base system doesn't change very often (being recompiled against newer libraries is probably the biggest change some of it has gotten in years), and having a frozen target lets the security team handle the myriad packages well enough.
Ludwig von Mises wrote:The elite should be supreme by virtue of persuasion, not by the assistance of firing squads.

Postby ciol » 2008-08-06 20:55

Just think about it.

Postby BioTube » 2008-08-07 04:02

Debian's model's worked excellently. If it ain't broke, don't fix it.

Postby saulgoode » 2008-08-07 04:50

ciol wrote:I think it's not a matter if non-essential applications crash during an update, because:
1) All your system still runs.
2) You know where the problem is (since the libraries stay frozen).
3) You can simply downgrade.
4) It will be the upstream fault. Users can understand that.


1) The hardest problems (to find and to fix) are not that an application "crashes"; they are that an application runs "wrong". If a project asserts that you should use version 2.8 of a library, that typically means that version 2.10 hasn't been tested, that it has been tested and something failed, or that there is a known conflict in the API or behavior of the library which has not yet been addressed.

2) Therein lies the rub. You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries. If you upgrade the libraries that Apache uses, you introduce problems for other programs which used the older library versions. If you don't upgrade the libraries, you can't upgrade Apache.

4) What would be upstream's fault? The fact that they depend on newer libraries than a distro provides? Or that a distro provides newer libraries than the upstream specifies? Either way, to see it as upstream's fault is to suggest that it is the responsibility of upstream projects to follow the dictates of a distribution (which one?).

Postby ciol » 2008-08-07 06:43

BioTube wrote:Debian's model's worked excellently. If it ain't broke, don't fix it.


I don't think it works excellently. A lot of people use -testing because -stable is too obsolete, but they should not be using -testing either.
Can you explain to me why all the window managers are frozen in -stable? I don't think an administrator needs, e.g., ion in -stable.

saulgoode wrote:If a project asserts that you should use version 2.8 of a library, that typically means that version 2.10 hasn't been tested, that it has been tested and something failed, or that there is a known conflict in the API or behavior of the library which has not yet been addressed.


2.8 is compatible with 2.10. If not, you can change the soname or send a patch to the library developers.

saulgoode wrote:You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries.


But you can do what I said: upgrade from 2.2.x to 2.2.9 without upgrading the libraries.

saulgoode wrote:What would be upstream's fault?


You did not understand.
With a separated base system, if a non-essential application like Firefox has a new bug after an upgrade, it is very unlikely to be Debian's fault.
By contrast, in e.g. -testing, since you upgrade more packages at a time, it's more difficult to find the problem.

Postby saulgoode » 2008-08-07 07:43

ciol wrote:
saulgoode wrote:You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries.

But you can do what I said: upgrade from 2.2.x to 2.2.9 without upgrading the libraries.

I apologize for misreading your version numbers; but regardless, if the changes between versions are security or bug fixes, they are backported to Stable releases. If they are changes in functionality and insignificant, then why backport?

If they are changes in functionality and significant, they will require testing before being backported. This testing takes away developer resources from the testing being done in the Testing branch and will cause delay in the release of the next stable.

ciol wrote:You did not understand.
In a separated base system, if a non-essential application like Firefox has a new bug after an upgrade, it will be very unlikely the Debian fault.

And in a non-essential application like GIMP, upgrading to a newer version may require upgrading about 50 different libraries. Are the libraries upon which GIMP depends to be considered essential, in which case what you propose can't be done; or non-essential, in which case all other programs which use any of those same libraries must be re-tested?

ciol wrote:On the contrary, in e.g -testing, since you upgrade more packages at a time, it's more difficult to find the problem.

And the good people who contribute to Testing know what to expect. They are familiar with applications they are testing and often know how to use debugging tools to provide useful feedback to the developers. At a minimum, they are familiar with provided mechanisms for reporting the problems and have an understanding of the type of information developers require (and if such is not the case, they will quickly be educated about it).

Postby ciol » 2008-08-07 07:59

saulgoode wrote:I apologize for misreading your version numbers; but regardless, if the changes between versions are security or bug fixes, they are backported to Stable releases. If they are changes in functionality and insignificant, then why backport?

If they are changes in functionality and significant, they will require testing before being backported. This testing takes away developer resources from the testing being done in the Testing branch and will cause delay in the release of the next stable.


I think the folks at the Apache Foundation are strong enough to be trusted. The more you trust upstream, the less work you have as a distribution maintainer.

saulgoode wrote:And in a non-essential application like GIMP, upgrading to a newer version may require upgrading about 50 different libraries.


For a bump from The Gimp 2.2 to 2.4, maybe. But look: Debian has The Gimp 2.2.13 in Etch, while the latest release in the 2.2 branch is 2.2.17. I think you can safely and easily upgrade in this case. That's all I'm trying to say.
If The Gimp released 2.2.17, there were reasons for it. I don't see why we should ignore them.

saulgoode wrote:Are the libraries upon which GIMP depends to be considered essential, in which case what you propose can't be done; or non-essential, in which case all other programs which use any of those same libraries must be re-tested?


It's hard to say. That's something that can be discussed for each library.
