Ideas for separation of "base system"

User discussion about Debian Development, Debian Project News and Announcements. Not for support questions.
digthemdeep
Posts: 4
Joined: 2008-08-04 23:02

Ideas for separation of "base system"

#1 Post by digthemdeep »

Here is a suggestion for Debian development, if you'll forgive my audacity:

I love the stability and reliability of Debian's stable branch, but there is a well-known trade-off: the packages age. For me, that is mostly fine, but there are a handful of applications I feel I must upgrade manually (OO.o, Sauerbraten, Ardour...).

Perhaps Debian could borrow conceptually from FreeBSD here by separating the "base system" (kernel plus userland) from the "applications". Ideally the applications would get cutting-edge updates, while the base system stuck to the stable-branch policy, receiving only security and bug fixes, thereby retaining (much of) the stability and security Debian is famed for.

Stop me if I'm wrong, but implementing such a scheme might be done by changing the repo categories from "main, contrib" to "base, main, contrib" and changing the branches from "stable, testing, unstable" to "stable, stablenhalf, testing, unstable", where stablenhalf would not have a base repo category. The traditional system-wide stable config could also be preserved as an option. This would mean that stablenhalf would receive package updates from unstable just like testing, but they would be compiled against the stable base system. This assumes that most packages can be compiled against a base system as old as the latest stable, but my impression is that this would hold the vast majority of the time.
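
To make this concrete, a sources.list under such a scheme might look something like this (the category and branch names here are purely hypothetical, of course):

Code: Select all

# hypothetical: the base system tracks stable...
deb http://ftp.debian.org/debian stable base
# ...while applications come from the faster-moving branch:
deb http://ftp.debian.org/debian stablenhalf main contrib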

Another way to implement this might be to increase the granularity of apt, so that different update policies can be applied to different package categories. This seems non-ideal to me, however: it radically increases the number of possible end-user configurations, tends toward package incompatibility, and the resulting bugs may become unmanageable.
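
For illustration, apt's pinning can already approximate this on a per-package basis via /etc/apt/preferences (ardour below is only an example):

Code: Select all

Explanation: track stable by default
Package: *
Pin: release a=stable
Pin-Priority: 900

Explanation: but follow testing for one named application (example only)
Package: ardour
Pin: release a=testing
Pin-Priority: 950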

It looks like this mindset has already been adopted for hardware drivers, which I am excited to see, but it's my humble opinion that it would benefit many (the majority?) of users to have the same done for applications as well.

Thanks for your consideration!
Chris

rickh
Posts: 3434
Joined: 2006-06-29 02:13
Location: Albuquerque, NM USA

#2 Post by rickh »

That's backports.org you're talking about.
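
If memory serves, the relevant lines are roughly these (etch-era suite name; openoffice.org is just an example package):

Code: Select all

# /etc/apt/sources.list -- add the backports.org repository:
deb http://www.backports.org/debian etch-backports main contrib non-free

# backports are pinned low by default, so nothing is pulled in unless you ask:
apt-get update
apt-get -t etch-backports install openoffice.org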
Debian-Lenny/Sid 32/64
Desktop: Generic Core 2 Duo, EVGA 680i, Nvidia
Laptop: Generic Intel SIS/AC97

industrialpunk
Posts: 731
Joined: 2007-03-07 22:30
Location: San Diego, CA, USA

#3 Post by industrialpunk »

You can also mix releases and only pull in the updated software you want.
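
Something along these lines, for example (a sketch; ardour is just an example package):

Code: Select all

# /etc/apt/sources.list -- list both releases:
deb http://ftp.debian.org/debian stable main contrib
deb http://ftp.debian.org/debian testing main contrib

# /etc/apt/apt.conf -- keep stable as the default:
APT::Default-Release "stable";

# then cherry-pick individual packages from testing:
apt-get update
apt-get -t testing install ardour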
-Josh Willingham

Pick2
Posts: 790
Joined: 2007-07-07 13:31
Location: Decatur Il

#4 Post by Pick2 »

Don't get me wrong, it's a great idea...
But you are trying to spoon-feed the wrong people; try a Fork instead! :lol:

saulgoode
Posts: 1445
Joined: 2007-10-22 11:34
Been thanked: 4 times

Re: Ideas for separation of "base system"

#5 Post by saulgoode »

I suspect the greatest problem would be in defining what a "base system" consists of. Should it include all the X and graphics libraries (GTK, Qt, Cairo, Pango, etc.)? If not, then you are left with pretty much the core basics, which don't typically change rapidly enough to be a problem.

If those libraries are to be "locked in" by a distro, then you will find upstream projects will basically ignore that distro. Developers aren't going to avoid availing themselves of a library's features just because some distro has decided not to upgrade to a newer version of that library.

If you decide to backport such libraries, you are basically replicating the efforts of the distro's testing branch within its stable branch. If you have the manpower then this would be fine; but it seems to me that taking away development efforts from Testing to maintain Stable would tend to increase the bugginess of both Stable and Testing.

Also, keep in mind that with Unix-type systems the distinction between an application and a library can be very thin. Programs such as FFmpeg and MPlayer (not to mention all the GNU tools) are quite commonly employed as "library subroutines" within other programs.
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian Kernighan

ciol
Posts: 28
Joined: 2006-10-27 20:24

#6 Post by ciol »

I really think the developers should change something.
For instance, instead of backports.org being an additional project, it could be -stable.
Not everything needs to be that stable. Who *really* wants stability? And who can afford Red Hat?
saulgoode wrote:If those libraries are to be "locked in" by a distro, then you will find upstream projects will basically ignore that distro.
That's exactly what happens with -stable.
Yes, I think you can freeze only the libraries. When a piece of software can no longer be compiled against your old libraries, you can then begin to maintain and freeze that software exactly as Debian does in Etch or Sarge.
saulgoode wrote:I suspect the greatest problem would be in defining what a "base system" consists of. Should it include all the X and graphics libraries (GTK, Qt, Cairo, Pango, etc.)?
What is an operating system? An OS is kernel + libc + X + a window/desktop manager.
I don't know what Debian is. Debian is the kernel as much as it is a little game like Tetris.
If Debian had focused on fewer packages, maybe the OpenSSL problem would not have happened.

FolkTheory
Posts: 284
Joined: 2008-05-18 23:02

#7 Post by FolkTheory »

Dude, you don't even know what you're talking about. You mention 50 different things (wtf, Tetris?), and then your solution to all security problems is to have fewer programs... well, no crap, Sherlock. If we had no OpenSSL we wouldn't have had an OpenSSL vulnerability, but guess what? We'd have no users either!

Then you say nobody wants stability... where the hell did you get this idea? The people who run stable are very much interested in stability!

ciol
Posts: 28
Joined: 2006-10-27 20:24

#8 Post by ciol »

FolkTheory wrote:If we had no OpenSSL we wouldn't have had an OpenSSL vulnerability
I did not say that. What I said was: if you only backport security fixes for a base system (and OpenSSL could be in the base system), you have more developers checking fewer packages.
For the non-essential applications, you can upgrade to the latest upstream version (when possible) instead of backporting.
FolkTheory wrote:Then you say nobody wants stability... where the hell did you get this idea? The people who run stable are very much interested in stability!
I think Debian does -stable but does not know who its target is.
The people who need stability (stability, for Debian, means the software doesn't change its behavior) are server administrators of very critical sites (i.e. very few people; not everyone administers the NASA servers).
I don't think Apache 2.2.9 is less stable than Debian's Apache 2.2.3-4+etch5.

I don't think it matters much if non-essential applications crash during an update, because:
1) The rest of your system still runs.
2) You know where the problem is (since the libraries stay frozen).
3) You can simply downgrade (see the sketch below).
4) It will be upstream's fault, and users can understand that.
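
For instance (the version string here is only an example):

Code: Select all

# see which versions apt knows about, then step back to the older one:
apt-cache policy gimp
apt-get install gimp=2.2.13-1etch4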

BioTube
Posts: 7520
Joined: 2007-06-01 04:34

#9 Post by BioTube »

So you think we should abandon a TRIED AND TRUE development model to ape BSD? Most of the base system doesn't change very often (being recompiled against newer libraries is probably the biggest change some of those packages have seen in years), and having a frozen target lets the security team handle the many thousands of packages well enough.
Ludwig von Mises wrote:The elite should be supreme by virtue of persuasion, not by the assistance of firing squads.

ciol
Posts: 28
Joined: 2006-10-27 20:24

#10 Post by ciol »

Just think about it.

BioTube
Posts: 7520
Joined: 2007-06-01 04:34

#11 Post by BioTube »

Debian's model's worked excellently. If it ain't broke, don't fix it.
Ludwig von Mises wrote:The elite should be supreme by virtue of persuasion, not by the assistance of firing squads.

saulgoode
Posts: 1445
Joined: 2007-10-22 11:34
Been thanked: 4 times

#12 Post by saulgoode »

ciol wrote:I don't think it matters much if non-essential applications crash during an update, because:
1) The rest of your system still runs.
2) You know where the problem is (since the libraries stay frozen).
3) You can simply downgrade.
4) It will be upstream's fault, and users can understand that.
1) The hardest problems (to find and to fix) are not that an application "crashes"; they are that an application runs "wrong". If a project asserts that you should use version 2.8 of a library, that typically means that version 2.10 hasn't been tested, that it has been tested and something failed, or that there is a known conflict in the API or behavior of the newer library which has not yet been addressed.

2) Therein lies the rub. You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries. If you upgrade the libraries that Apache uses, you introduce problems for other programs which used the older library versions. If you don't upgrade the libraries, you can't upgrade Apache.

4) What would be upstream's fault? The fact that they depend on newer libraries than a distro provides? Or that a distro provides newer libraries than upstream specifies? Either way, to see it as upstream's fault is to suggest that it is the responsibility of upstream projects to follow the dictates of a distribution (and which one?).
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian Kernighan

ciol
Posts: 28
Joined: 2006-10-27 20:24

#13 Post by ciol »

BioTube wrote:Debian's model's worked excellently. If it ain't broke, don't fix it.
I don't think it works excellently. A lot of people use -testing because -stable is too obsolete, but they should not be using -testing either.
Can you explain to me why all the window managers are frozen in -stable? I don't think an administrator needs, e.g., ion in -stable.
saulgoode wrote:If a project asserts that you should use version 2.8 of a library, that typically means that version 2.10 hasn't been tested, that it has been tested and something failed, or that there is a known conflict in the API or behavior of the newer library which has not yet been addressed.
2.8 is compatible with 2.10. If not, you can change the soname or send a patch to the library developers.
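
The soname is what the dynamic linker actually checks; you can inspect it, for example, with:

Code: Select all

# an incompatible ABI change is supposed to bump the SONAME:
objdump -p /usr/lib/libgtk-x11-2.0.so.0 | grep SONAME
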
saulgoode wrote:You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries.
But you can do what I said: upgrade from 2.2.x to 2.2.9 without upgrading the libraries.
saulgoode wrote:What would be upstream's fault?
You did not understand me.
In a separated base system, if a non-essential application like Firefox has a new bug after an upgrade, it is very unlikely to be Debian's fault.
By contrast, in e.g. -testing, since you upgrade more packages at a time, it's more difficult to find the source of a problem.

saulgoode
Posts: 1445
Joined: 2007-10-22 11:34
Been thanked: 4 times

#14 Post by saulgoode »

ciol wrote:
saulgoode wrote:You can't upgrade Apache from 2.2 to 2.9 unless you also upgrade the libraries.
But you can do what I said: upgrade from 2.2.x to 2.2.9 without upgrading the libraries.
I apologize for misreading your version numbers; but regardless, if the changes between versions are security or bug fixes, they are backported to Stable releases. If they are changes in functionality and insignificant, then why backport?

If they are changes in functionality and significant, they will require testing before being backported. This testing takes developer resources away from the Testing branch and will delay the release of the next stable.
ciol wrote:You did not understand me.
In a separated base system, if a non-essential application like Firefox has a new bug after an upgrade, it is very unlikely to be Debian's fault.
And in a non-essential application like GIMP, upgrading to a newer version may require upgrading about 50 different libraries. Are the libraries upon which GIMP depends to be considered essential, in which case what you propose can't be done; or non-essential, in which case all other programs which use any of those same libraries must be re-tested?
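
You can get a feel for the scale yourself (gimp here being the Etch package):

Code: Select all

# package-level dependencies as apt sees them:
apt-cache depends gimp
# shared libraries the binary actually loads:
ldd /usr/bin/gimp | wc -l
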
ciol wrote:By contrast, in e.g. -testing, since you upgrade more packages at a time, it's more difficult to find the source of a problem.
And the good people who contribute to Testing know what to expect. They are familiar with the applications they are testing and often know how to use debugging tools to provide useful feedback to the developers. At a minimum, they are familiar with the provided mechanisms for reporting problems and understand the kind of information developers require (and if that is not the case, they are quickly educated about it).
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian Kernighan

ciol
Posts: 28
Joined: 2006-10-27 20:24

#15 Post by ciol »

saulgoode wrote:I apologize for misreading your version numbers; but regardless, if the changes between versions are security or bug fixes, they are backported to Stable releases. If they are changes in functionality and insignificant, then why backport?

If they are changes in functionality and significant, they will require testing before being backported. This testing takes developer resources away from the Testing branch and will delay the release of the next stable.
I think the folks at the Apache Foundation are strong enough to be trusted. The more you trust upstream, the less work you have as a distribution maintainer.
saulgoode wrote:And in a non-essential application like GIMP, upgrading to a newer version may require upgrading about 50 different libraries.
For a bump from GIMP 2.2 to 2.4, maybe. But look: Debian has GIMP 2.2.13 in Etch, while the latest release in the 2.2 branch is 2.2.17. I think you can safely and easily upgrade in that case. That's all I'm trying to say.
If the GIMP project released 2.2.17, there are reasons for it. I don't see why we should ignore them.
saulgoode wrote:Are the libraries upon which GIMP depends to be considered essential, in which case what you propose can't be done; or non-essential, in which case all other programs which use any of those same libraries must be re-tested?
It's hard to say. It's something that can be discussed for each library.

AdrianTM
Posts: 2499
Joined: 2004-09-19 01:08

#16 Post by AdrianTM »

Have you considered that maybe Debian Stable is not for you?

Instead of trying to change something that's not for you, try to find something that is. Among 400 distros I'm sure you'll find something appropriate; but if you still don't find something you fully like and you still like Debian, then maybe you should give them the benefit of the doubt. Maybe, just maybe, they do things right.
Ubuntu hate is a mental derangement.

ciol
Posts: 28
Joined: 2006-10-27 20:24

#17 Post by ciol »

I thought one of Debian's priorities was its users.
If not, they should remove "The Universal Operating System" from their main website.

AdrianTM
Posts: 2499
Joined: 2004-09-19 01:08

#18 Post by AdrianTM »

ciol wrote:I thought one of Debian's priorities was its users.
If not, they should remove "The Universal Operating System" from their main website.
Debian's main priority is its users, not you specifically. But whatever; if you think you speak for most of the users, I'll let you believe that...

But again, if you don't like how Debian does things, you should probably use something else. Why not BSD, since you seem to appreciate how they do stuff?
Ubuntu hate is a mental derangement.

saulgoode
Posts: 1445
Joined: 2007-10-22 11:34
Been thanked: 4 times

#19 Post by saulgoode »

ciol wrote:For a bump from GIMP 2.2 to 2.4, maybe. But look: Debian has GIMP 2.2.13 in Etch, while the latest release in the 2.2 branch is 2.2.17. I think you can safely and easily upgrade in that case. That's all I'm trying to say.
If the GIMP project released 2.2.17, there are reasons for it. I don't see why we should ignore them.
I agree with you here. It would seem reasonable to provide a package updating GIMP to version 2.2.17. But this is permitted under the existing updates policy, and the failure is most likely an oversight on the part of the package maintainer.
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian Kernighan

ciol
Posts: 28
Joined: 2006-10-27 20:24

#20 Post by ciol »

@AdrianTM:
I was a Debian user; digthemdeep and tuomov are Debian users. I don't think we are less important than others.
saulgoode wrote:It would seem reasonable to provide a package updating GIMP to version 2.2.17. But this is permitted under the existing updates policy
It's not.
