Suggestion: Differential software updates
I think software updates can be made faster by including only the affected files; the unchanged files need not be included in updates. This should reduce download time considerably.
Debian == { > 30, 000 packages }; Debian != systemd
The worst infection of all, is a false sense of security!
It is hard to get away from CLI tools.
Thanks for answering me.
I understand your point, because I practice programming as a hobby, obviously when I feel like it. But actually implementing it is easier said than done...
So, I will try to imagine a way how this [see title of topic] can be done:
1) download the package's modified files
2) rebuild the package using the downloaded files and the files on the client computer
3) when all the required packages are rebuilt, install the packages using Debian's package management system
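As a rough sketch of step 1, assuming the mirror publishes a per-file checksum manifest for each package version (the manifest format here is invented for illustration), the client could work out which files it actually needs to download:

```python
def changed_files(old_manifest, new_manifest):
    """Given {path: checksum} manifests for the installed and the new
    package version, return the paths the client needs to fetch."""
    return sorted(
        path for path, digest in new_manifest.items()
        if old_manifest.get(path) != digest  # new or modified file
    )

# Toy manifests: only the binary changed, and one man page was added.
old = {"usr/bin/tool": "aaa", "usr/share/doc/tool/README": "bbb"}
new = {"usr/bin/tool": "ccc", "usr/share/doc/tool/README": "bbb",
       "usr/share/man/man1/tool.1.gz": "ddd"}
print(changed_files(old, new))  # ['usr/bin/tool', 'usr/share/man/man1/tool.1.gz']
```

Everything the two manifests agree on is skipped, which is the whole point of the proposal.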
edbarx wrote: So, I will try to imagine a way how this [see title of topic] can be done
1) download the package's modified files
2) rebuild the package using the downloaded files and the files on the client computer
3) when all the required packages are rebuilt, install the packages using the debian's package management system

I don't see how rebuilding packages would be faster than simply installing the new ones.
Debian Sid Laptops:
AMD Athlon(tm) 64 X2 Dual-Core Processor TK-55 / 1.5G
Intel(R) Pentium(R) Dual CPU T2390 @ 1.86GHz / 3G
Ok. I don't mean "rebuild the packages" using the compilers, but using something like the command "dpkg-repack".
Alternatively [ie ignoring what I said earlier], the update can be done as follows: once the modified (and, if applicable, compiled) files are downloaded to the client computer, the outdated files can be replaced by the respective updated files. Finally, if necessary, a "dpkg-reconfigure" command can be run for those packages which need it.
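The file-replacement step of this alternative can be made safe against interruption with an atomic rename; a minimal sketch (the target path and contents are invented for the demo):

```python
import os
import tempfile

def replace_file(path, new_bytes):
    """Swap an outdated file for its updated content atomically:
    write a temp file in the same directory, then rename it over the
    old one, so a crash mid-update never leaves a half-written file."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(new_bytes)
        os.replace(tmp, path)  # atomic rename on the same filesystem
    except BaseException:
        os.unlink(tmp)
        raise

# Demo on a throwaway file:
with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "tool")
    open(target, "wb").write(b"old binary")
    replace_file(target, b"new binary")
    print(open(target, "rb").read())  # b'new binary'
```

Writing the temp file into the same directory matters: `os.replace` is only atomic within one filesystem.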
The devil is in the details.
You would have to:
1: create a system for comparing two debs and generating some kind of "patch deb" containing the modified files (not too hard)
2: integrate these patch debs into the Debian infrastructure tools, including some system to decide which patch debs to keep at any time (harder).
3: integrate these patch debs into the Debian package management tools (about as hard as 2), remembering to include a fallback system to fetch the whole deb when, say, a file that should have been there from the previous version is missing or corrupted.
4: get the Debian powers that be to accept your system (even harder).
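Step 1 of that list can be prototyped against a deb's data.tar member. The sketch below diffs two in-memory tarballs (stand-ins for the old and new package payloads) to find what a hypothetical "patch deb" would have to carry:

```python
import io
import tarfile

def make_tar(files):
    """Build an in-memory tar, standing in for a deb's data.tar member."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

def patch_members(old_tar, new_tar):
    """Names of regular files whose content differs between the two
    tars -- the payload of the hypothetical patch deb."""
    def contents(blob):
        with tarfile.open(fileobj=io.BytesIO(blob)) as tar:
            return {m.name: tar.extractfile(m).read()
                    for m in tar.getmembers() if m.isfile()}
    old, new = contents(old_tar), contents(new_tar)
    return sorted(name for name, data in new.items() if old.get(name) != data)

old = make_tar({"usr/bin/tool": b"v1", "usr/share/doc/README": b"docs"})
new = make_tar({"usr/bin/tool": b"v2", "usr/share/doc/README": b"docs"})
print(patch_members(old, new))  # ['usr/bin/tool']
```

A real tool would also have to carry permissions, ownership and deleted-file records, which is part of why the design step is harder than it looks.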
Thanks for suggesting "debdelta". Now I can understand what the difficulties are. You (plugwash) are right to say that the most difficult part is convincing people, rather than developing a package that does what I described. It is never easy to convince people. I cannot blame the Debian people for holding on to their beliefs.
I don't think there is any need for a new system... When one issues the command "aptitude upgrade", the program which does the upgrade can have a subroutine which handles the differential updates. So I am speaking of an update to an existing package, not of a complete upheaval of the system currently in use.
Regarding the extra space required on mirrors, I think no extra space is needed. The system can remain as it is. Only the server program, which delivers the Debian packages to the client computer, will need an update. The server program has to be able to deliver only the modified files to the client computer and, at the end, a file containing the details of what to do with the unchanged files. With this file, the client computer can recreate the packages from the unmodified local files and the downloaded files.
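The "file containing the details of what to do with the unchanged files" could be as simple as a per-path action list. A sketch of the client-side reconstruction, with an invented JSON manifest format:

```python
import json

def rebuild(manifest_json, local_files, downloaded):
    """Reassemble a package's file set: reuse the local copy for paths
    the manifest marks 'reuse', take the downloaded copy for 'fetch'."""
    manifest = json.loads(manifest_json)
    return {
        path: local_files[path] if action == "reuse" else downloaded[path]
        for path, action in manifest.items()
    }

manifest = '{"usr/bin/tool": "fetch", "usr/share/doc/README": "reuse"}'
local = {"usr/bin/tool": b"v1", "usr/share/doc/README": b"docs"}
fetched = {"usr/bin/tool": b"v2"}
print(rebuild(manifest, local, fetched))
# {'usr/bin/tool': b'v2', 'usr/share/doc/README': b'docs'}
```

A production version would also carry checksums so the client can verify that its "unchanged" local copies really match what the server expects.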
What have I learnt from this discussion?
At the moment, I am still a beginner in Debian GNU/Linux. Before I started to use Debian, I used to program for Windows as a hobby. Had I the same experience with Debian GNU/Linux, I would have tried to program what I am proposing myself, because I believe that it is a valid idea. However, I do not have the required expertise.
Please, do not misinterpret me. Here, I am only exposing my idea. I am not trying to oblige you (the reader) to do it for me. I strongly believe that, sharing ideas is one of the most important aspects of an advanced society.
- Server maintainers do not like custom programs running on their servers
- Differential upgrades are not yet supported by Debian
- Long and bandwidth-consuming updates cannot yet be avoided by Debian users
- data compression is sequential in its nature, ie the data, although compressed, is not scrambled; although unreadable, it is still in order. I think this is true because files can be extracted from archives without having to decompress everything
- the list of which files must be delivered need not be a long file
- the server can be requested to do partial package uploads to the client computer
- the server need not decompress the package to get the requested files
- contrary to what some are saying, the burden on the servers will not increase; on the contrary, it can be reduced
- there is no need to install a "server program" on the server
- partial downloads are already used in practice, eg by download managers
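The "partial upload" point above is exactly what HTTP range requests already provide: any stock mirror web server honours a `Range` header, so no custom server-side program is needed. A sketch of what the server does with such a request (the .deb bytes are a stand-in):

```python
def byte_range(blob, start, end):
    """What a stock web server returns for 'Range: bytes=start-end'
    (bounds are inclusive, as in HTTP)."""
    return blob[start:end + 1]

# Stand-in for a .deb sitting on a mirror: debs are ar archives,
# so the first 8 bytes are the ar magic string.
deb = b"!<arch>\n" + b"\x00" * 100
print(byte_range(deb, 0, 7))  # b'!<arch>\n'
```

The open question, raised elsewhere in this thread, is whether the byte ranges of individual files inside a compressed deb can be known in advance; plain range requests only help if they can.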
Yet another idea which might help improve updates.
For updates, I suggest skipping the mechanisms that install and uninstall packages: include only the modified binaries and replace them on the client's computer. This should always result in faster updates. The problem with slow updates has to do with using apt, aptitude, apt-get and dpkg. The latter programs should not be allowed to do the updates, because they require a .deb file. In updates, it makes more sense to replace ONLY the updated binaries and settings on the client's computer.
edbarx wrote: The problem with slow updates has to do with using apt, aptitude, apt-get and dpkg. The latter programs should not be allowed to do the updates, because they require a .deb file. In updates, it makes more sense to replace ONLY the updated binaries and settings on the client's computer.

I disagree. apt has all the infrastructure for working out what packages need to be updated and downloading them. dpkg has all the infrastructure for carrying out the maintainer's configuration-related changes and tracking what versions are installed, as well as updating the main files of the package. Throwing away that infrastructure would be stupid. For a system to work, it needs to be built within that existing infrastructure.
edbarx wrote: data compression is sequential in its nature ie the data, although compressed, is not scrambled ... I think this is true, because files can be extracted from file cabinets without having to decompress everything.

Most compression algorithms require you to start reading from the beginning. You can stop before you reach the end, but if your file is in the middle of a solid archive you are going to have to uncompress everything that comes before that file. Tarballs are solid archives, zips are not, and rars can be either depending on the options. A deb is an ar archive containing two tarballs.
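That distinction is easy to demonstrate with Python's zipfile module: each zip member is compressed independently and indexed in a central directory, so one file can be pulled out without decompressing its neighbours, whereas a .tar.gz is one solid gzip stream:

```python
import io
import zipfile

# Build a zip with a large member followed by a small one.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("big.bin", "x" * 100_000)
    z.writestr("small.txt", "hello")

# Reading small.txt seeks straight to its entry via the central
# directory; big.bin is never decompressed.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    print(z.read("small.txt"))  # b'hello'
```

With a .tar.gz of the same two files, reaching the second member would require decompressing all 100,000 bytes of the first.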
plugwash wrote: For a system to work it needs to be built within that existing infrastructure.

Are you sure? Have you ever heard about progress?
If a system imposes a limit on itself, it should be re-examined, revised and, if necessary, replaced.
The current package management system is what is causing unnecessarily slow updates, because it downloads ALL the files, modified or not! Downloading unmodified files simply does not make sense.
edbarx wrote: Are you sure? Have you ever heard about progress?

Sometimes an existing system is so broken or outdated that replacement is the only way to make progress. However, very often people are all too eager to rip up established working systems and start from scratch without a sufficiently good reason.
edbarx wrote: The current package management system is what is causing unnecessarily slow updates, because it assumes to download All the files modified or not! Downloading unmodified files simply does not make sense.

Right, but that is a relatively small part of what the package management system does.
A system for doing differential updates is a sensible idea, but to pull it off requires several things:
* Someone with enough knowledge of compression, mirror operation, existing file formats, operational characteristics etc to do a sensible design.
* Someone prepared to get to know the existing codebases and implement that sensible design within those codebases.
* Someone with the resources to set up and host a test/demonstration setup that people can use.
* Someone with the political skill to get it accepted into the official system.
edbarx wrote: The current package management system is what is causing unnecessarily slow updates, because it assumes to download All the files, modified or not! Downloading unmodified files simply does not make sense.

Can you provide some specific examples and show us how much difference this would make?