Debian Traffic #18 For 12 Jan 2001

Editor: Zack Brown

By Prashanth Mundkur, Steve Robbins and Zack Brown

Debian Home Page (http://www.debian.org) | Weekly News (http://www.debian.org/News/weekly/) | Social Contract (http://www.debian.org/social_contract) | Constitution (http://www.debian.org/devel/constitution) | Policy Manual (http://www.debian.org/doc/debian-policy/) | Developer's Reference (http://www.debian.org/doc/packaging-manuals/developers-reference/) | Documentation Project (http://www.debian.org/doc/ddp) | debian-devel Archives (http://lists.debian.org/#debian-devel)

Table Of Contents

Introduction

Want to help write KC Debian? See the KC Authorship page (../author.html), the KC Debian homepage (index.html), and the Thread Summary FAQ (../summaryfaq.html). Send any questions to the KCDevel mailing list (mailto:kcdevel@zork.net).

Mailing List Stats For This Week

We looked at 533 posts in 1937K.

There were 188 different contributors. 96 posted more than once. 0 posted last week too.

The top posters of the week were:

1. Waiting For Packages In Testing

1 Jan 2001 - 2 Jan 2001 (6 posts) Archive Link: "Questions about testing"

Summary By Steve Robbins

People: Anthony Towns, John O Sullivan

The "testing" distribution continues to generate questions on the mailing list. In the present thread, the questions centre on how long a newly-uploaded package (to unstable) must wait before it shows up in testing.

Anthony Towns clarified:

The idea is that for a package to get into testing it should:

The first two of those points can be automatically checked, and are. The remaining point, though, requires people to actually try the package and report any problems.

For the latter to be of any value, there are two further requirements. One is that you give people a little time, both to install the package and to leave it around long enough that people have a chance to see if it breaks in normal use, or similar.

Given the way Debian generally works, with most people running apt-get dist-upgrade every day, the only package that's going to get any testing at all in normal use is the very latest one, not one from a few days ago that's already been obsoleted.

There are a host of technical problems as well: you can't tell whether bugs apply to the new or the old version, there's no way to get at the appropriate old versions, the code was written expecting there to be exactly one candidate for each source package and does break if that's violated, the number of possible combinations of packages to try is fairly unreasonable already, trying old versions as well is non-trivial, and so on.

The times (14 days for low, and 7 days for medium) were taken from the time it usually seems to take the autobuilders to get a package sync'ed across multiple architectures, which is usually around a week or a little longer. If you're doing the upload every other day thing, you'll tend to fail the first check in any case.
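The delay rule Anthony describes amounts to a simple date comparison keyed on upload urgency. The following is a hypothetical sketch for illustration only, not the actual testing scripts; the urgency names and day counts are the ones quoted in the thread.

```python
from datetime import date

# Day counts quoted in the thread: 14 days for low urgency,
# 7 days for medium. (This mapping is an illustrative assumption.)
WAIT_DAYS = {"low": 14, "medium": 7}

def old_enough(upload_date, urgency, today):
    """True once an upload has waited out its urgency's delay."""
    wait = WAIT_DAYS.get(urgency, WAIT_DAYS["low"])
    return (today - upload_date).days >= wait
```

For example, a low-urgency package uploaded on 1 Jan 2001 would first become eligible on 15 Jan; since each new upload restarts the clock, someone uploading every other day would, as Anthony notes, keep failing this first check.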

A couple of posters felt that a uniform 14-day waiting period is not ideal. For rarely-used packages, a bug may not be noticed in a fortnight; conversely, a bug in a heavily-used package like X is likely to be noticed within hours, so it might be moved to testing after a shorter delay. Anthony answered with " I'm not seeing the "excessive"-ness, here, I guess. "testing" isn't a replacement for unstable: if you want the latest and greatest stuff, and you don't want to wait for it, you go with unstable. "

Elsewhere, John O Sullivan remarked, " packages that are updated everyday are a big headache for those of us that are living at the end of a modem, because we have to update many 10's of packages a day == lots of downloading. " John wondered whether a rule that packages should generally not be updated more than twice a week should be made into policy. Anthony pointed out that if you track testing rather than unstable, " the 14 day rule has the cute property that once a package is placed in testing, it'll be at least 14 days before it's replaced. (On the day it gets included in testing, it's also the latest version uploaded to unstable; if the very next day a new version is uploaded, it'll still take 14 days before that version is considered for testing and until then the existing version will just sit there). "

The final thought comes again from Anthony: " we're in the first couple of weeks of using this, can we at least try it as is before worrying about how to tweak it? "

2. GCC Snapshot In Sid

1 Jan 2001 - 2 Jan 2001 (15 posts) Archive Link: "Huh, gcc 2.95.3?"

Summary By Zack Brown

People: Daniel Stone, Mark Brown, Harald Dunkel, Ben Collins

Harald Dunkel noticed that the gcc package in Sid had the version number 2.95.3, even though the GCC steering committee had not yet officially announced the release. Ben Collins replied that the Debian version was based on the CVS branch; to which Daniel Stone sputtered, "Ack!(tm). Not shades of rh7, I hope? I know that people using sid (like myself) are willingly sado-masochists, but a CVS GCC?" Mark Brown explained, "GCC 2.95.3 is in final testing and due for release RSN, making it a somewhat different situation. It's also binary-compatible with 2.95.2." Elsewhere, Ben Collins clarified that the CVS version was not a bleeding-edge developer snapshot, but essentially the finished 2.95.3 release awaiting its announcement. Until the official release, though, the CVS version was the only one available.

3. Source-Only Uploads Broken (temporarily?)

2 Jan 2001 - 5 Jan 2001 (10 posts) Archive Link: "Important Note On Source-Only Uploads"

Summary By Steve Robbins

People: Anthony Towns, Michael Stone

Anthony Towns announced that source-only package uploads no longer work.

In more detail: with recent versions of dpkg it's possible, even easy, to do source-only uploads to the archive; that is, to upload only a diff and a dsc (and maybe an orig.tgz), without any .debs at all.

The most important problem is how katie (the new dinstall) processes such an upload. It goes through the following motions:

The only good thing about the way katie handles this is that it doesn't delete the old source. It does remove any reference to that source from the sid Sources.gz file though.

Another significant problem with source-only uploads is that (afaik, anyway) none of the autobuilders will attempt to build any arch: all packages, which would have caused both of the source-only uploads I've noticed recently to break anyway.
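The distinction katie has to draw can be illustrated with a toy check. This is purely a hypothetical sketch based on the file types named in the announcement (dsc, diff, orig tarball, deb), not katie's actual code.

```python
def is_source_only(filenames):
    # A source-only upload ships a .dsc (plus diff and possibly an
    # orig tarball) but no binary .deb packages at all.
    has_dsc = any(name.endswith(".dsc") for name in filenames)
    has_deb = any(name.endswith(".deb") for name in filenames)
    return has_dsc and not has_deb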

A couple of posters lamented the loss of source-only uploads. Michael Stone even suggested that source-only uploads become the norm, since " we've seen too many "compiled against helix gnome" and whatnot, which wouldn't be an issue if packages were always built in a sane environment. "

No word on when (or whether) source-only uploads will work again.

4. Status Of /etc/debian_version

5 Jan 2001 - 7 Jan 2001 (27 posts) Archive Link: "What to do about /etc/debian_version"

Summary By Prashanth Mundkur

People: Santiago VilaJoey HessAnthony TownsMichael Stone

Santiago Vila asked how he should handle the /etc/debian_version file in light of Joey Hess' filing of Bug#81249, which addressed the fact that local changes to /etc/debian_version are not preserved on upgrades. Among the options Santiago posed were:

Joey pointed out that "The file does serve a useful purpose: it concentrates the debian version number string that is used in a number of places (issue.net and so on) into one central place to be modified." , and that as a conffile, /etc/debian_version would only change behaviour " if the admin edits the file though -- it is not as if making it a conffile is going to at all affect people who don't modify it." Joey hence suggested /etc/debian_version be made a conffile.

When Santiago asked Joey what he meant by having the file locally customized to "better reflect the status of the system", since the Debian version of a system was imprecise given that packages may be upgraded in an independent fashion, Joey responded, "I'm the only admin. So I'm probably the best authority on exactly what version fo debian it is running, so why not let me edit the file to reflect that?"

When Michael Stone suggested getting rid of the file, Bart Schuller and others remarked that they had seen third-party software install scripts use the file to determine which Linux distribution the system was running, and hence that the file was worthy of preservation.

Anthony Towns provided an alternative: "Have you thought about passing the buck to apt, and letting it update it? In theory, it can probably do this based on the information from the various Release files it downloads." , but there was no further discussion.

Elsewhere, there was some discussion on the status of /etc/mtab and its possible replacement by /proc/mounts.

 

 

 

 

 

 

Sharon And Joy
 

Kernel Traffic is grateful to be developed on a computer donated by Professor Greg Benson and Professor Allan Cruse in the Department of Computer Science at the University of San Francisco. This is the same department that invented FlashMob Computing. Kernel Traffic is hosted by the generous folks at kernel.org. All pages on this site are copyright their original authors, and distributed under the terms of the GNU General Public License version 2.0.