
Debian Traffic #29 For 1 Jan 2002

By David Martínez

Debian Home Page | Weekly News | Social Contract | Constitution | Policy Manual | Developer's Reference | Documentation Project | Developers' Lists Archives

Table Of Contents

Introduction
1. id Software releases Quake 2 sources under GPL
2. Compiler problems
3. Debian losing quality

Introduction

Merry Christmas and a Happy New Debian!

Welcome to Debian Cousin! I hope that this magazine will be useful both for Debian developers and for users. My name is David Martínez, and I'm a Debian developer. The previous editors of Debian Cousin set a very high standard, and I will do my best to live up to it.

Apart from the snow, the freeze is entering Debian (or, better yet, Debian is entering the freeze). The base section is completely frozen, and only point uploads are allowed. The remaining sections will freeze over the coming weeks, and many serious bugs must be closed if we want to release good software.

1. id Software releases Quake 2 sources under GPL

21 Dec 2001 - 28 Dec 2001 (93 posts) Archive Link: "Quake 2 sources GPL'd"

People: Juhapekka Tolvanen, Thomas Bushnell, Ben Collins, Joey Hess

Juhapekka Tolvanen said, "I just heard it through www.linuxgames.com: ftp://ftp.idsoftware.com/idstuff/source/quake2.zip Can we include that in Woody before too deep freeze?" Thomas Bushnell replied: "Does this include any game levels? If it doesn't include any levels that a person can play, then it only belongs in contrib." That brought a long thread about whether Quake 2 belongs in the main section or in contrib, with Ben Collins defending its placement in main, and Thomas and others arguing for contrib, because there is no real usable free data for the engine at the moment. As Ben Collins said:

The quake2 engine is a gaming engine. Lots of libraries in our current source are the same sort of things, and at one time did not have a game based on them. Did they go into contrib? No. If they had a game based on them that was non-free, would we put them in contrib? Probably not, because they are libraries as opposed to binaries.

Not so with the quake2 engine. However, just because it is a binary executable engine does not make it any different than a development library in terms of game development. Advertise the thing as a gaming engine, not as a game. Call the package "quake2-engine", and then once we get some, we will have "cool-game" Depend: quake2-engine. The fact is, that the quake2-engine does not depend on anything to perform what it was made for, and that is to be a gaming engine. The data is the game, and requires the engine.
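Ben's proposal maps directly onto Debian's package relationship fields. A hypothetical debian/control sketch of his idea (the package names, descriptions, and fields here are purely illustrative; no such packages existed at the time):

```control
Package: quake2-engine
Architecture: any
Depends: ${shlibs:Depends}
Description: gaming engine from id Software's GPL'd Quake 2 source
 The engine only -- playable game data is shipped separately,
 either as non-free CD data or, someday, as a free data package.

Package: cool-game
Architecture: all
Depends: quake2-engine
Description: hypothetical free game data for the quake2 engine
 A free data set like this, once it exists, would let both the
 data and the engine qualify for main under Thomas's criterion.
```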

And Thomas pointed out:

The distinction between contrib and main is not whether it is *possible* to create something free which the contrib software would be useful for; it's really whether there *is* such a thing.

If the only practical use of the engine is to run non-free levels from id, then it belongs in contrib. If someone has levels (that are at all fun--that is, which are real games) which the engine works with, then it belongs (along with those levels) in main.

Erich Schubert then pointed to http://www.planetquake.com/stand/ , a project implementing some parts of a future free and complete dataset for Quake 2.

Joey Hess put an interesting point of view:

The issues keeping quake 2 out of main, granting Ben's insistence that it can be looked at as an engine for games, are:

* As an engine for games, it is woefully lacking in documentation. Programming languages have at least one of a spec, sample code, a body of existing code, or something to read to learn them, while this "engine" does not.

* Nobody has actually come forward and volunteered to put this in main and give it the level of maintenance software in main deserves. If it were in main, they would really be obligated to be able to tell users some way it can be used, whether that is pointing them at a game that uses it, or at some documentation for writing one, or at a free level editor, or whatever. But if all the maintainer can do is point the user at data files you buy on CD, it makes a mockery of it being in main.

So in summary we may have actually excluded a package from being in main because it lacks sufficient documentation (nice precedent ;-), and this can all be changed by one maintainer with sufficient chutzpah to upload it to main and deal with the consequences.

Ben Collins gave up after replying to most of the messages in the thread. Later, Jamie Wilkinson filed an Intent To Package (ITP) for quake2 and declared his intention to put it in contrib.

2. Compiler problems

22 Dec 2001 - 25 Dec 2001 (14 posts) Archive Link: "Sparc buildd a cross-compiler?"

People: Mikael Hedin, Ben Collins, Jeff Licquia

Having a solid compiler is the first step toward having a solid architecture. It seems that in the gcc world, some architectures are only now beginning to stabilize.

Mikael Hedin asked in debian-devel: "I noticed gsmlib has failed on sparc for a long time. The last log, http://buildd.debian.org/fetch.php?&pkg=gsmlib&ver=1.7-1&arch=sparc&stamp=1006071760&file=log&as=raw, says in the end that g++-3.0 is a cross compiler, and then the build bails out. What's up?" Ben Collins replied to him: "Your package better use gcc, not gcc-3.0. Using anything other than the default supported compiler gets you a bug report." Then Mikael said that gsmlib does not build with g++-2.95, and Ben suggested: "Then fix the build with that compiler. You won't get any support for a compiler that is not considered the default for an arch." Mikael then ended with:

Anyway, g++-3.0 seems to be completely broken on sparc. int main(){return(0);} gets a bus error. So I guess I'll just not build on sparc. Unless I get a really easy way to fix this (but it seems to me to be something unsupported in g++-2.95).

Ben went on:

Of course it is broken. It is _not_ supported on sparc, other than to make it available to users. _I_ do not want anything built on sparc that doesn't use the default compiler (except in cases such as libc6-sparc64 where we obviously have to use the 64-bit capable gcc-3.0 compiler).

It should be policy that programs are required to use the default compiler on an arch. You create serious overhead on arch maintenance when you ignore that.

But Jeff Licquia objected:

While I don't disagree with such a policy in general, I think that exceptions should be allowed.

On ia64, there really isn't a super-strong code generation engine available. The default gcc (2.96!) is a bit behind in bugfixes, and gcc 3.x, although much better at generating ia64 code, has other weaknesses. We try to build everything with gcc 2.96 as much as possible, but in some cases, gcc 3.0 is required to get code that works. In those cases, we haven't seen anything wrong with debian/rules hackery to set CC=gcc-3.0 and so on, and Build-Depend on gcc-3.0 [ia64].

Is this something you object to? I understand how you might object on sparc, since gcc 2.95 has supported sparc for a long time now. But on newer architectures, we may not have the luxury to mandate a single gcc version.

And if you object, could you suggest a solution? Some of the packages affected are very large and complex and "fix the problem in the source of your package" would, most likely, involve quite a bit of work. I suspect in a few of those cases that the only feasible response would be to remove the package from the architecture, which seems a shame if building with a different compiler would fix the problem.
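The debian/rules hackery Jeff describes usually amounts to picking the compiler per architecture and declaring an architecture-qualified build dependency. A minimal sketch, assuming an autoconf-style package (the target names and the exact variable handling are illustrative, not from any particular package):

```make
# debian/rules fragment -- choose the compiler per architecture
DEB_HOST_ARCH := $(shell dpkg-architecture -qDEB_HOST_ARCH)

# Default to the arch's standard compiler...
CC  := gcc
CXX := g++

# ...but on ia64, use gcc 3.0, which generates working ia64 code
ifeq ($(DEB_HOST_ARCH),ia64)
CC  := gcc-3.0
CXX := g++-3.0
endif

build:
	CC=$(CC) CXX=$(CXX) ./configure
	$(MAKE) CC=$(CC) CXX=$(CXX)
```

The matching debian/control line would then carry an architecture restriction, e.g. "Build-Depends: ..., gcc-3.0 [ia64]", so the alternate compiler is only pulled in on the architecture that needs it.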

It seems that life is not that easy on architectures other than i386...

3. Debian losing quality

25 Dec 2001 - 29 Dec 2001 (66 posts) Archive Link: "An alarming trend (no it's not flaimbait.)"

People: Brian Wolfe, Henrique de Moraes Holschuh, Christian Kurz, David N. Welton, Anthony Towns

Debian has always been known for its stability and solidity, but the enormous number of packages now in Debian could draw bad press for the distribution. Brian Wolfe posted a long message about this issue:

For some time now there has been an increasing trend in people that I know who use debian. It is the view that debian is becoming increasingly "old"/outdated, and that developers either a: don't have the time to properly maintain packages, or just don't care. Which is the case here I don't know. I'm not intimate with a lot of developers. However, this has been the same view that has been slowly dawning on me for a while now.

I see an increasing trend of two critical problems in the way debian operates. #1: package age. Let me talk about this one first. There has been a relatively recent (a year or two) explosion in the package count. As this package count has gone up, packages that I have used for years and that used to work well have fallen into a sad state of disrepair.

He then commented on some ancient bugs in CDRtoaster, and continued:

CDRToaster hasn't been updated on the homepage since Jan 2001 at ver 1.12. Obviously this package is DEAD. 8-P I'm sad to see it go, as I am for many useful programs such as this one.

However, that leaves a problem. I've been told by several developers that "it's an upstream problem. send them a patch and when they include it we will update". Well, that argument doesn't work in increasingly common cases like this. At this point, it is now (IMHO) the debian packager's problem. If they are unwilling or unable to fix it, then the package should be marked as "BAD" or "dead-upstream" as a warning to the user that they should pick a different utility like this one to use.

What I see happening is this. The package count has increased proportionately to the amount of bugs per package. This is giving debian a bad name. This is driving users away. Eventually, if this continues, debian WILL die or be a niche distribution only diehard fans of its ideals will use.

Now a little history for you to understand my view of why this prevailing attitude is annoying to say the least, and has me up in arms over it so to speak. When I signed on to distribute debian, it was rock solid. Packages were only marginally out of date. People loved it. Users loved it. Debian people trash talked redhat daily over its bugs (not all debian folk, just the more vocal and publicly seen ones). I have slowly stopped recommending it as the reaction of people who tried it because of me has shifted from mostly "nice distro. thanks" to "this is buggy, and out of date. thanks a lot. >:( ".

Many developers then said that the problems with poorly maintained packages were only an issue with the individual Debian maintainer. As Henrique de Moraes Holschuh said: "Please file such a bug against that CD recording package. If the maintainer complains that he is 'actively maintaining' it, tell him to stop lying to himself and admit he either needs to become upstream and fix all bugs, or drop the package (and keep the bug open)" .

But other developers did not feel that becoming upstream was the solution. As Christian Kurz put it: "you seem to ignore that this is _volunteer_ _based_. Debian Developers will work on those issues that they are interested in and not the things you want to see them working on. If you want to see Developers working on some issue, either start paying them for doing the work, convince enough to work on the issue or start the work on your own." David N. Welton then raised a long-standing debian-devel idea: "the best way you can make a meaningful contribution is to file bugs that are "higher level" than "normal", in order to draw attention to broken packages [...] and by doing so, possibly blocking buggy software from going into 'testing' or being released."

Anthony Towns, the release manager, quickly expressed his well-known opinion about that subject:

Oh god no. Please no. Inflating bug severities just makes it harder to do releases; if there's a problem with normal bugs being ignored (and, IMO, there is), it needs to be addressed directly, not worked around by filing everything as important or higher.

Hrm. At least tell me that I'm misreading this, and what you meant to say was `` "higher quality" than "average" '' or something.

The thread died without reaching a clear consensus.

Kernel Traffic is grateful to be developed on a computer donated by Professor Greg Benson and Professor Allan Cruse in the Department of Computer Science at the University of San Francisco. This is the same department that invented FlashMob Computing. Kernel Traffic is hosted by the generous folks at kernel.org. All pages on this site are copyright their original authors, and distributed under the terms of the GNU General Public License version 2.0.