Upgrading Gitea Is Painful

I just wanted to upgrade a Gitea instance and, in the process, deleted the old gitea binaries. After that, pushing to a repository did not work anymore, because the path to the gitea binary, which is used e.g. when the instance or a repository is created, is hard-coded into several files, and the binary at that path was simply gone. So, in order to upgrade to a newer version of gitea, you have to do the following, assuming you run the service under the gitea user:

  1. In ~gitea/.ssh/authorized_keys, you have to adjust the path of gitea to the location of the new binary.
  2. Per repo, you need to adjust the path to the gitea binary in the following files:

    • hooks/post-receive.d/gitea
    • hooks/pre-receive.d/gitea
    • hooks/update.d/gitea

More files may need to be changed, but at the moment, this seems to be enough to make things work again.
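
For illustration, here is a minimal sketch of how such a bulk rewrite could look. The old and new binary locations and the repository root are assumptions and have to be adapted to the actual installation:

import os

# Assumed locations -- adjust these to the actual setup.
OLD_BINARY = "/usr/local/bin/gitea-old"   # hypothetical path of the deleted binary
NEW_BINARY = "/usr/local/bin/gitea"       # hypothetical path of the new binary
GITEA_HOME = os.path.expanduser("~gitea")
REPO_ROOT = os.path.join(GITEA_HOME, "gitea-repositories")

def rewrite(path):
    # Replace the stale binary path with the new one in a single file.
    with open(path) as f:
        content = f.read()
    if OLD_BINARY in content:
        with open(path, "w") as f:
            f.write(content.replace(OLD_BINARY, NEW_BINARY))

# The SSH forced commands reference the binary once per key.
rewrite(os.path.join(GITEA_HOME, ".ssh", "authorized_keys"))

# Each repository carries its own copies of the hook scripts.
for dirpath, dirnames, filenames in os.walk(REPO_ROOT):
    if os.path.basename(dirpath) in ("pre-receive.d", "post-receive.d", "update.d"):
        for name in filenames:
            if name == "gitea":
                rewrite(os.path.join(dirpath, name))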

Links:

  • https://gitea.io

Back to top


Why Do DigitalOcean, AWS & Co Not Default To Debian?

I just read Chris Lamb's platform[1] for this year's DPL elections, where he asks why enterprises do not use Debian by default. In this article, I want to give some answers, although I think Chris is very likely already aware of them, given his track record.

  • Marketing

    I think Chris is partially right: Marketing is important, whether we like it or not. The ArchWiki example that he mentions shows that they manage to present relevant content in a very accessible manner. This has partly to do with how they organize the information, and also with them possibly keeping it better up to date than we do (I frequently find better information in the ArchWiki myself).

    Their styling is imho on par with ours, so the difference must lie elsewhere. It may be partially due to them using a different wiki software with a much bigger userbase than ours, which certainly contributes to users finding it easier to work with, because they don't need to learn anything new: no new procedures, no new markup language, the software already feels familiar. In short, it conforms more to existing user habits because of the market share of that other software.

  • Commercial Viability

    In my professional experience, I have found that there are a few factors which make other Linux distributions, particularly CentOS and friends, more attractive to enterprises like e.g. AWS:

    • Our support cycle is too short.

      These enterprises like to have those 10 years of support and never worry about any upgrades, because after 10 years, you can usually safely throw the machine away. The impact is that the vendor, e.g. Amazon, does not need to involve the customer in upgrading their application, which the customer usually does not want to do and does not allocate any budget to, either. The typical customer expects that once his application is deployed, it will continue to run unchanged until he decides to stop running that version of said software, and he considers upgrades a waste of time and money. Also, both security updates and newer versions of some third-party software become available on older versions of such Linux systems without the need for a big upgrade. The former enables the vendor to say that his platform is secure, and that any breaches are solely the fault of the customer, while the latter enables the vendor to offer new features to the customer without requiring him to upgrade. As an example, I'd like to point to the availability of PHP 7 on CentOS 6.8: it was released in 2016, yet does not deviate too much from even older versions of CentOS and thus requires little re-learning, with the first 6.x version having been released in 2011, alongside Squeeze.

      [2018-01] It looks like Snaps are addressing this problem.

    • As a corollary to that, there is a much clearer separation between the very small core distribution and the large amount of third-party commercial software.

      Also, the fact that we already include tons of software, which eats a lot of manpower, is an underemphasized point, so it may not be obvious how Debian can make users' lives easier.

    • There is a certification system in place that gives the enterprise some confidence about the abilities of prospective hires. I am not aware of any comparable certification system for Debian.

    • The non-commercial nature of Debian is both its boon and its bane. There is no single commercial entity behind Debian, which leaves enterprises not knowing whom to sue, or how long the project will survive. Never mind that similar problems have occurred with many vendors in the past; at least there is a vendor which could be sued, if need be. And it looks like they have enough government backing to not go bankrupt easily, either. But the distrust of volunteer organisations as loosely knit as Debian runs deep.

Links:

  • https://www.debian.org/vote/2017/platforms/lamby

Back to top


Free Software and the Military

I often and gladly read Fefe's blog, because it provides a wealth of news in aggregated form, with links to the sources, without forcing you to endure tons of advertising and worse. But one thing has bothered me for a long time: at every opportunity, Fefe demands that the GPL be extended with a clause excluding military applications.

I think nothing of that idea, and in my opinion it must be firmly opposed.

My reasoning:

For one thing, it would further fragment the software landscape in licensing terms, and in a way that would throw us back to the time before the development of the GPL. If the GPL were extended with this demand, the next developer would come along and want to restrict use in the field of genetic engineering, by the church, by drivers, vegans, people of colour, or whatever, and hardly any piece of software would still be compatible with any other. This kind of licensing tangle was the norm before the GPL.

We already have enough trouble with OpenSSL, jQuery and surely a number of other software packages that raise licensing questions or require special treatment.

There are, of course, massive demarcation problems: does a milling machine in an ammunition factory now require software that is not under Fefe's GPL, or would such use still be covered by a "GPL" modified in this way? What about sewing machines for protective vests? For uniforms? What if the Bundeswehr wants to use the software for civil protection during another flood disaster, or if resistance fighters in North Korea (do they even exist?) want to use it to act against their government? What if those resistance fighters happen to be acting against a comparatively far more liberal regime, as is currently the case in the Middle East, or earlier in Latin America? I use the word "resistance fighters" in both cases to take the political judgement out of the question and to focus the discussion on the legal mechanics, as they present themselves to me as a legal layman.

On top of that, such a general change is unnecessary, because already today anyone can license their software along the lines of "GPL plus the following restrictions/extensions". Popular examples of this are the "OpenSSL exception" and the "Classpath exception" (see Wikipedia on the topic).

Furthermore, he starts from the assumption that the military would have to abide by such a licence. All experience with state behaviour speaks against this, especially whenever the topic of "national security" is somehow touched upon. In my opinion, one has to assume that anything these people consider practical enough will, in case of doubt, simply be requisitioned, and that no judge will stand in their way.

And last but not least, one should not lose sight of the aspect of self-defence, because it is not only Fefe who gets to define "military applications"; the state can do that as well, as we have already seen in the dispute over encryption, and especially over PGP/GnuPG. Such a Fefe licence would consequently have to contain clauses that put a stop to attempts of that kind.

From my point of view it is clear that the state, and companies in its orbit, can in effect issue the necessary permission to themselves, while non-state actors will probably not be able to find a legal replacement, for instance in the form of QNX. One should also bear in mind that this constellation, in which citizens feel they can only defend themselves against authoritarian governments or other attackers by force of arms, has existed for a long time and in many parts of the world, currently most visibly in the Middle East.

In addition, one would probably only be able to sue licence violators in extreme exceptional cases, assuming one even became aware of the violation, because in case of doubt these people, or circles of people, simply have more legal and physical firepower than the average software author.

In my opinion, Fefe should approach this topic, as he should others, with more reason and less gut feeling. Then he would either have to drop his demand or at least explain why only military applications should be excluded, because other applications kill people just as well, only not necessarily as obviously and as spectacularly. And as a political person he would, in my opinion, have to explain why these changes to the licensing landscape would have a positive effect on society.

Back to top


BT hijacks DNS queries

I just configured a new DNS name in one of my domains, one which did not exist before. The associated IP address is routed to Germany. But while the name was not really up yet, the answer should have been NXDOMAIN, meaning that the name does not exist. Example:

$ dig blablablablabla.oeko.net

; <<>> DiG 9.9.5-8-Debian <<>> blablablablabla.oeko.net
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NXDOMAIN, id: 38513
;; flags: qr rd ra; QUERY: 1, ANSWER: 0, AUTHORITY: 1, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;blablablablabla.oeko.net.      IN      A

;; AUTHORITY SECTION:
oeko.net.               139     IN      SOA     a.ns.oeko.net. hostmaster.oeko.net. 1021018254 16384 2048 1048576 2560

;; Query time: 10 msec
;; SERVER: 127.0.0.1#53(127.0.0.1)
;; WHEN: Thu Feb 12 21:33:53 CET 2015
;; MSG SIZE  rcvd: 105

But instead, they gave a fake answer:

$ dig bla.oeko.net

; <<>> DiG 9.9.5-8-Debian <<>> bla.oeko.net
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 9013
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 0

;; QUESTION SECTION:
;bla.oeko.net.          IN  A

;; ANSWER SECTION:
bla.oeko.net.       20  IN  A   92.242.132.15

;; Query time: 32 msec
;; SERVER: 192.168.1.254#53(192.168.1.254)
;; WHEN: Thu Feb 12 19:55:14 GMT 2015
;; MSG SIZE  rcvd: 46
$

As a result, I am unable to check whether my DNS setup performs correctly, until they decide to throw the fake answer away.
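
For what it's worth, a quick way to check from a script whether a resolver rewrites NXDOMAIN answers is to look up a name that is known not to exist and see whether an address comes back anyway. The probe name below is just the example from above; replace it with a name under a domain you control that you know does not exist:

import socket

# A name that is known not to exist; a well-behaved resolver must fail here,
# while a hijacking one returns some address instead.
PROBE = "blablablablabla.oeko.net"

try:
    address = socket.gethostbyname(PROBE)
    print("Resolver rewrites NXDOMAIN: %s -> %s" % (PROBE, address))
except socket.gaierror:
    print("Resolver correctly reports that %s does not exist." % PROBE)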

Of course, this has huge potential for censorship of all kinds, which I have seen in action elsewhere already. I am not the only person aggravated by this kind of behaviour. Please follow the link below to read other people's take on this problem.

Thank you!

Links:

  • http://linuxforums.org.uk/index.php?topic=11464.0

Back to top


Typing Chinese on a Computer

Just today, I read an article about the influence of the computer on the Chinese language. I agree with some of the author's points, but think that the difficulty of using a method like Wubi is generally overstated. CangJie is more difficult, but in contrast to spoken language, both have the very valuable property of not changing according to dialect, region or time. The speedups a user of predictive input gains are also available to users of handwriting or structure-based input methods, and the input speed should already be excellent at the 150 characters per minute achievable in Wubi, or the 200 achievable in CangJie. On top of predictive input, and with much less of the guesswork that makes phonetic input methods slow, the structure-based input methods sport phrase books and rules that provide shortcuts for typing several characters in one go. And while I have seen every undergrad student using only PinYin or ZhuYin, every PhD student I have met so far has switched to Wubi, simply for the massive speed increase.

However, I am unconvinced about the notion that writing Chinese is slower than English:

If you can type 150 Chinese characters per minute, that amounts to roughly 50 words per minute once you subtract particles and compounds, as many Chinese words have only one or two characters. Now, imagine how fast you would have to type to achieve a similar speed in English: if the average English word has four characters, which is probably not enough, you would have to type at 600 characters per minute to achieve similar results, and then there is spacing, too, which does not exist in Chinese. I also hold that the structure-based input methods at least help you memorize the graphic elements of the characters, thus being closer to handwriting than phonetic input methods. With the composition rules and phrase books, you usually end up needing one to three keystrokes to produce a Chinese character. In summary, I think it is not easy to say whether English or Chinese can be typed faster.
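
To make the comparison above concrete, here is the back-of-the-envelope arithmetic as a tiny sketch. The figures are the assumptions from the text, and reading one Chinese character as carrying roughly the content of one English word is my interpretation of the argument:

# Assumptions taken from the text above.
chinese_chars_per_min = 150     # typical speed achievable with Wubi
chars_per_english_word = 4      # assumed average, probably on the low side

# If one Chinese character carries roughly the content of one English word,
# the equivalent English typing speed, ignoring spaces, would be:
english_chars_per_min = chinese_chars_per_min * chars_per_english_word
print(english_chars_per_min)    # 600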

Unfortunately, my own experience with Chinese input is limited to PinYin and Wubi. As far as the steep learning curve goes, the principles of Wubi can be explained in perhaps one to three hours, and after that, it takes about two weeks of practice to achieve some fluency. That is not a big investment compared to learning Chinese in the first place, or to the time wasted over the years by sticking with an inferior method. I guess it is mostly a psychological barrier, possibly combined with unsophisticated didactics, that contributes to the perception that these methods are hard.

Back to top


Small Timezone Code Snippet

Today, I was looking at how to adjust a time stamp from a log file that carries no timezone info so that it contains the local timezone, which lets me stuff a timezone-aware value into a database. It turns out that this is a somewhat under-polished part of the Python standard library, at least as of Python 2.6, which I am using (don't ask why). While looking for a solution, I frequently came across code that used pytz, but I wanted something that would stay within the standard library.

So here's my hodgepodge solution to the problem, which should work in most of Europe:

import time

def getTimeOffset():
    # time.timezone and time.altzone are given in seconds *west* of UTC,
    # so negate them to get the usual "+HHMM east of UTC" notation.
    if time.daylight and time.localtime().tm_isdst:
        offset = -time.altzone
    else:
        offset = -time.timezone
    hours, rest = divmod(abs(offset), 3600)
    stz = "%s%02d%02d" % ("+" if offset >= 0 else "-", hours, rest // 60)
    return stz

This approach is a straightforward extension of the idea presented here.
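
As an illustration, a naked timestamp from a log line can then be made timezone-aware simply by appending the offset; the sample value here is made up:

timestamp = "2015-02-12 21:33:53"       # hypothetical timestamp from a log file
aware = timestamp + getTimeOffset()     # e.g. "2015-02-12 21:33:53+0100" on CET
print(aware)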

Back to top


New Blog Software, Links Changed

As you might have noticed, I have switched from MovableType to Pelican. As a consequence, the links in my blog have changed - usually only a little, but in a slightly irregular fashion. Please peruse the archives and search for the title of the article you are looking for. The content itself should all still be there.

Thank you!

Back to top