Well… I think I lost about 12h on all my wikis. I thought: “Upgrade from Debian Wheezy to Jessie? No problem!” I never saw a big huge blinking warning saying that you shouldn’t do it if you’re a guest on a virtual machine. But systemd replaces sysvinit, and systemd requires a newer kernel, which the host doesn’t have! Everything works until you reboot – at which point the network doesn’t come back up. You can’t switch from systemd back to sysvinit because downloading the packages no longer works. You can’t scp the deb files to your server because all you have is a serial console. On your end, it looks like ssh, and yet whatever I tried wouldn’t work, because networking doesn’t work on the guest.

And that’s when I discovered some flaws in my backup. What a disappointment. How exactly was Perl installed? Why doesn’t the backup contain the Apache config files for those sites that had SSL added to them?

Alex Schroeder I think all the permissions are back where they should be. The campaignwiki.org cron job is running. Monit and Munin are basically up but their configuration is borked. That is to say, they are monitoring some things I haven’t installed because I never kept track of the packages I actually installed.

Sometimes I wonder whether I should move Oddmuse away from Perl 5 to something else. Something I would like to learn. An opportunity to redo everything, from scratch. Not many people are using Oddmuse, so I might as well write something else, for myself.

I recently read Writing Web Applications for Go, where they develop a simple wiki. I wondered about storing the data in a git repository. I found git2go. Hm…
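From memory, the tutorial’s data model is roughly this sketch (not the tutorial verbatim): a page is a title and a body, persisted as a flat file – which is exactly the kind of layout a git repository could version.

```go
package main

import (
	"fmt"
	"os"
)

// Page is a wiki page: the title doubles as the filename,
// the body is the raw wiki text.
type Page struct {
	Title string
	Body  []byte
}

// save writes the page to <Title>.txt in the current directory.
func (p *Page) save() error {
	return os.WriteFile(p.Title+".txt", p.Body, 0600)
}

// loadPage reads <title>.txt back into a Page.
func loadPage(title string) (*Page, error) {
	body, err := os.ReadFile(title + ".txt")
	if err != nil {
		return nil, err
	}
	return &Page{Title: title, Body: body}, nil
}

func main() {
	p := &Page{Title: "HomePage", Body: []byte("Hello, wiki!")}
	if err := p.save(); err != nil {
		panic(err)
	}
	q, _ := loadPage("HomePage")
	fmt.Println(string(q.Body))
}
```

Committing those flat files after every save is where git2go would come in.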

Perl 6 is an obvious candidate. And yes, they did not just change print 'x' to print('x') the way Python did – it is a completely new language that you have to learn.

Speaking of Go, there is a great video called Perl 6 for Mere Mortals, and here is a direct link to a part of it which is somewhat related to Go. It is not a meaningful comparison, but it should give you the idea. You can also watch the video from the beginning; it is very interesting if you have not looked into Perl 6 yet.

Now, if we think about all of the languages out there, what are the actual features that are required for Oddmuse?

Well, since Oddmuse is all about parsing wiki text, you need some tools for parsing. Perl 5 has regexes (um, okay), Perl 6 has grammars (exactly what we need!), and Go has something as well (like this, which is, again, verbose as hell), but I’m not an expert. You can also use libraries, but having built-in support is probably advantageous.
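For comparison, this is roughly what the regex approach looks like in Go – a sketch of inline markup only, and the rules and names here are my own, not Oddmuse’s:

```go
package main

import (
	"fmt"
	"regexp"
)

// linkRe matches [[Page Name]] style wiki links.
var linkRe = regexp.MustCompile(`\[\[([^\]]+)\]\]`)

// boldRe matches '''bold''' spans.
var boldRe = regexp.MustCompile(`'''([^']+)'''`)

// render does one pass of regex substitutions. Workable for
// inline markup, but block structure (lists, tables) needs a
// state machine or a real parser on top – which is where
// Perl 6 grammars would shine.
func render(s string) string {
	s = boldRe.ReplaceAllString(s, "<b>$1</b>")
	s = linkRe.ReplaceAllString(s, `<a href="$1">$1</a>`)
	return s
}

func main() {
	fmt.Println(render("See [[SmallestWiki]] for '''tiny''' wikis."))
}
```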

Somehow I can’t really think of any other criteria, everything else probably exists in every other language out there. You might also say “performance” but the only website suffering from that is probably emacswiki, and it all depends on your actual goals – for example, better execution time can be achieved by using concurrency, which does not really lead to better CPU time. Concurrency in Perl 6 – yes (but maybe just a bit flaky at the moment), performance in Perl 6 – well, not now, but the potential is there (i.e. gradual typing).

In other words: what is the motivation? “Just to learn” is not going to get you anywhere – it will not even make you learn much, because learning to find reasons for doing things is part of the learning; doing things purposelessly is harmful to the thinking process.

I keep thinking about all the design decisions. Filenames are pagenames. Namespaces are subdirectories. Can we switch to “git first”? No more log files. We no longer need keep files. How would modules work? Would you simply recompile the wiki? Dynamic linking? Would I want to use a templating architecture? People seem to like that. Was “printing as we go” a good decision? Perhaps it no longer matters much.

I need to think about extensibility of Cajun, the Wiki Creole lexer and parser.

“git first” – Yea, but… This would effectively break the goal of having one script file that just works (and I think that this is more important than having no recent changes and keep files logic in the core).

“How would modules work? Would you simply recompile the wiki?” – this would effectively throw away all of the nice things like Module Updater or Module Bisect (or at least make them much harder to implement). It will also break the potential of a few other ideas that keep floating in my head. In other words, “simply recompile the wiki” is not that simple.

“Would I want to use a templating architecture? People seem to like that.” Yea, but… It seems like if we switch to Perl 6, the core will collapse into a small piece of code, with all of the irrelevant stuff separated into grammars, other classes, or maybe just tiny subroutines. Which brings us to the next point…

“Was ‘printing as we go’ a good decision? Perhaps it no longer matters much.” – It was a good decision, but it really depends on what “as we go” means. What if we parse the wiki text first, before printing anything except the header? Then, once we start processing the parsed text, we can print as we go. That is exactly how it would work in Perl 6 (at least, that is the easiest way): first you feed the wiki text to your grammar and wait for it to be processed – and while that is happening, you can do other things asynchronously, like reading the tag index. Then, once your wiki text is parsed, you decide what to print. This is so much better than what we have now. Want to print a table of contents? Just take the required information from the match object, no dirty hacks needed. In other words, Perl 6 makes it very easy to keep parsing and actions separate, which is beneficial in many ways. And once the actions start running, you can still print as you go – which makes sense: if a page includes some complex search, you probably want to see at least half of the page while it runs.
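The two-phase idea translates to Go as well – a rough sketch, where the goroutine reading a tag index is a made-up stand-in for any side work:

```go
package main

import (
	"fmt"
	"strings"
)

// parse splits wiki text into blocks up front -- the "parse first"
// phase. Here a block is just a paragraph; a real parser would
// return a tree (or, in Perl 6, a Match object).
func parse(text string) []string {
	return strings.Split(text, "\n\n")
}

func main() {
	text := "First paragraph.\n\nSecond paragraph."

	// While parsing runs, other work (e.g. reading a tag index)
	// can proceed concurrently.
	tags := make(chan []string)
	go func() { tags <- []string{"wiki", "design"} }()

	blocks := parse(text)

	// Once parsing is done, rendering can print as it goes.
	for _, b := range blocks {
		fmt.Println("<p>" + b + "</p>")
	}
	fmt.Println("tags:", <-tags)
}
```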

And yeah, if you are seriously thinking about Go, what if we made a comparison table of Perl 6 and Go and the relevant language features? Both languages have some interesting things to offer, for example Perl 6 to JS and Go to JS. And there are definitely some winning points for Go, especially execution time and maturity.

Alex Schroeder Hehe, I think you and Oddmuse: Revolutionary Changes is what started this all! I looked at Oddmuse as a “mission accomplished” project. With the migration of Emacs Wiki I’ve realized that a big site might want to serve its own pages without running a web server. I tried to understand PSGI/Plack in order to have Perl run Oddmuse like a web application. When I saw the wiki tutorial for Go, and how I got a web server up and running in no time, and how easy it was to include some basic markup, I was thrilled! Sure, the code doesn’t have a lot of features. But it beats the SmallestWiki candidates.

Your arguments in favor of some of the core architecture decisions made for Oddmuse are good – and your hopes for Perl 6 and improved parsing are infectious! I’m very much looking forward to Perl 6.

– Alex Schroeder 2015-06-16 06:02 UTC

Alex Schroeder It’s also cool to look at the TinyWiki source code again and rediscover why I rewrote the UseMod Wiki style search-and-replace code with the extensible state machine code we have right now.

OK, so apparently I need to look at my site’s setup again. I hate these sysadmin problems. I would love to not worry about security issues, trusting my GNU/Linux distro to simply do the right thing. But it can’t upgrade my sites’ SSL setup. Currently the architecture does not allow that: each site has its own config files, each site has its own certificates.

For Chrome, the solution is more involved and requires you to visit this page in order to find the identifiers of the ciphers you want to disable, and to start Chrome with the --cipher-suite-blacklist parameter in order to blacklist the ciphers you provide.

Today I saw two posts on Google+ I found interesting. In the first, Rob said that he would like to finance a zine in order to get all the news but not “the Internet”. I said that back when we all just had websites and used ftp to post new stuff, the Internet was a peaceful place!

I wonder about a browser extension that hides all the comments, everywhere. I think those are the source of all grief. Yes, it’s sad! I know of the Herp Derp extension for YouTube. I’m sure there are others!

In another post, people were discussing how we react to provocative posts by others. What if they disabled comments? Are we happy because it saves us from wasting our energy? Are we incensed because the medium implies that we’ll only post what we’re also willing to discuss? That’s certainly how I feel! For myself, this is my daily Google+ challenge: not to post any political stuff. For if I did, I wouldn’t want to read the counterpoints. I still feel the need to rant, though! What to do? Because I also hate posts where I cannot comment, I don’t want to post and disable comments. My solution is to keep the political stuff on Twitter. There, I retweet a lot. Or I’ll write a few well chosen words, add a link and post it. If I’m interested in reading the opinions of others, I can follow them on Twitter and read the stuff they link to. Perfect! I remember people saying that 140 characters was not enough. But now I’m seeing that on G+, having more characters doesn’t improve the experience of politics and religion.

Googling… Stack Overflow has something about webfonts, anti-aliasing, and a note that version 37 will fix this. I guess I hadn’t used Chrome in a long time. Let’s see whether an update fixes this.

Restarting… I am not impressed.

Adding Symbola to the site and to the CSS.
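Roughly, that means serving the font file and listing Symbola as a fallback – a sketch (the paths and exact family names here are assumptions):

```css
/* Paths and family names are illustrative, not my actual setup. */
@font-face {
  font-family: "Symbola";
  src: url("/fonts/Symbola.woff") format("woff");
}

body {
  /* fall back to Symbola for glyphs (smileys) the text font lacks */
  font-family: "Noticia Text", "Symbola", serif;
}
```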

Fiddling some more. Increased the font size. Getting rid of Arial Narrow or whatever it was for header and footer. Getting rid of bold for the moment and replacing it with small caps. WTF am I doing!?

Unhappy… Noticia Text by JM Solé? Should I use the webfonts from the Google network or serve from my own server? More traces being left behind…

Built sfnt2woff to convert ttf files to woff. The files are significantly smaller, now. Using Noticia Text and Symbola by George Douros (for the smileys). Everything seems to work on Chrome. Now anti-aliasing for the header and footer looks a bit crummy on Firefox (Windows). Wow. Such pain.

At home… Things look pretty good on Firefox (OSX). Oh well. I think this is going to be the new look for a while.

Alex Schroeder The gotobar looks OK? Perhaps it was pure chance that it didn’t wrap with the previous font size (and using a narrow font). I wonder why you’re not getting more emojis. I thought Symbola had them all covered. Oh well.

– Alex Schroeder 2014-11-14 16:08 UTC

No, it is distracting. If you change the width of the input fields to 8ex then it will not wrap. Alternatively you can make “Search” and “Matching Pages” text smaller. But please, stop this line wrapping thing. Maybe I could live with it if “Matching pages” thing was under “Search”, but having input fields on the left and on the right just bugs me.
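Something like this would do it (the selectors are hypothetical – the actual gotobar markup may differ):

```css
/* Hypothetical selectors; adjust to the actual gotobar markup. */
.gotobar input[type="text"] {
  width: 8ex; /* narrower fields so the bar fits on one line */
}
```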

“Fail2ban scans log files (e.g. /var/log/apache/error_log) and bans IPs that show the malicious signs – too many password failures, seeking for exploits, etc. Generally Fail2Ban is then used to update firewall rules to reject the IP addresses for a specified amount of time, although any arbitrary other action (e.g. sending an email) could also be configured. Out of the box Fail2Ban comes with filters for various services (apache, courier, ssh, etc).”
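Concretely, each service gets a jail. A typical SSH jail looks roughly like this – the values here are illustrative defaults, not my actual config:

```ini
# /etc/fail2ban/jail.local (illustrative values)
[ssh]
enabled  = true
port     = ssh
filter   = sshd
logpath  = /var/log/auth.log
maxretry = 5
bantime  = 600
```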

Ever since I installed fail2ban, it showed no activity. Until now. Weird!

Is this due to the Shellshock vulnerability? First public disclosure 2014-09-24, activity starting 2014-10-06. It’s weird, though. I thought Shellshock would involve bash scripts run as CGI scripts, called via Apache, but these failures are ordinary SSH login attempts as seen in /var/log/auth.log:

One candidate would be App.net – but their Terms of Service contain a terrible passage: “You agree to defend, indemnify and hold us harmless from and against any and all costs, damages, liabilities, and expenses (including attorneys’ fees, costs, penalties, interest and disbursements) we incur in relation to, arising from, or for the purpose of avoiding, any claim or demand from a third party relating to your use of the Service or the use of the Service by any person using your account, including any claim that your use of the Service violates any applicable law or regulation, or the rights of any third party, and/or your violation of these Terms.”

Twitter recently updated their Privacy Policy and Terms of Service to allow us to buy merchandise from “some of the most popular names on Twitter”, without leaving “the Twitter experience”. Yay!? In one of my anti-capitalist urges, I decided to take another look at Twitter alternatives.

As far as I can tell, there is App.net and GNU Social (formerly Status Net).

App.net offers a Twitter-like experience, API access, and no ads – but it costs money. The free option offers 500 MB of file storage and following up to 40 accounts (plenty to get started). Upgrading costs $36/year, which is an OK price, but I pay just $25/year for my Flickr Pro account… The default client you download, called App.net on my phone, also gives you a totally confusing non-Twitter experience involving Broadcast. I had no idea what this was and switched to Riposte.

You agree to defend, indemnify and hold us harmless from and against any and all costs, damages, liabilities, and expenses (including attorneys’ fees, costs, penalties, interest and disbursements) we incur in relation to, arising from, or for the purpose of avoiding, any claim or demand from a third party relating to your use of the Service or the use of the Service by any person using your account, including any claim that your use of the Service violates any applicable law or regulation, or the rights of any third party, and/or your violation of these Terms.

I don’t think I can use this service! Especially considering this:

These Terms and any action related thereto will be governed by the laws of the State of California without regard to or application of its conflict of law provisions or your state or country of residence. All claims, legal proceedings or litigation arising in connection with the Services will be brought solely in the federal or state courts located in San Francisco County, California, United States, and you consent to the jurisdiction of and venue in such courts and waive any objection as to inconvenient forum.

Remember the abominable, cruel and unusual punishment called Three Strikes? No thank you! If I can choose, California is unacceptable.

Alex Schroeder I just checked it out and it doesn’t seem to be an alternative for my use case: sitting on the train and staring into my iPhone 4. The download page is full of alpha warnings. iOS clients unavailable…