Category Archives: English

Why do trolls regenerate?

For a while I’ve wondered why everyone agrees that trolls can heal their wounds by regenerating. As a monster type, trolls are recognizable by being big, strong and for the most part not particularly bright. Depending on the setting of a given book or game, the trolls may look or behave slightly differently, but they all share a single trait: regeneration. This spans different settings, whether it is Dungeons and Dragons, the Might and Magic series or a dozen other examples. In fact, D&D feels so strongly about this point that fire or acid damage is required to finish them off.

So where did this idea originate, and how come it is so widespread? It’s not mentioned in the old fairy tales, at least not any I can remember. The closest would be the one about the troll who hid his heart someplace else and thus couldn’t be killed. They’re tough? Yes. Hard to kill? Certainly. Often have multiple heads? Ok, most games have glossed over that aspect for some reason. The core attributes are the same, making trolls challenging opponents in whichever form they take. Add the ability to close their wounds and restore their health over time, and they’re also able to make a comeback even when you thought the fight was over.

I did some digging, and part of the answer is probably that Dungeons and Dragons included it. Most, if not all, computer role-playing games are influenced by what D&D created, so it only makes sense that it would spread to other settings. So where did they pick it up? Turns out there’s a book named Three Hearts and Three Lions by Poul Anderson which features regenerating trolls. This seems to be where D&D got the inspiration. The same novel also contains the basis for the alignment system and the paladin class! I don’t know if the story explains why or how trolls gained this trait, and I didn’t find any more details. Still, it looks like Three Hearts and Three Lions was the original source, and that after it was added to D&D it spread further from there.

Testing exceptions in Java

Unit tests usually run a piece of code and verify the output or state to ensure it does what you expected. With exceptions, it gets trickier. Once one is thrown, the test ends abruptly, so how can you make sure that it was really triggered?

To demonstrate various strategies for testing exceptions, I’ve made a small example project in the form of a simple calculator. Most of the tests use plain JUnit4, except one which takes advantage of AssertJ assertions. But before we look at that, we should clarify what we want to accomplish by testing for exceptions. As with all testing, the main goal is to verify the code does what it is supposed to. In this case: throw an exception given a certain state or input. So we want to verify three things:
1. An exception was thrown
2. It was triggered by the state or input we wish to test
3. It was the error we expected

The example calculator is capable of adding and subtracting numbers. We will ignore the implementation for now, assume it is sane, and focus on the tests. It has a special rule, though: it should only add positive numbers together. For negative numbers, the corresponding subtraction should be used instead. So if anything fails to follow this business rule, I want the add() method to throw an exception.
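The calculator might look something like this minimal sketch. The class and method names, the constructor (which, as we’ll see later, insists on a capitalized name) and the error messages are all my own guesses, not the example project’s actual code:

```java
// Minimal sketch of the example calculator (names and messages are my own;
// the real example project may differ).
public class Calculator {

    private final String name;

    // The post later reveals the constructor is picky about capitalization.
    public Calculator(String name) {
        if (name.isEmpty() || !Character.isUpperCase(name.charAt(0))) {
            throw new IllegalArgumentException("Name must start with a capital letter");
        }
        this.name = name;
    }

    // Business rule: only positive numbers may be added.
    public int add(int a, int b) {
        if (a < 0 || b < 0) {
            throw new IllegalArgumentException("Cannot add negative numbers");
        }
        return a + b;
    }

    public int subtract(int a, int b) {
        return a - b;
    }
}
```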

The first approach is covered in the first test suite. It contains some normal tests to ensure the calculator works as intended and one to verify it throws an exception when adding negative numbers. The latter is annotated with @Test(expected = IllegalArgumentException.class). This tells the test runner that the test should throw an exception of the specified type. The main problem here is that the annotation covers the whole test, which means that if any line throws such an exception the test will still pass. If we comment out the last line, calculator.add(1, -1);, we might expect the test to fail since it’s no longer adding anything, but to our surprise it still passes! Sounds like something else in the test is triggering an exception, but it’s hard to tell, since it doesn’t seem possible to verify the error message with the annotation. Thus, it only succeeds on point 1, but fails on 2 and 3. As soon as you do more than one thing in a test, you can no longer be sure which of the statements triggered the exception.
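A sketch of the pitfall described above (all names are my own, and a minimal stand-in Calculator is inlined so the snippet is self-contained; the real suite is in the example project):

```java
import org.junit.Test;

public class AnnotatedCalculatorTest {

    // Inlined stand-in for the example project's calculator.
    static class Calculator {
        Calculator(String name) {
            if (!Character.isUpperCase(name.charAt(0))) {
                throw new IllegalArgumentException("Name must start with a capital letter");
            }
        }
        int add(int a, int b) {
            if (a < 0 || b < 0) {
                throw new IllegalArgumentException("Cannot add negative numbers");
            }
            return a + b;
        }
    }

    // The annotation covers the WHOLE method: the lowercase name makes the
    // constructor throw IllegalArgumentException, so this test passes even
    // if the add() call is commented out.
    @Test(expected = IllegalArgumentException.class)
    public void addingNegativeNumberThrows() {
        Calculator calculator = new Calculator("calculator");
        calculator.add(1, -1);
    }
}
```

Comment out the add() line and the test still passes, which is exactly the problem: we cannot tell which statement triggered the exception.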

Since the annotation seemed too broad, let’s try to focus more on what we are trying to test. Ultimately, we want to know if the statement calculator.add(1, -1); throws an exception. So how do we normally deal with exceptions? Try-catch, of course. The next test shows a quite common pattern with several variations, but the core concept is that we do some setup, then call the statement we wish to test inside a try block and assert that we got the exception we wanted. Of course, we also need to keep an eye out for other possible outcomes, so there are two additional checks to mark the test as a failure if another exception, or none at all, is thrown. Without them, the test would still pass even though it didn’t trigger the exception we want.

When running this test it is easier to see why the annotated test kept passing earlier: the constructor is rather picky and expects the name to start with a capital letter. Once we’ve fixed that, the test works as expected. Actually, the constructor should be called outside the try block, since it is only part of the arrangement, setting up the necessary prerequisites for the test. The core of the test is the add() method, so we should have as little else as possible inside the try-catch. If an exception is thrown in the setup, the test should of course fail, because we didn’t achieve the necessary state to test our specification.
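Put together, the try-catch pattern might look like this sketch (my own names again, with an inlined stand-in Calculator; the asserted message is an assumption, not the real project’s):

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;
import org.junit.Test;

public class TryCatchCalculatorTest {

    // Inlined stand-in for the example project's calculator.
    static class Calculator {
        Calculator(String name) {
            if (!Character.isUpperCase(name.charAt(0))) {
                throw new IllegalArgumentException("Name must start with a capital letter");
            }
        }
        int add(int a, int b) {
            if (a < 0 || b < 0) {
                throw new IllegalArgumentException("Cannot add negative numbers");
            }
            return a + b;
        }
    }

    @Test
    public void addingNegativeNumberThrows() {
        // Arrange OUTSIDE the try block: a failure here should fail the test.
        Calculator calculator = new Calculator("Calculator");
        try {
            calculator.add(1, -1); // the single statement under test
            fail("Expected IllegalArgumentException from add(1, -1)");
        } catch (IllegalArgumentException e) {
            // Goal 3: verify it was the error we expected.
            assertEquals("Cannot add negative numbers", e.getMessage());
        } catch (Exception e) {
            fail("Unexpected exception: " + e);
        }
    }
}
```

Note the two safeguards: the fail() after the call catches the no-exception case, and the second catch block catches the wrong-exception case.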

This way, we know the exception was thrown, we’ve limited the code in the try block to just the method call we want to test, and we inspect the error message to verify what we got. In other words, the try-catch pattern accomplishes all three goals we established at the start. However, it is a bit cumbersome to set up, with all the try-catch boilerplate each time we want to test an exception. Worst case, we create an incomplete test which misses a case without reporting the test as failing.

For an alternative use of this pattern, see the next example. It uses a boolean flag to ensure that we don’t leave the test without asserting properly. I think this is somewhat better, but we still need to write a lot of boilerplate and end up introducing a new variable.
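The flag-based variation might be sketched like this (same caveats as before: my own names and an inlined stand-in Calculator):

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class FlagCalculatorTest {

    // Inlined stand-in for the example project's calculator.
    static class Calculator {
        int add(int a, int b) {
            if (a < 0 || b < 0) {
                throw new IllegalArgumentException("Cannot add negative numbers");
            }
            return a + b;
        }
    }

    @Test
    public void addingNegativeNumberThrows() {
        Calculator calculator = new Calculator();
        boolean exceptionThrown = false;
        try {
            calculator.add(1, -1);
        } catch (IllegalArgumentException e) {
            exceptionThrown = true;
            assertEquals("Cannot add negative numbers", e.getMessage());
        }
        // The flag guarantees we cannot silently fall through the try block.
        assertTrue("Expected add(1, -1) to throw", exceptionThrown);
    }
}
```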

While I think the try-catch pattern is a step in the right direction, I’m not too happy with the need to add the same extra lines as safeguards over and over. Luckily, I found that JUnit (version 4.7 and newer) comes with a built-in rule to make this easier: ExpectedException. The rule is defined at the top of the test class and basically says that, by default, no exceptions should be thrown by the tests. But where you want an exception to be triggered, the rule can be instructed to look for the expected exception type and error message. These instructions can be placed after all the setup, so that they stay separated from the minimal section we wish to test.
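A sketch of the rule in use (names and the inlined stand-in Calculator are mine; expect() and expectMessage() are the rule’s actual JUnit 4.7+ API):

```java
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;

public class RuleCalculatorTest {

    // Inlined stand-in for the example project's calculator.
    static class Calculator {
        int add(int a, int b) {
            if (a < 0 || b < 0) {
                throw new IllegalArgumentException("Cannot add negative numbers");
            }
            return a + b;
        }
    }

    // By default, the rule expects no exception from any test.
    @Rule
    public ExpectedException thrown = ExpectedException.none();

    @Test
    public void addingNegativeNumberThrows() {
        Calculator calculator = new Calculator(); // all setup first
        thrown.expect(IllegalArgumentException.class);
        thrown.expectMessage("negative");
        calculator.add(1, -1); // only code after the expectations may throw
    }
}
```

Note that expectMessage() matches a substring of the error message, so the expectation does not need the full text.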

This guarantees that the exception is triggered where we wanted it (goal #2), as well as specifying the type and error message (goal #3). If it doesn’t encounter an exception matching the expected criteria, it will mark the test as a failure, thereby fulfilling goal #1. All in all, it does an excellent job of fulfilling all three requirements.

The examples above all use JUnit4, but I also looked for other solutions. I found AssertJ, a fluent assertion framework which contains a lot of useful things. (It started out as a fork of FEST-Assert, for those more familiar with that.) The last test contains an example demonstrating how it can deal with exceptions.

The code which should throw the exception is placed inside a lambda expression, which makes it possible to observe and verify the result from the outside. In terms of separation, this is perhaps one step further than the other examples, since we know that the exception can only be triggered by the code we place inside the lambda. We can also inspect it, looking at the type and the error message. This gives us full control to verify the exception once it has been thrown, as well as clearly separating the section we expect to throw something from the other parts of the test.
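With AssertJ, the lambda-based test can be sketched like this, using assertThatThrownBy (present in recent AssertJ versions; the names and inlined stand-in Calculator are mine):

```java
import static org.assertj.core.api.Assertions.assertThatThrownBy;
import org.junit.Test;

public class AssertJCalculatorTest {

    // Inlined stand-in for the example project's calculator.
    static class Calculator {
        int add(int a, int b) {
            if (a < 0 || b < 0) {
                throw new IllegalArgumentException("Cannot add negative numbers");
            }
            return a + b;
        }
    }

    @Test
    public void addingNegativeNumberThrows() {
        Calculator calculator = new Calculator();
        // Only the lambda body can trigger the exception we verify.
        assertThatThrownBy(() -> calculator.add(1, -1))
                .isInstanceOf(IllegalArgumentException.class)
                .hasMessageContaining("negative");
    }
}
```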

In conclusion, I prefer ExpectedException because it gives you the greatest amount of control and readability when testing exceptions. The annotation can lead to brittle tests if they contain more than one line or method call. Setting up try-catch each time seems too cumbersome, plus I fear it is far too easy to write a bad test if you forget to add one of the safeguards. I liked the AssertJ approach though, and will consider using it for future projects.

And as a bonus at the end, there is an interesting proposal in the JUnit bug tracker on something similar to what AssertJ does, which means it might become available in JUnit someday.

A comment on comments

As the observant reader might notice, there’s currently no way to add a comment at the end of this post. In fact, it’s not possible to add comments to any of the posts. What’s going on?

The comment section was initially enabled to facilitate feedback and discussion. In practice there hasn’t been any of either. There have been plenty of comments, though, which I’ve had to mark as spam from time to time. Recently, there’s been a sharp increase, and since I don’t get any interesting comments I found it preferable to just disable comments altogether.

Thus, the comment section has been disabled for the time being. It may return in the future, but I would need to find a better solution than the current one.

The First Law trilogy by Joe Abercrombie

I recently finished reading The First Law trilogy by Joe Abercrombie, which consists of “The Blade Itself”, “Before They Are Hanged” and “Last Argument of Kings”. While a fantasy series, it is also part of the grimdark subgenre. As can be guessed from the name, grimdark is darker and grittier than “normal” fantasy. Rather than a classical good-versus-evil story told with clear black-and-white characters, the characters come in varying tones of grey. It is comparable to George R.R. Martin’s A Song of Ice and Fire (also known as Game of Thrones), where there are really no clear-cut good guys.

This is evident in one of the main characters in the series, Inquisitor Glokta. He used to be an officer, but after being captured and tortured in a war, he has now turned to torturing others. I’ve seen him compared to Blackadder in other reviews, and while I don’t fully agree with this, I can certainly see the similarities. I would rather compare him with Dr. House, since he’s smart and capable at what he does, though constantly in pain. Glokta easily has some of the best lines in the books, and his inner monologues are a thrill to follow. He especially shines in the second book, “Before They Are Hanged”, where he is tasked with running a city while investigating why his predecessor vanished. Oh, and the city is besieged by an army much stronger than any defence they might be able to put up.

The two other main characters are Logen Ninefingers and Jezal dan Luthar. Luthar is a young officer who is training for the annual fencing contest, hoping to win fame and glory. While busy practicing and spending his evenings playing cards, he is eventually dragged into a quest for an object which might change the fate of the world. Logen is a barbarian from the north who has been a warrior for most of his life. In addition to his skill in battle, he is able to summon and talk to spirits. After being separated from his group of fighters and assuming they have perished, he heads south. Shortly after, he is called upon by Bayaz, the First Magus, who has use for someone who can talk to the spirits.

Bayaz is a powerful wizard who has played a vital part at several points throughout the history of the world. The backstory is presented through various means (including a play!), and helps both explain what has happened earlier and show how historical events affect the present. He is a wise old man, but can also be intimidating in his displays of magical power. I find it interesting how he fills a similar role to Gandalf, yet does things Gandalf would never do. This is one of the fun things about the books: how the author plays with our preconceptions of how the story will progress. An example of this is that the very first chapter literally ends with a cliff-hanger.

All in all, quite interesting books. Abercrombie has also written some standalones which take place in the same universe, which I look forward to checking out.

Robert Jordan and Brandon Sanderson – A Memory of Light

By popular demand, my thoughts on the final volume in the Wheel of Time series: A Memory of Light. With 14 books and one prequel novel in total, it is one of the longest fantasy series, and it has now come to an end. When Robert Jordan passed away, a lot of us were worried we would never know what happened to the group of friends who had to flee their home town all those years ago, but now the last book is out. And so, more than a decade after I picked up the first book, I’ve finished the last one.

Being the final book in a long series, there is a lot happening. Factions clash, prophecies are fulfilled and the fate of the world is determined. It all culminates in Tarmon Gai’don, the Final Battle, which gets a nearly 200-page chapter dedicated to it. Much of the series has been leading up to this event, and most of the plot threads are resolved, while some new mysteries are introduced. (Those who have read it know what, or rather who, I’m talking about.)

It was really nice to see the series finished, and I think Sanderson did a great job wrapping it up. It is one of the best series I have read, which is a bit ironic since I initially gave up on it merely a chapter or two in. When I picked it up a second time though, I couldn’t figure out why I had abandoned it. I really enjoyed the characters, the varied cultures they encounter in different parts of the world, the magic system and the glimpses into the lost glory and wonders of the Age of Legends.

Mystery is important, and so are stories

Most who played The Longest Journey will probably remember unexpectedly stumbling across the name of its writer inside the game. At the entrance of a movie theater, the player can look at a movie poster for “A Welsh Ghost Story, written and directed by Ragnar Tørnquist”. I always wondered whether this was a reference to something he had actually written, foreshadowing something to come later or simply his way of inserting his name into the story.

Then some years went by, the sequel Dreamfall was released, and while I eagerly awaited Dreamfall Chapters, due to be released this fall, I had mostly forgotten about this little cameo. Then, the other day, Ragnar Tørnquist posted a link to a screenplay called “In the Dark Places”. And the interesting part is that its working title had been “A Welsh Ghost Story”! So not only does it exist, but we’ll get to read it as well. He also posted another story, “Rules are Rules”, which he now considers a sort of precursor to The Longest Journey. I’ve only skimmed parts of them so far, but both look like interesting reads.

I’ve spent ages looking for a music video and I finally found it!

Those of you who know me might know I’ve been searching for a music video I remember seeing when I was younger. Over the years, I’ve asked friends from time to time whether they might know it, in the hope that someone would recognize it. However, all I had to go on were vague, half-remembered details, since I did not remember the name of the artist, the title of the song or any of the lyrics. This of course made it hard to search for, not to mention difficult to explain to others.

What I did remember was some key scenes from the music video and roughly the time period it was from. The description would go something like this:

  • The video was more than six minutes long, and due to its length only parts of it were shown on “Topp 20” (a TV show which presented the 20 best-selling singles in Norway each week).
  • It charted sometime around 1994-96. Which is a pretty wide range, but about all I could pin down. (Other songs I remember from that time period include U2’s “Hold me, thrill me, kiss me, kill me” and Nick Cave and Kylie Minogue’s “Where the wild roses grow”.)
  • It featured, among other things: some exploding barrels, a cave and a mask (some sort of treasure hunt?), and a man attempting to save a woman in a raft approaching a waterfall. The man has climbed a nearby tree and tries to catch her when the raft passes beneath an overarching branch.

As I mentioned, I’ve asked various people about this over the years, and while some have said it sounded familiar, no one really knew what it could be. I have pondered various approaches, including going through each and every entry which charted in that time period. The obvious problem is that it would take an enormous amount of time to go through them all. We are talking about a list with twenty entries which was updated weekly, which effectively means 20×52 songs per year, and while songs staying on the list for multiple weeks would bring the number down somewhat, it is still a large number. And even disregarding the time it would take to watch them all, although the lists are posted online these days along with plenty of the music videos, there is no guarantee the video I was looking for would be available anywhere.

So I haven’t really searched all that actively lately, though the other day I ran across a review listing the top 10 music videos from the ’90s. Since I didn’t want to get my hopes up, I mainly watched it for fun, expecting to see some songs I had completely forgotten about. And then, when talking about one of the songs, the reviewer mentioned “Oh, and by the way, this video had a sequel” and showed some clips which seemed eerily familiar. So I looked up the artist on Wikipedia, skimmed the list of singles released around that time period, and checked whether I could find the music videos for them. And indeed, after roughly 17 years, I had randomly stumbled across the music video I was looking for.

The video in question? Meat Loaf’s “I’d lie for you (and that’s the truth)”. In retrospect, maybe not the best song in the world, but I was really happy to finally solve an old puzzle and see it again.

Debian bug squashing party

Last weekend I attended a Debian bug squashing party, organized by NUUG, Skolelinux and Bitraf. In other words, roughly nine people gathered in front of their computers in the same room, trying to fix bugs and make Debian better.

First we were introduced to some of the tools and how to interact with the Debian BTS. Then we looked at the list of Release Critical bugs currently affecting Debian. At the time, there were more than 1000 bugs which would prevent a new release. Since this is too many (Debian requires the number to drop to zero before making a release), we took a look at some of them.

First we looked at a bug report about a program crashing at startup, while getting to know our way around the BTS. We all tested to see if we could reproduce the issue in various environments. I was the only one who got the crash, in my virtual machine running Sid (yay!). However, the exact same version of the package would not crash on Ubuntu Saucy, so the underlying issue was assumed to reside in one of the dependencies. We gathered a list of which versions/environments we had tested along with the results, and a diff of the changes in dependencies from a working version to the crashing one. We submitted this as a comment to the bug report.

Next up, we looked at various bugs which had been filed as a result of failing rebuilds. A lot of them had a common cause: compilers have become stricter about imports, so some programs need to explicitly import libraries the compiler would add automatically in the past. One bug was picked as an example, and we all looked into it in parallel, attempting to patch it and get it to build. Related to this, we went through the process of installing dependencies, building the package, generating a diff and adding it as a proper patch.

After getting acquainted with the various tools and parts, we were let loose, each tasked with finding a similar bug and hopefully fixing it by the end of the day. After some back-and-forth, I got a working patch for one of the bugs and submitted it. (Looks like another patch was used instead, but it also looks better than mine. Anyways, the important thing is that the package is now working again.) For a full list of the bugs we looked at, see here.

All in all, it was a fun and nice experience. I had looked at most of the tools previously, but it was nice to have people around who were more familiar with them and could answer questions when someone ran into issues. I was also pleasantly surprised by how easy (relatively speaking) it was to fix an issue, even an FTBFS one, in packages I had never heard about.

My list of virtual machines

Thought I’d share the setup I have for virtual machines, how I use them to triage bugs and experiment with various software.

First a small digression, since the observant reader will notice I am using VirtualBox. When I first discovered and started playing around with virtual machines, I had a computer incapable of hardware-supported virtualization. I discovered this rather quickly, since every virtualization solution I tried failed to work because they all required specific CPU features. After testing several solutions, I settled on VirtualBox because it also supported software-based virtualization. I’ve since replaced that machine, and while my current computer supports hardware-assisted virtualization, I’m still using VirtualBox as it is straightforward and I am familiar with it. I did briefly try a couple of other solutions when I got my new computer, but didn’t find any obvious advantages they had over sticking with my existing setup.

Now, the machines. I have a set of the currently supported Ubuntu releases, organized by their code names. (Yes, I’m aware 11.04 reached end of life a while back.) They come in handy when confirming bugs or trying to track down which release something broke (or got fixed). My main use case is: load up the relevant release a bug was reported against, verify it is reproducible there, and then check whether it is also present in the latest development release.

All are kept more or less up to date, to make sure I have the latest versions of libraries and other software when attempting to reproduce bugs. When I started triaging bug reports I used to simply install the software on my main system and check if the bug was reproducible there, but I quickly changed my approach for several reasons. Mainly because my main system wouldn’t easily allow me to test with multiple releases, but also in case my setup or set of installed packages would produce a different result than a system out of the box. The latter may not always be relevant, but there are cases where it matters. For instance, say a program fails to run without a specific library that is not declared as a dependency; since I had already installed the library for other reasons, I wouldn’t be able to reproduce the issue. In cases like that, it makes more sense to check what happens on a system out of the box.

In addition to the Ubuntu releases, I also run a couple of other systems. Arch Linux is nice, and since it is a rolling-release distribution it usually includes the latest versions of programs and libraries before most other distros. It’s ideal for testing whether projects still work as expected with the latest versions of their dependencies, or for trying out features in newer versions of programs. When a new version of a library or compiler is released, it’s really convenient to be able to catch any issues early, before it ends up in the stable versions of other distributions. In addition, Arch has a rather different philosophy and approach compared to Ubuntu, which is interesting to explore.

The Debian machine is running Sid (unstable), for most of the same reasons as Arch: being able to test the latest versions of projects, plus it will eventually turn into the next releases of Debian, Ubuntu and related derivatives. As Ubuntu is based on Debian, it is of course also relevant for checking whether bugs are reproducible in both places, in case they should be forwarded upstream. As Debian is currently in freeze for the upcoming Wheezy release, there aren’t many updates these days, though.

Oh, and there’s a Windows 8 preview I tried out when it became available. I used it some when it was announced, and I’m pretty sure it will expire soon.

A Memory of Light released

“A Memory of Light” is the fourteenth and final book of the Wheel of Time, an epic fantasy series. It was released earlier today, and I’ve already picked up my copy which I had preordered through Outland.
Originally, the title was intended for the twelfth book. Then it grew too large and was split into three: “The Gathering Storm” (2009), “Towers of Midnight” (2010) and this final volume. It is also the third book Brandon Sanderson has finished since Robert Jordan passed away in 2007. It was sad to see the original author pass away before he had the chance to finish the series. On the other hand, I think Sanderson has done a great job with the last books. He is also one of the best authors I have discovered over the last few years, and I am not convinced I would have done so had he not been chosen to finish the Wheel of Time. He has now taken a step back to focus on his own series and books again, and it looks like he already has plans to keep himself busy for a while.

Almost 23 years and thousands of pages since the first book, it is time to finally figure out how this story ends… At the end of the previous book most of the characters were joining forces and preparing for Tarmon Gai’don, The Last Battle. This is the event the books have been leading up to; the final confrontation between good and evil.