Sunday, September 27, 2015

Perkun 0.0.9 released!

Perkun 0.0.9 has been released! It contains the multiple classes mentioned previously that facilitate applying heuristics. It also contains a new class, dump_controller, inherited by optimizer_with_all_data, which controls dumping the data in verbose mode. It is possible to redirect various data to XML or text files, to stdout, to stderr, or to a single file. Take a look at the constructor dump_controller::dump_controller in the sources.

Monday, September 21, 2015


There is a Perkun port in Java. There are two JARs available for download:
  • JPerkun.jar - the Perkun port to Java (a library)
  • Perkun.jar - a Java application based on JPerkun
They both contain the source code. In order to run an example, execute:

java -jar Perkun.jar example.perkun

Multiple features are not yet implemented, though. The primary functionality should be there.

Sunday, September 13, 2015

Action iteration controller - Perkun 0.0.9.

You may have noticed that the feature described recently for Perkun 0.0.9 allows breaking the computation at any moment. This ability is not so useful, though, if we process the actions in a strict, a-priori defined order. A new class has been introduced to relax this order: action_iteration_controller. It is derived from class_with_tracking, and so can benefit from its stack of trackers. It contains three virtual functions:

virtual std::list<action*>::const_iterator get_action_iteration_begin() const;

virtual void action_iteration_increment(std::list<action*>::const_iterator & target) const;

virtual bool get_action_iteration_terminated(const std::list<action*>::const_iterator & i) const;

It is inherited by the class optimizer. Whenever get_optimal_action iterates over the actions, it uses these three functions. In your own programs you are free to override them to provide a new (possibly random) order of action iteration. And you can base these new orders (permutations of actions) on the knowledge stored in the stack of trackers!
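As a rough illustration of the idea, a subclass could walk a pre-shuffled copy of the action list. The types below are simplified stand-ins, not the real Perkun headers; only the three function names mirror the ones above:

```cpp
#include <algorithm>
#include <cassert>
#include <list>
#include <random>
#include <vector>

// Simplified stand-in for Perkun's action type (an assumption, not the real API).
struct action { int id; };

class action_iteration_controller {
public:
    virtual ~action_iteration_controller() = default;
    virtual std::list<action*>::const_iterator
        get_action_iteration_begin() const = 0;
    virtual void action_iteration_increment(
        std::list<action*>::const_iterator & target) const = 0;
    virtual bool get_action_iteration_terminated(
        const std::list<action*>::const_iterator & i) const = 0;
};

// A controller that iterates over a pre-shuffled copy of the action list.
class shuffled_iteration : public action_iteration_controller {
    std::list<action*> shuffled;
public:
    explicit shuffled_iteration(const std::list<action*> & original) {
        std::vector<action*> v(original.begin(), original.end());
        std::mt19937 rng(42); // fixed seed for reproducibility
        std::shuffle(v.begin(), v.end(), rng);
        shuffled.assign(v.begin(), v.end());
    }
    std::list<action*>::const_iterator
        get_action_iteration_begin() const override { return shuffled.begin(); }
    void action_iteration_increment(
        std::list<action*>::const_iterator & target) const override { ++target; }
    bool get_action_iteration_terminated(
        const std::list<action*>::const_iterator & i) const override {
        return i == shuffled.end();
    }
};

// Walk the actions the same way get_optimal_action would, counting them.
int visited_count(const action_iteration_controller & c) {
    int n = 0;
    for (auto i = c.get_action_iteration_begin();
         !c.get_action_iteration_terminated(i);
         c.action_iteration_increment(i))
        ++n;
    return n;
}
```

Whatever permutation the controller produces, the loop still visits every action exactly once.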

Saturday, September 12, 2015

Heuristics in Perkun 0.0.9

In the (yet unreleased) Perkun 0.0.9 there will be a new mechanism to control the loop in the function get_optimal_action.

The optimizer will inherit from a class called "class_with_tracking", which (as an attribute) will contain a stack of trackers. Each tracker will have access to the currently best action, the current action, and the current best result. Based on the stack of trackers the program will decide how to proceed in the loop. There are three possible decisions:
  • NONE
  • BREAK
  • CONTINUE
NONE means we proceed with the current action. BREAK means we terminate the execution (we are happy with the current best action), and CONTINUE means we skip the current action but proceed with the loop.

The user embedding Perkun in their own programs will be free to override the virtual function:

class_with_tracking::decision class_with_tracking::make_decision(const std::list<tracker*> & s)

The decision returned by it will be precisely NONE, BREAK, or CONTINUE. In the default implementation it is NONE, meaning an exhaustive search of the game tree.
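A sketch of the mechanism, with the tracker reduced to a single field (the real Perkun tracker carries more state; the threshold heuristic is my own example, not part of the library):

```cpp
#include <cassert>
#include <list>

// Simplified stand-in: the tracker remembers the best result found so far.
struct tracker {
    double current_best_result;
};

enum class decision { NONE, BREAK, CONTINUE };

class class_with_tracking {
public:
    virtual ~class_with_tracking() = default;
    // Default implementation: NONE, i.e. exhaustive search of the game tree.
    virtual decision make_decision(const std::list<tracker*> & s) const {
        (void)s;
        return decision::NONE;
    }
};

// A heuristic subclass: stop searching once the best result is "good enough".
class satisficing_optimizer : public class_with_tracking {
    double threshold;
public:
    explicit satisficing_optimizer(double t) : threshold(t) {}
    decision make_decision(const std::list<tracker*> & s) const override {
        if (!s.empty() && s.back()->current_best_result >= threshold)
            return decision::BREAK; // happy with the current best action
        return decision::NONE;      // keep examining the current action
    }
};
```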

Friday, September 11, 2015

Perkun API. The variables.

In Perkun there are three classes representing variables:
  • perkun::hidden_variable
  • perkun::input_variable
  • perkun::output_variable
They all inherit from the class perkun::variable. Each variable has a list of allowed values. You might need to use these classes when you embed Perkun in your own programs.
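A minimal sketch of the hierarchy described above; the real perkun::variable classes have a richer interface, so treat the member names here as assumptions:

```cpp
#include <cassert>
#include <cstddef>
#include <list>
#include <string>

class value {
    std::string name;
public:
    explicit value(const std::string & n) : name(n) {}
    const std::string & get_name() const { return name; }
};

class variable {
    std::string name;
    std::list<const value*> allowed_values; // each variable lists its legal values
public:
    explicit variable(const std::string & n) : name(n) {}
    virtual ~variable() = default;
    void add_value(const value * v) { allowed_values.push_back(v); }
    std::size_t amount_of_values() const { return allowed_values.size(); }
};

// The three kinds of variables share the base-class behavior.
class hidden_variable : public variable { using variable::variable; };
class input_variable  : public variable { using variable::variable; };
class output_variable : public variable { using variable::variable; };
```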

Thursday, September 10, 2015

Perkun interactive commands.

While executing Perkun's loop instruction (in the interactive mode) it is possible to give a special command rather than the list of input variable values. There are three such commands:
  • verbose
  • silent
  • reset
Verbose switches Perkun to verbose mode and silent does the reverse. In verbose mode Perkun's decision is "explained", i.e. the expected values of the payoff are given for all the decisions being considered. The reset command destroys the current belief, so that it is replaced by an a-priori (uniform) belief.

If you are curious how the commands work, take a look at the function optimizer_with_all_data::get_input. This function is virtual and is likely to be overridden in your own programs based on Perkun.
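A sketch (not the real optimizer_with_all_data::get_input) of how the three commands could be dispatched before reading variable values:

```cpp
#include <cassert>
#include <sstream>
#include <string>

enum class command { VERBOSE, SILENT, RESET, VALUES };

// Classify one interactive line: a special command, or a list of values.
command parse_interactive_line(const std::string & line) {
    std::istringstream in(line);
    std::string word;
    in >> word;
    if (word == "verbose") return command::VERBOSE; // explain decisions
    if (word == "silent")  return command::SILENT;  // stop explaining
    if (word == "reset")   return command::RESET;   // back to uniform belief
    return command::VALUES; // otherwise: input variable values
}
```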

Wednesday, September 9, 2015

Perkun 0.0.8

A new Perkun version has been released: 0.0.8. It contains a man page. Just install it and type:

$ man perkun

Apart from that there is an important enhancement in the API:

void perkun::optimizer_with_all_data::set_apriori_belief(belief * b) const;

It can be overridden to fill the belief with an a-priori distribution (this is useful in programs based on Perkun).
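The idea behind the override can be sketched as follows; the belief type here is a simplified stand-in (a plain probability vector), not the real perkun::belief:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Simplified stand-in for perkun::belief: a distribution over hidden states.
struct belief {
    std::vector<double> probability; // one entry per hidden state
};

// A natural default a-priori belief: uniform over all hidden states.
void set_uniform_apriori_belief(belief * b, std::size_t amount_of_states) {
    b->probability.assign(amount_of_states, 1.0 / amount_of_states);
}
```

An override could just as well fill in any other distribution, e.g. one learned from earlier runs.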

Monday, September 7, 2015

Perkun. The Haskell generators.

Try using the instruction "cout << haskell generator << eol;" in Perkun. For example:

value false;
value true;
value place_Wyzima;
value place_Novigrad;
value place_Shadizar;
value do_nothing;
value attack_vampire;
input variable do_I_see_vampire:{false,true};
input variable where_am_I:{place_Wyzima,place_Novigrad,place_Shadizar};
hidden variable where_is_vampire:{place_Wyzima,place_Novigrad,place_Shadizar};
output variable action:{do_nothing,attack_vampire};
cout << haskell generator << eol;

It will print out Haskell code generating the model. The idea is similar to the Prolog generators described previously. You may enhance the generated code to create the actual model.

Sunday, September 6, 2015

Perkun API. Optimizer classes.

Perkun provides an API allowing you to extend it in your own programs. The central class containing the Perkun algorithm is perkun::optimizer.

It is possible to use Perkun just by instantiating perkun::optimizer. However, this would not be very convenient. The Perkun parser requires an instance of the class perkun::optimizer_with_all_data, which contains:
  • perkun::collection_of_values
  • perkun::collection_of_variables
  • perkun::collection_of_visible_states
  • perkun::collection_of_actions
Its base class relies only on references to instances of these classes. It is recommended to instantiate a class inherited from perkun::optimizer_with_all_data and redefine its virtual functions. The two most important are:

virtual void get_input(std::map<variable*, value*> & m, bool & eof);
virtual void execute(const action * a);

get_input fills the map with (variable*, value*) pairs and sets the boolean flag eof when the input ends, while execute executes the optimal action chosen by the Perkun algorithm. See the Perkun Wars project to learn how these functions work in the class npc. They use pipes to communicate with the parent process.
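The shape of such a subclass can be sketched like this. The surrounding types are simplified stand-ins (the real ones live in <perkun.h>), and the base class is commented out so the sketch stands alone:

```cpp
#include <cassert>
#include <iostream>
#include <map>
#include <string>

// Simplified stand-ins for the Perkun types used by the two virtual functions.
struct variable { std::string name; };
struct value    { std::string name; };
struct action   { std::string name; };

class my_optimizer /* : public perkun::optimizer_with_all_data */ {
public:
    virtual ~my_optimizer() = default;

    // Fill the map with (variable*, value*) pairs; set eof when input ends.
    virtual void get_input(std::map<variable*, value*> & m, bool & eof) {
        m.clear();
        eof = true; // this sketch has no real input source
    }

    // Execute the optimal action chosen by the Perkun algorithm.
    virtual void execute(const action * a) {
        std::cout << "executing " << a->name << "\n";
    }
};
```

In a real program get_input would read from stdin or a pipe, and execute would act on the environment (in Perkun Wars, the npc class writes to a pipe read by the parent process).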

Friday, September 4, 2015

Avoiding surprises with embedded Perkun.

I have already mentioned that in some cases Perkun may get "surprised". This happens when the model says that some input values are impossible. See the code:

value false, true;
value hello;

input variable what_I_can_see:{false, true};
output variable action:{hello};

set({what_I_can_see=>false}, {action=>hello}, {what_I_can_see=>false}, 0.0);
set({what_I_can_see=>false}, {action=>hello}, {what_I_can_see=>true}, 1.0);
set({what_I_can_see=>true}, {action=>hello}, {what_I_can_see=>false}, 1.0);
set({what_I_can_see=>true}, {action=>hello}, {what_I_can_see=>true}, 0.0);

The input values must alternate: false, true, false, true, and so on. If you tell Perkun it gets "false" twice, it gets surprised. You probably want to avoid this situation when you embed Perkun in your own programs. In order to do that, you should redefine the virtual function:

void optimizer::on_error_in_populate_belief_for_consequence(const belief & b1, const action & a, const visible_state & vs, belief & target) const;

You must do it in the new class inherited from perkun::optimizer_with_all_data (see the previous post). In Perkun Wars I redefine it in the class npc. Instead of throwing an error I call make_uniform on the target belief. This makes Perkun assume a reasonable belief distribution once it gets "surprised".

Another way is to build the model without zeros in the set instructions. You could replace them with, for example, 0.01. Then nothing is impossible.
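The "make the belief uniform on surprise" fallback can be sketched like this; belief is again a simplified stand-in for perkun::belief:

```cpp
#include <cassert>
#include <vector>

// Simplified stand-in for perkun::belief.
struct belief {
    std::vector<double> probability;

    // Replace whatever distribution is there with the uniform one.
    void make_uniform() {
        if (probability.empty()) return;
        const double p = 1.0 / probability.size();
        for (double & x : probability) x = p;
    }
};

// Instead of throwing on an "impossible" observation, fall back to uniform.
void recover_from_surprise(belief & target) {
    target.make_uniform();
}
```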

Thursday, September 3, 2015

Perkun as a library.

Imagine you would like to use Perkun as a library in your own project. It is possible! Take a look at my game, Perkun Wars.

In your you have to add the line:

PKG_CHECK_MODULES([PERKUN], [libperkun >= 0.0.7])

In src/ you add the flags defined by this macro: PERKUN_CFLAGS to AM_CXXFLAGS and PERKUN_LIBS to your program's LDADD. For example:

AM_CXXFLAGS = $(PERKUN_CFLAGS) \
    -I.. -I../inc

perkun_wars_LDADD = $(PERKUN_LIBS)

In your header file you need to include <perkun.h>. See the file inc/perkun_wars.h.

Define a new class inheriting from perkun::optimizer_with_all_data. In Perkun Wars it is the class npc.

Instantiate the class and run the Perkun interpreter (see the function main_perkun in the src directory). You run the interpreter by calling the function "parse".

The function main_perkun runs in a separate process, because the Perkun specification uses the loop instruction. But pipes are in place, and the redefined virtual functions of the class npc communicate with the main process through them. In fact there are three extra processes in Perkun Wars. They are named after the heroes:
  • Dorban
  • Pregor
  • Thragos
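The underlying pipe pattern can be sketched as below (POSIX only). This is not Perkun Wars code, just the fork-and-pipe idiom it relies on: the child process (which would run the Perkun interpreter) writes a message that the parent reads back:

```cpp
#include <cassert>
#include <string>
#include <sys/wait.h>
#include <unistd.h>

// Spawn a child that writes msg into a pipe; the parent reads it back.
std::string read_from_child(const std::string & msg) {
    int fd[2];
    if (pipe(fd) != 0) return "";
    pid_t pid = fork();
    if (pid == 0) {                 // child: write the message and exit
        close(fd[0]);
        write(fd[1], msg.c_str(), msg.size());
        close(fd[1]);
        _exit(0);
    }
    close(fd[1]);                   // parent: read until the child closes the pipe
    char buf[256];
    std::string result;
    ssize_t n;
    while ((n = read(fd[0], buf, sizeof(buf))) > 0)
        result.append(buf, static_cast<std::size_t>(n));
    close(fd[0]);
    waitpid(pid, nullptr, 0);       // reap the child
    return result;
}
```

In Perkun Wars this pattern is bidirectional: the npc's overridden get_input and execute read from and write to pipes connected to the parent process.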

Wednesday, September 2, 2015

Perkun and XML.

Perkun can produce an XML file containing its specification. Write a Perkun specification (i.e. the four mandatory sections) and terminate it with the instruction:

cout << xml << eol;

When you execute the code with Perkun it will write an XML document to the standard output.

Now look at the file perkun.xslt. It is written in an XML-based language called XSLT. You may use it to convert the XML produced by Perkun into an HTML document. This can be done with any XSLT processor; here we use xsltproc:

xsltproc perkun.xslt document.xml > document.html

Tuesday, September 1, 2015

Perkun. Payoff generated in Prolog.

Let us begin with an easy example. In Perkun you have to provide the payoff section and the model section. The payoff in dorban_general.perkun looks as follows:

set({where_is_Dorban=>place_Wyzima, do_I_see_vampire=>false}, 0.0);
set({where_is_Dorban=>place_Wyzima, do_I_see_vampire=>true}, 100.0);
set({where_is_Dorban=>place_Shadizar, do_I_see_vampire=>false}, 0.0);
set({where_is_Dorban=>place_Shadizar, do_I_see_vampire=>true}, 100.0);
set({where_is_Dorban=>place_Novigrad, do_I_see_vampire=>false}, 0.0);
set({where_is_Dorban=>place_Novigrad, do_I_see_vampire=>true}, 100.0);

Do we notice any rule, any regularity here? Of course: no matter where we are, the payoff depends only on the variable do_I_see_vampire. In dorban_general_final.prolog we needed to add the fact:

% get_payoff(INPUT_where_is_Dorban,INPUT_do_I_see_vampire, PAYOFF).

get_payoff(_,true, 100.0). % Dorban is hunting the vampires

The comment above the fact was generated by Perkun. That is it! The payoff for do_I_see_vampire=false will be provided by the fallback rule (also generated by Perkun):

get_payoff(_,_, 0.0).

By the way, I have not mentioned yet that in order to use the Prolog generators it is better to avoid Perkun identifiers beginning with a capital letter. Lower-case letters are better. Needless to say, the underscore character ("_") is also a bad first character for an identifier.