As usual, when I should be studying for my exams, I find something far more interesting to do. This one, however, is an older idea that I really want to finish this summer, so I just made this basic example to remind myself to do it in the coming months.

In a nutshell, this project is going to be a music visualization tool for VJ events. I don’t have much experience with VJing yet, but I’ll try to learn it while developing the tool. It will be modular, and I’ll try to make it possible to connect it with existing solutions to produce more interesting effects.
Currently it only contains a couple of actions with simple animations, but I’ll make it react to sound as soon as possible; after all, that’s the whole point of the thing. If anybody is interested, the whole project is on GitHub, and once it’s usable, I’ll post standalone binaries on my site too.


These screenshots look even better than the current ones on GitHub, and it’s only one version ahead.

A lot has happened since the last status report. Centroid and Handstract have been updated, and for the first time I can see that these apps might be profitable; at least the downloads have doubled, thanks to you guys. The App Store screenshots and descriptions have changed too, which should make the app pages more appealing to first-time visitors.

The smaller UI change in the new Handstract version

However, I’m a little stuck with Filterion. It will be upgraded, and I already know the new features, but I’m not sure about the whole editing process. Currently you can switch and reorder filters and effects, which does the job, but it isn’t very ergonomic or innovative. This time I’m going to build a couple of prototypes, and there will be a bigger UX testing phase.

Planning the new Filterion features

Also, there are a couple of other projects in the making, for example a virtual-reality-game-like project in Unity which will be released soon, but first I’ve got

As part of a university project, we had to create something in small groups while using project tools like Git and unit testing. We chose to make a small game in C++ with the help of the SFML library, and it was a lot of fun.

The game turned out pretty well. Basically it’s an Asteroids clone where you have to type words to shoot down the targets, almost like in this game:

Our version was developed to run natively on Windows and Unix machines. It’s not finished yet, and it’s kinda rough, so I won’t post a binary (maybe in the future), but the PhobosLab version is better anyway.


I’d never used SFML before, so at first I thought it would work like Cocos2d or Cube, but it’s more lightweight, which is a big advantage. It was made primarily for 2D games, but since it isn’t a big framework, it can be used in more general ways (data visualization, user interfaces, or other visual stuff). There are actually GUI extensions based on SFML, like

Now that the Apple Watch and the new MacBooks are out, it’s time to talk about this cool new feature called Force Touch. It enables devices to recognize the strength of a touch, and paired with the Taptic Engine it can emulate physical button clicks. This could be the greatest improvement in touchscreen technology since the multitouch gestures that made Apple devices so responsive.

Let’s look at the obvious problems first: currently most mobile games use at most one or two controls simultaneously. Sure, there are a lot of great games using only one finger, like Badland or Shoot the Moon, and twin-stick control games are also popular, like the classic Minigore or the brand new Spartan Strike on iOS.

But if we want to play console-like games, we’ll most likely need a physical controller, which can cost a lot, or we’ll have to cope with a lot of onscreen buttons, which are hard and slow to reach in a fast action game.

The brief state of mobile action games (only the main controls):


Currently a top-down shooter is