Monday, December 16, 2013

Every bus line in Istanbul

There's a small project I have been working on in my free time. Here is a first glimpse.

Sunday, August 25, 2013

Android Canvas: Clipping Images with Multiple Paths

OK, the title might not be clear at first. Since one YouTube video is worth a thousand words, here it is:

As you can see in the video above, we can manipulate the closed path, which is composed of multiple paths, and clip the image underneath with the shape of those paths. There is an off-topic issue I would like to mention at this point. You may notice that the canvas drawing seems a little laggy in the video. That's not because I recorded it on an emulator (since I use Intel virtualization, the emulator is pretty fast); rather, canvas.clipPath does not support Android hardware acceleration, so the emulator is slow without it. On an actual phone, however, this runs quite smoothly. You can see the list of functions unsupported by hardware acceleration here: So for this project, android:hardwareAccelerated must be set to false.

OK, let's get back to the topic. I implemented this for a freelance job. The requirement was to create a Photoshop-like tool where you can put images on top of each other and clip the top image to stitch the images together (imagine you are trying to cut your friend's head out of one picture and paste it into another; you need to adjust the edges of the head to make the final image look pretty). Since the clipping did not need to be too precise, a couple of curves was enough, and since Android already has most of the necessary pieces implemented, I just needed to combine them.

The closed path is actually composed of several quadratic Bézier curves. A Bézier curve is a parametric curve: you specify a start point, an end point, and some control points, and the curve is generated by interpolating between the start and end points using the control points.
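As a quick refresher (this is the standard formula, not anything Android-specific), a quadratic Bézier curve with start point P0, control point P1, and end point P2 is defined by:

```latex
B(t) = (1 - t)^2 P_0 + 2(1 - t)\,t\, P_1 + t^2 P_2, \qquad t \in [0, 1]
```

This is exactly the curve that Android's Path.quadTo draws between the current point and the given end point.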

I started by creating points to control my curves. We will call this class BezierPoint.

The important features of this class are that it holds the coordinates representing the point, plus a rectangular border in which you can put your finger to move the point. It also has a function that checks whether the point is selected by comparing it against the current touch position. It is really simple: it just checks if your finger is inside the rectangular box:

The border is specified by two constants, namely RADIUS and SENSITIVITY. The radius is also used to draw the point itself on the canvas. The reason there are two variables is that you might want the size of the touch border to be different from the size of the point itself. These are used as follows:
Note that x and y are updated whenever the point is moved, so the border is updated as well. The newX and newY variables are the new touch coordinates of your finger.
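Since the original snippets are not shown in this copy of the post, here is a minimal sketch of what a BezierPoint like the one described above might look like. The class shape, method names, and constant values are my assumptions, and I use plain floats instead of Android's RectF so the example stands on its own:

```java
public class BezierPoint {
    // RADIUS is used when drawing the point; SENSITIVITY defines the
    // larger touchable border around it, so a finger can grab it easily.
    public static final float RADIUS = 10f;
    public static final float SENSITIVITY = 25f;

    private float x;
    private float y;

    public BezierPoint(float x, float y) {
        this.x = x;
        this.y = y;
    }

    // True if the touch position falls inside the rectangular border
    // centered on the point.
    public boolean isSelected(float touchX, float touchY) {
        return touchX >= x - SENSITIVITY && touchX <= x + SENSITIVITY
            && touchY >= y - SENSITIVITY && touchY <= y + SENSITIVITY;
    }

    // Moving the point just updates x and y; the border follows along
    // because isSelected() recomputes the bounds from x and y.
    public void moveTo(float newX, float newY) {
        this.x = newX;
        this.y = newY;
    }

    public float getX() { return x; }
    public float getY() { return y; }
}
```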

After creating the class to hold the points, we need the curve class, which we will call BezierCurve. This class is responsible for holding the points and the Paint objects used while drawing. The points inside this class are used when doing the actual clipping. This class also has a method to check whether any of its points is selected:
So you can think of the selection check as propagating from curves to points. Selection works as follows: when the user touches the screen, ACTION_DOWN catches the touch point and checks all the curves on the screen until it finds a match. Then, in ACTION_MOVE, the selected points are moved. If there is no selected point, the image is moved instead:
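Sketching that dispatch logic in plain Java, separated from the Android View so it can be followed (and tested) on its own. The class and method names here are my assumptions; in the real app this logic would live inside onTouchEvent, with onDown and onMove corresponding to MotionEvent.ACTION_DOWN and ACTION_MOVE:

```java
import java.util.List;

public class TouchController {
    public interface Selectable {
        boolean trySelect(float x, float y);  // propagates curve -> points
        void moveSelected(float x, float y);
    }

    private final List<Selectable> curves;
    private Selectable selected;
    private float imageX, imageY;

    public TouchController(List<Selectable> curves) {
        this.curves = curves;
    }

    // ACTION_DOWN: check every curve until one reports a grabbed point.
    public void onDown(float x, float y) {
        selected = null;
        for (Selectable curve : curves) {
            if (curve.trySelect(x, y)) {
                selected = curve;
                break;
            }
        }
    }

    // ACTION_MOVE: drag the grabbed point, or move the image if nothing
    // is grabbed. (A real app would translate the image by touch deltas;
    // storing the raw coordinates keeps the sketch simple.)
    public void onMove(float x, float y) {
        if (selected != null) {
            selected.moveSelected(x, y);
        } else {
            imageX = x;
            imageY = y;
        }
    }

    public float getImageX() { return imageX; }
    public float getImageY() { return imageY; }
}
```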
Now lets explain the real clipping.

The trick here is to call moveTo only once, at the beginning of the path. If you call moveTo for every curve, you get multiple separate clips again, and the center will not be visible, so that is not the clipping we are looking for. We iterate over every curve, moveTo the beginning once, and quadTo the remaining curves. Then we draw the bitmap and the other stuff as usual.
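A sketch of that clipping step, as it might look inside the custom View's onDraw. The method and field names are my assumptions, and each BezierCurve is assumed to expose its start, control, and end BezierPoints via getters (my naming); Path and Canvas are the standard android.graphics classes:

```java
// Called from onDraw(Canvas canvas) in the custom View.
private void drawClipped(Canvas canvas) {
    Path clip = new Path();

    boolean first = true;
    for (BezierCurve curve : curves) {
        if (first) {
            // moveTo only once, at the very beginning of the closed path.
            clip.moveTo(curve.getStart().getX(), curve.getStart().getY());
            first = false;
        }
        // Every curve contributes one quadratic Bezier segment.
        clip.quadTo(curve.getControl().getX(), curve.getControl().getY(),
                    curve.getEnd().getX(), curve.getEnd().getY());
    }
    clip.close();

    canvas.save();
    // Remember: clipPath requires android:hardwareAccelerated="false".
    canvas.clipPath(clip);
    canvas.drawBitmap(bitmap, imageX, imageY, null);
    canvas.restore();

    // Then draw the control points and curve outlines on top, as usual.
}
```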

You can find the complete project on GitHub:

Saturday, July 27, 2013

Configuring VES for Eclipse and ndk-build

This post was chosen for the Kitware guest blog:
Recently, I needed a visualization framework for my project. Since one of its requirements is to render scientific datasets, and since this is not the main requirement (there are more important features to be implemented), I needed a quick solution. After some searching, and with the suggestions of my advisor at Aviz, I decided to use VES, the VTK OpenGL ES Rendering Toolkit. VES integrates with the Visualization Toolkit (VTK) to deliver scientific and medical visualization capabilities to mobile application developers. However, VES is not the only layer on top of VTK; there is one more layer called Kiwi. The architecture is basically as follows:
As you can see, the architecture has two different branches on top of Kiwi, to be used for iOS or Android. I will talk about the Android part here. Before getting to the Eclipse and ndk-build configuration, I highly recommend you watch this webinar about VES.

The biggest problem I had during the build process was creating an Eclipse project. The project itself uses CMake and is configured to create makefiles which can be used via NMake. After creating these makefiles and building VES, some other scripts need to be run to create an Android project. You can import this Android project directly into Eclipse, but you cannot build the native part from it. You have to run a separate compile script to build the native part and create the .so library file. Then you can run the project from Eclipse again.

Consequently, I started trying to create a one-button build configuration. At first I played with the CMake configuration (for about two weeks), but I couldn't manage to create a decent Eclipse project that would build (maybe because I was not very experienced with CMake, I don't know). Then I turned to ndk-build, the build script of Android's Native Development Kit. For ndk-build to run, you need an Android.mk and an Application.mk file. These files are a little different from a normal makefile. After two days of struggle, I had the project building with the help of my colleague. In the end, our project was not building the whole VES source code; it was linking the libraries created by the VES build against KiwiNative.cpp.

Returning to the project itself, the first step to start building apps with VES is of course compiling and building it. The best way to start this process is either the video I linked above or the developer guide on their website (the developer guide for Linux and Mac can be found here). I will briefly go over the project structure and some scripts, and then explain my approach to making the project more portable with Eclipse and ndk-build.

To build, a Visual Studio command prompt is necessary in order to use the nmake command. VS Express 2012 is enough for this, or any other VS distribution that includes the nmake tool.

The project structure

Following the instructions on the website or in the video, you will see that some scripts should be run. I will give a brief explanation of these scripts and the folder structure of the project.
|   +---Android
|   |   +---CMakeBuild
|   |   |       configure.bat
|   |   |   \---build
|   |   |       +---CMakeExternals
|   |   |       |   +---Build
|   |   |       |   |   +---eigen
|   |   |       |   |   +---ves-android
|   |   |       |   |   +---vtk-host
|   |   |       |   +---Download
|   |   |       |   |   +---eigen
|   |   |       |   |   +---ves-android
|   |   |       |   |   +---vtk-android
|   |   |       |   |   +---vtk-host
|   |   |       |   +---Install
|   |   |       |   |   +---eigen
|   |   |       |   |   +---ves-android
|   |   |       |   |   +---vtk-android
|   |   |       |   |   +---vtk-host
|   |   +---Kiwi
|   |   |      configure.bat
|   |   |      tools.bat
|   |   |      compile.bat
|   |   |      run.bat
After cloning the repository from git:// you will get the above structure, more or less. I omitted some items to focus on the folders I want to discuss. As far as I remember, the CMakeExternals folder is created when you run the configure.bat script inside Android\CMakeBuild. So, following the developer guide, you run configure.bat. Let's see briefly what it does:

@echo off

rem Set the NDK path here
set ANDROID_NDK=c:/tools/android-ndk-r8

rem set ANDROID_TOOLCHAIN_NAME=arm-linux-androideabi-4.4.3

set BUILD_TYPE=Release

rem set CMAKE_HOST_WIN32=1
set CL=/MP

set build_dir=%CD%\build
set source_dir=%CD%\..\..\..

echo Android NDK directory: %ANDROID_NDK%
echo Build type: %BUILD_TYPE%

cmake.exe -E make_directory "%build_dir%" 
cd "%build_dir%"
cd ..

echo Configuration done.
echo To build VES, go to the build directory and type nmake.
echo Don't forget to add tools and platforms directory of Android SDK to PATH env var.

Let's first look at the CMAKE_DEFAULT_GENERATOR and BUILD_TYPE part. This is where you set the build type and the CMake generator. I played with the CMake generator for two weeks trying to create a portable Eclipse project: I tried "Eclipse CDT4 - NMake Makefiles" on Windows and "Eclipse CDT4 - Unix Makefiles" on Ubuntu. In the end, however, I gave up, built VES according to the developer guide, and linked the static libraries via ndk-build to make the project portable. More on this later... You can also change the build type so the created libraries can later be debugged with a tool like gdb.

Toward the end of the file, the calls to cmake.exe are the real commands doing the job of creating the makefiles and related files. This is basically running CMake to create the appropriate folders and makefiles to be used later while compiling the code. The superbuild variable is explained in more detail in the VES developer guide. As you can see, the build folder is actually inside the project structure. I guess this has a special purpose, since the webinar I shared in this article suggests as much: in the webinar, you can see that the presenter actually creates a link to an outside folder but keeps using the original build folder inside the project structure. I guess this has to do with some relative path issues. This was another problem that stopped me from creating an Eclipse project: since Eclipse gets confused by in-source builds, it was impossible to directly import the created project from here, and an out-of-source build was not working as I expected. Search for "out of source build" for more detail on this subject.

After running configure.bat, you go to the newly created build folder and run the nmake command. This builds the VES project and creates .a files, which are normally static library files, though in the webinar the presenter says these are actually archive files. Anyway, I will call them static library files (so they are like .lib files on Windows, and dynamic libraries (.so) are like .dll files on Windows). These libraries are inside folders named lib located under CMakeExternals\Install, separated into their corresponding folders, for example CMakeExternals\Install\vtk-android\lib. After this process, you would normally go to the Apps\Android\Kiwi folder and run the scripts there to create the Android project and link these libraries to your project together with KiwiNative.cpp.
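In short, the stock flow looks roughly like this, run from a Visual Studio command prompt so nmake is available (the script names come from the folder tree above; the exact ordering of the Kiwi scripts is my recollection, so check the developer guide):

```
rem Configure and build VES/VTK (creates the .a static libraries)
cd Apps\Android\CMakeBuild
configure.bat
cd build
nmake

rem Then generate the Android project and build the native part
cd ..\..\Kiwi
configure.bat
compile.bat
```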

My setup follows a different path at this point, since my colleague and I want to collaborate on this project without putting all the VES source files into SVN. In addition, we want to be free of the project's relative path issues.

Examining the makefile inside the Kiwi application folder, we found out that the real makefile it uses is called build.make, located inside \Kiwi\jni\CMakeFiles\KiwiNative.dir. Examining this file, you can see that it creates a separate .o from KiwiNative.cpp (which is our main JNI file) and links all the previously created libraries to it to produce the final shared library. So, trying to follow the conventions inside this file, we set out to create a stand-alone Android application project with ndk-build. Along the way, we encountered really random-looking errors related to OpenGL, function definitions, and even the order of the linked libraries. Finally, we came up with an Android.mk and an Application.mk file. I will add the original files at the end of this article. However, there are two things that I want to emphasize.

The first one: make sure you know whether your project uses the STL or not (this is a general suggestion, but in our case, VES, yes, it uses the STL). If it does, make sure you declare that in your Application.mk:
APP_STL := gnustl_shared
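For reference, a minimal Application.mk consistent with the settings mentioned in this post might look like the sketch below. The ABI and platform values are my assumptions; adjust them to your targets:

```
# Application.mk (sketch)
APP_STL      := gnustl_shared   # VES/Kiwi use the STL
APP_ABI      := armeabi-v7a     # assumption: typical ARM target
APP_PLATFORM := android-9       # assumption: match your minSdkVersion
```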

The second thing: pay attention to the order of the linked libraries. Solving this problem took some time, since the order looked arbitrary to us, but then we used the order from the original build.make file and everything was resolved.

Besides these two points, the rest was straightforward. We carried all the include folders from Android\CMakeBuild\build\CMakeExternals\Install into the jni folder of our project, and created a LOCAL_MODULE for each of the .a library files. For example:
include $(CLEAR_VARS)
LOCAL_MODULE := libvesShaders
LOCAL_SRC_FILES := ves-android/lib/libvesShaders.a
include $(PREBUILT_STATIC_LIBRARY)
Then we included these local modules in our last module, which is built with BUILD_SHARED_LIBRARY. This module is as follows:
include $(CLEAR_VARS)
LOCAL_STATIC_LIBRARIES :=  libkiwi libvesShaders libves libvtkIOXML libvtkIOLegacy libvtkIOPLY libvtkIOGeometry libvtkFiltersModeling libvtkImagingCore libvtkRenderingFreeType libvtkRenderingCore libvtkIOImage libvtkDICOMParser libvtkmetaio libvtkpng libvtktiff libvtkjpeg libvtkFiltersSources libvtkFiltersGeometry libvtkIOXMLParser libvtkIOCore libvtkexpat libvtkFiltersExtraction libvtkFiltersGeneral libvtkFiltersCore libvtkCommonExecutionModel libvtkCommonComputationalGeometry libvtkCommonDataModel libvtkCommonMisc libvtkCommonTransforms libvtkCommonSystem libvtkCommonMath libvtkCommonCore libvtksys libvtkfreetype libvtkzlib
LOCAL_C_INCLUDES := $(LOCAL_PATH)/vtk-android/include/ $(LOCAL_PATH)/ves-android/include/ $(LOCAL_PATH)/ves-android/include/ves/kiwi $(LOCAL_PATH)/ves-android/include/ves/ves $(LOCAL_PATH)/ves-android/include/ves/shaders
LOCAL_MODULE    := KiwiNative
LOCAL_SRC_FILES := KiwiNative.cpp
LOCAL_LDLIBS := -llog -lGLESv2 
LOCAL_CFLAGS := -Wno-write-strings -Wno-psabi $(OPENGLES_DEF)
include $(BUILD_SHARED_LIBRARY)
See the LOCAL_LDLIBS and LOCAL_CFLAGS lines. Those lines are related to OpenGL (obviously) and were derived from the build.make file that we found.

With all these configurations, your project is ready to be built by Android's ndk-build. Have fun. I may have missed some things or made mistakes, since I wrote this article some time after the fact; please let me know if you spot any.

The whole Android.mk and Application.mk files:

Thursday, May 2, 2013

Exploring Flu Trends with Google

As I get closer to my thesis, I keep discovering really interesting visualization and data analytics material. During one of these "discovery sessions" (no, of course I wasn't randomly surfing reddit) I stumbled upon Google Flu Trends. I played with it a little and thought it was fascinating. What they do is basically use flu-related search term analytics to predict flu trends. After a while, I left it aside since there wasn't any Turkey data, as usual... and I wasn't curious about flu trends in Peru.

Currently, I am following the Data Science course on Coursera, and during the first lecture I stumbled upon Google Flu Trends again and learnt some interesting things about it. They overestimated this year's outbreak. The reason is that there was a lot of media attention on this year's flu outbreak, which amplified the number of flu-related searches.

Consequently, repurposing data is useful, but one should be aware of its biases.

Wednesday, March 13, 2013

First thoughts about CS 450: Arts and Computing

This semester, I am taking a unique course called Arts and Computing. The course is taught by 3 professors: Elif Ayiter, Murat Germen and Selim Balcisoy. However, since Murat Germen is on sabbatical leave, he will not attend the classes.

For this course, we are required to create 2 blogs: one for the group project and one personal blog. I had been postponing starting my own blog for 2 years; this may be a kick-starter for me.

This post will mainly be about first impressions and clouds of ideas about the course. Here we go:

After the first 15 minutes of the course I associated it with data and information visualization. That's probably because there is a lot of previous work on information visualization. We also saw some artsy stuff, but maybe because I am an engineer, the infovis part was more attractive to me. The second thing that got me thinking was that no recent projects were shown during the introduction. I am really curious why this is the case. Is it because the classes of previous years were smarter, or is it because as technology advances it becomes easier to implement complex stuff, so whenever you think of something, somebody else has done it before? I would love to see a c-c-c-c-c-c-combo breaker for this trend this year.

I am truly not sure what to do for the course project. As the professors suggested, I am trying to think without a context, about pure data and its transformation, but it's hard; I always end up stuck on technical details. For example, if I am going to make a data visualization framework, I immediately think about the platform, specifications, and requirements. Should it be mobile or desktop? Should it be in Java? And so on. However, I guess I would love to deal with "smaller amounts of data" or "rare data". What do I mean by this? For example, it's really easy to gather someone's social network data, location data, or personal data if that person uses a computer or a smartphone. But what about people who do not really use computers or mobile phones (at least smartphones)? How can I extract data from those people and use it? I guess that is not the essence of this course, but I really would like to learn the answer to this question, and if I could find such data, I would love to use it for the project of this class.

Finally, I believe I might gather some interesting ideas to use next year for my thesis. Since I will be doing my thesis on information visualization, I believe I can greatly benefit from this course.

Saturday, October 13, 2012

Bad Design Examples 2: Captchas

Note for readers: Akbank and Turkcell changed their designs =)

Captchas have been all over the web for a long time. I know how annoying it is to get a captcha wrong 3 times in a row and have to hold myself back from *punching* the monitor. The situation is worse on some websites.

I want to give 2 examples of bad captcha usage from 2 well-known websites in Turkey.

Akbank Internet Banking

Turkcell Online Services

As you can see in the screenshots above, you always have to enter a captcha (güvenlik kodu) to log in to the internet banking and the online service. I understand that they want to prevent brute-force attacks, but this is NOT good design practice.

Good practice would be to set a trial limit and display the captcha only when the user exceeds it. Google does this very well:

Thursday, March 1, 2012

Bad Design Examples 1: Grooveshark Radio

So I guess it has been more than a year since Grooveshark changed its design and there's something really annoying about it:

There is this radio feature on Grooveshark which lets you listen to radios (not literally radios, they are just genre-based playlists) according to the genre you want.

If you click on the radio button while there's nothing on your list, it pops up the Start Radio Station dialog and lets you choose the genre you want.

However, if you have some songs in your playlist and click on the radio button, it queues more songs according to your list and won't let you choose a radio station again. So all you get is this:

It looks like a good feature at first: compacting a feature into one button saves space and increases simplicity. But it's actually annoying when you want to cancel your playlist and listen to a radio station.

Instead of starting the radio with 1 click, as you do when your list is empty, you have to do it in 3 clicks: open your current song list, clear all the songs, and start the radio. That's because there is no short way to empty your list (or I am missing something).

In conclusion, it may seem useful to assign two actions to one button, but it should be well tested for user experience beforehand.