Ugo.san

Human-computer Interaction, Software Development and things in between

G.a.m.b.i.t – sketching with multiple devices

I’m happy with the final outcome of my system, now that I’m approaching the final steps. G.a.m.b.i.t (Gatherings and Meetings with Beamers and Interactive Tablets) was built to let people prototype interfaces using any device. In the picture below I’m sketching on a large projected surface while the system runs on a Sony e-reader!

gambit sketching system


Gambit is described in this paper: GAMBIT: Addressing multi-platform collaborative sketching with HTML5

 

“Rox: Uma ferramenta para o auxílio no aprendizado de Teoria dos Grafos” (“Rox: a tool to aid the learning of Graph Theory”)

Rox Graph Theory

Download Rox v.0.74

My very first application is turning 10 this year. When I was in college I had an idea for an application to help students (including myself) with the assignments of a Graph Theory course, which were usually a nightmare. I open-sourced it from the beginning (advised by Terceiro and Mauricio, my colleagues at the time), which gave me a completely new perspective on what software development is. It also put me in contact, for the first time, with a user base of about 60 to 70 students each semester, which was quite a lot. In this tool, you had to compile classes containing the algorithms and load them in order to analyze graphs. The tool was used for some years at my university, and it also led me to write my first paper.
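To give an idea of that plugin model, here is a rough sketch in Java. The interface and class names are hypothetical, not Rox’s actual API; the point is that students wrote a class against a small graph interface, compiled it, and the tool loaded it to run over a graph:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of Rox's plugin model (names are illustrative, not the
// real API): an algorithm receives a graph as adjacency lists and returns a
// textual result the tool can display.
interface GraphAlgorithm {
    String run(Map<Integer, List<Integer>> adjacency);
}

// Example student algorithm: find the maximum vertex degree in the graph.
class MaxDegree implements GraphAlgorithm {
    @Override
    public String run(Map<Integer, List<Integer>> adjacency) {
        int max = 0;
        for (List<Integer> neighbors : adjacency.values()) {
            max = Math.max(max, neighbors.size());
        }
        return "max degree = " + max;
    }
}
```

The tool then only needed to load the compiled class at runtime and call `run`, so students could focus on the algorithm itself rather than on any UI code.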

After that I decided to improve the tool and make it more robust, enabling users to process many graphs at the same time and to debug the algorithms with step-by-step execution and visualization. So I made it my graduation project and completely rewrote the tool on the Eclipse platform, calling it RoxGT. The tool came out nicely, but it had a problem: nobody wanted to use it. Students still preferred the old version, and that troubled me.

Eventually I understood that the experience I had created with the second version was far more complicated: I did not put the students and their tasks first, like I did with the first version. The reason why this software failed to be adopted ultimately led me to pursue a master’s in Human-Computer Interaction; I wanted to understand what makes software usable, but that is another story.

You can download the first version here, but you will need a Java compiler (I’m not allowed to distribute one in the zip file). If you have any questions you can always tweet me.

 

User Interface Engineering map

Here is a map positioning UI Engineering (the main area of my PhD thesis) among other disciplines. I’ve defined UIE as a combination of Usability Engineering and Information Visualization, so my work addresses both Interface Design and Navigation. This illustration is based on the original in Dan Saffer’s Designing for Interaction.

User Interface Engineering


Visual Design is concerned with form: the spatial organization of information and layout within user interfaces. Information Architecture is concerned with the structure of content: how best to organize and label content so that users can find the information they need. Interaction Design is focused on behavior and context, as is Human-Computer Interaction; however, they differ in their methods of addressing the human aspect of computer systems, since HCI’s methods are more those of engineering and computer science than of design.

 

How do we perceive lag? Part 1

Humans can individually perceive from 10 to 12 frames per second. Early silent film projections had frame rates varying from 16 to 24 FPS, and nowadays the standard rate for movies is at least 24 FPS. But how is this perceived when we interact with whatever is being shown to us? This series of posts reports part of my research investigating lag perception while drawing.

What is lag?

The phenomenon behind human frame rate perception is called flicker fusion: the point at which a flickering image appears to be steady.

Flicker fusion at 15 frames per second

Within the research domain of human vision, there are works addressing a possible limit on the frame rate human beings are able to perceive: it is estimated that humans can individually perceive from 10 to 12 frames per second; above this limit the illusion of motion takes over, and we can no longer distinguish individual frames.
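To put these numbers in perspective, the frame period is simply 1000/FPS milliseconds, which a tiny helper makes concrete:

```java
// Frame period in milliseconds for a given frame rate: at the 10-12 FPS
// perception threshold each frame stays on screen for 83-100 ms, while at
// the 24 FPS movie standard a frame lasts only about 42 ms.
class FramePeriod {
    static double millis(double fps) {
        return 1000.0 / fps;
    }
}
```

So at 12 FPS a frame lasts about 83 ms, and at 60 FPS only about 17 ms, a gap that turns out to matter a lot while drawing.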

However, this limit applies when no interaction is required. So I started to investigate whether it also holds for the specific case of interacting with devices, more specifically drawing on electronic devices. The paper is called “Assessing lag perception in electronic sketching” and it’s available here and here.

15 vs. 30 vs. 60 FPS

Can you tell the difference between the three rates below? Bo Allen has done a simple yet very interesting comparison between frame rates:



(source: http://boallen.com/fps-compare.html)

* update: There is another great (interactive) tool for comparing refresh rates: http://frames-per-second.appspot.com/

Interaction and Lag

One of my research interests is sketching (human drawing) on electronic devices. In fact, electronic sketching was one of the first input modalities to be considered in computing; take a look at these YouTube videos where Ivan Sutherland demonstrates his electronic sketching system… in 1963! I’m also maintaining a playlist on the history of electronic sketching, with some interesting videos, so check it out.

Before touchscreen devices became mainstream, the only viable option people had for sketching was to use Wacom tablets. Now we have many different devices, with different performance and screens of many sizes. We also have HTML5, which allows us to build a single system that runs on all of those devices through a browser.

So I implemented a sketching system and tried different upper limits for the rendering rate, as you can see in the video below. Note how the sketch lags behind the pen:
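The actual system does this in HTML5/JavaScript, but the idea is language-agnostic: capping the rendering rate just means skipping redraws until a minimum interval has elapsed. A minimal sketch in Java (class and method names are mine, not the system’s):

```java
// Caps rendering at a maximum frame rate by skipping redraws until the
// minimum frame interval has elapsed. Times are in milliseconds.
class FrameLimiter {
    private final double minIntervalMs;
    private double lastRenderMs = Double.NEGATIVE_INFINITY;

    FrameLimiter(double maxFps) {
        this.minIntervalMs = 1000.0 / maxFps;
    }

    // Returns true when a new frame should be drawn at time nowMs.
    boolean shouldRender(double nowMs) {
        if (nowMs - lastRenderMs >= minIntervalMs) {
            lastRenderMs = nowMs;
            return true;
        }
        return false;
    }
}
```

Strokes drawn between rendered frames are still buffered; they simply appear later, which is exactly the lag you can see in the video.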

How would people perceive the different rates while sketching, without (of course) knowing them in advance? I set up an experiment and tested it with 35 subjects; that’s on Part 2 of this series.

The experiment

That will be on Part 2, soon. =)

 

Mammooth Fuzz

This pedal is based on the Z.Vex Woolly Mammoth:

 

Photos:

(Re)making a Conference Poster with Inkscape

Last month I needed to prepare a poster describing my research (Addressing Multi-platform Collaborative Sketching) for the DIS2012 and EICS2012 conferences, and I wanted to do it with Inkscape. Then I came across this amazing design by Felix Breuer.

But I needed it in a vertical format. So in case you are in the same situation, here it goes:

Download the ZIP file
GitHub repository

You can also see the original (horizontal) version here:

FPSAC poster rasterized

* Update * There is a new “vampire red” version by Mathieu Zen:

POSTER vampire red template

Lot of space with Fullscreen plugin

I’ve just forked an old, abandoned plugin project created to make Eclipse go fullscreen. Originally, the plugin maximized the usable space by hiding everything but the editors and the navigator, which meant hiding even the status bar, making it really hard for developers to use.

The single available option was to show/hide the Menu Bar, which is useful for RCP apps. But for IDE usage the status line MUST show up (it’s impossible to debug a stack trace without knowing which line you are at, for instance). My buddy Robson is an Arch Linux packager, and they were facing this issue with Eclipse, so he asked me to fix it. :)

Get the plugin

You can get the .jar here and drop it into your plugins/ folder. If you are an Eclipse plugin developer, you might also want to check it out on my GitHub.

Toggle Fullscreen

Use Ctrl+Alt+Z to toggle fullscreen, or go to Window -> Fullscreen.

Configure

You can choose to hide or show Menu Bar and Status Bar in Window -> Preferences -> General -> Full Screen

Enjoy! :)

Android Lightsword *updated*

Lightsword just reached 100,000 installs, which is awesome. The new version has 5 hilt options, gesture support, CLASH sounds and color options. Want a pink saber? OK!

Use the force with Lightsword, an app that simulates a lightsaber on Android. I’ve spent some time fine-tuning the accelerometer handling to provide a smooth interaction: you just have to activate the sword and swing your phone around. Works great at bars :) Trust me on this one. Now go try it.
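The kind of smoothing I mean is essentially a low-pass filter over the raw accelerometer samples: each new reading is blended with the previous output so jitter is damped while real swings come through. A minimal sketch in plain Java, with an illustrative smoothing factor (not the value the app actually ships with):

```java
// Exponential low-pass filter for 3-axis accelerometer samples. alpha is the
// smoothing factor in (0, 1]: lower values give smoother but laggier output.
class LowPassFilter {
    private final float alpha;
    private float[] output;

    LowPassFilter(float alpha) {
        this.alpha = alpha;
    }

    // Feed one raw sample (e.g. {x, y, z}); returns the smoothed sample.
    float[] filter(float[] input) {
        if (output == null) {
            output = input.clone(); // first sample passes through unchanged
        } else {
            for (int i = 0; i < input.length; i++) {
                output[i] = output[i] + alpha * (input[i] - output[i]);
            }
        }
        return output.clone();
    }
}
```

On Android you would feed this from a `SensorEventListener` and use the filtered magnitude to decide when to trigger a swing sound.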

Android Theremin is out!

Theremin is finally released on the Android Market. It’s in alpha stage, but I will be updating it quite a lot. For now it can be used to produce sounds based on the magnetic field around your phone. You can manipulate it using a metal object such as a coin or your earphones. It can serve as a metal detector as well! :)

There is a video showing Theremin in action.

I would really appreciate some feedback and feature requests. I hope you enjoy it.

Android Theremin prototype *updated*

"I'm so much cooler than you guys playing violins..."

Theremins are weird musical instruments. They consist basically of two antennas; the player moves their hands around them, altering the electromagnetic field. The device then registers this change, playing creepy sounds. Take a look at this Pato Fu song and a lesson on how to operate a theremin.

So I bought an HTC Tattoo and, of course, decided to try developing apps for it. The whole SDK is amazing!

The concept of a theremin can be implemented on Android, since the phones have a digital compass that registers the magnetic field in 3 axes (!). Here is the first prototype, and my very first Android app. Yay! \o/

It’s possible to change the magnetic field registered by the phone using a coin. The closer the coin gets, the lower the note. Real theremins actually work the opposite way :)
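The mapping boils down to taking the magnitude of the 3-axis field reading and converting it to a pitch. A sketch of the idea in plain Java; the constants here are purely illustrative, not the ones the app uses:

```java
// Maps the 3-axis magnetic field reading to a pitch: a stronger field (coin
// closer) gives a lower frequency, mirroring the prototype's behavior (real
// theremins do the opposite). Constants are illustrative only.
class FieldToPitch {
    // Magnitude of the field vector reported by the sensor.
    static double magnitude(float x, float y, float z) {
        return Math.sqrt(x * x + y * y + z * z);
    }

    // Linear mapping from field strength (microtesla) down to a frequency in Hz,
    // clamped so the pitch never drops below an audible floor.
    static double frequencyHz(double magnitudeUt) {
        double base = 880.0;          // pitch when the field is at rest
        double hzPerMicrotesla = 4.0; // how fast pitch drops as the coin approaches
        return Math.max(110.0, base - hzPerMicrotesla * magnitudeUt);
    }
}
```

On Android the x, y, z values would come from the magnetic field sensor via `SensorManager`; everything after that is ordinary arithmetic.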

*Update!*

Click here for the first version

For now, the notes are pre-rendered .ogg files of C scales =/ Android doesn’t have a MIDI toolkit, right? I want to be able to produce my own frequencies, so the user could manipulate them like on a real theremin.
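Synthesizing an arbitrary frequency by hand is actually feasible: generate raw 16-bit PCM samples of a sine wave and stream them to the audio output (on Android, `android.media.AudioTrack` accepts raw 16-bit PCM buffers). A minimal sketch of the generation step:

```java
// Generates one buffer of 16-bit PCM samples for a sine wave at the given
// frequency. The resulting buffer could be streamed to the audio output,
// e.g. Android's AudioTrack, to play a continuously adjustable tone.
class ToneGenerator {
    static short[] sineWave(double frequencyHz, int sampleRate, int numSamples) {
        short[] samples = new short[numSamples];
        for (int i = 0; i < numSamples; i++) {
            double t = (double) i / sampleRate; // time of sample i in seconds
            samples[i] = (short) (Short.MAX_VALUE * Math.sin(2 * Math.PI * frequencyHz * t));
        }
        return samples;
    }
}
```

Regenerating small buffers as the sensor reading changes would let the pitch glide continuously, like on a real theremin, instead of jumping between pre-rendered notes.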