A few days have passed now, and I spent some time on video streaming. Well, more like "image streaming". As it turns out, Android is not streaming-friendly yet. With the release of 2.3 it already supports audio streaming to the device. Streaming from the device to a server, however, is a different story. The typical video containers that are supported are 3GP and MP4; no actual streaming container is implemented. The problem with both of these containers is that the header and metadata are written to the file only after the recording is done. So I could stream the raw data to my server, but without that additional information it's worth nothing. There are some workarounds, like implementing a custom video container of your own, but that's too much effort for my timeframe. Another workaround would be to save the video to a small file, transmit it to the server for processing, and repeat the whole thing in a loop; the latency, however, would be too high to be useful.
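To illustrate why the raw data is worthless without the header: an MP4 file is a sequence of "boxes" (4-byte length, 4-byte type, payload), and a recorder writes the `moov` metadata box only after the `mdat` media data, i.e. when recording stops. This is a minimal sketch of that idea; the synthetic file below is just an illustration with empty payloads, not real recorder output.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Sketch: why a stream of a still-recording MP4 is undecodable.
// An MP4 file is a sequence of boxes: [4-byte big-endian size][4-byte type][payload].
// The 'moov' box (decoding metadata) is appended only when recording stops, so a
// stream cut off mid-recording contains media data without the metadata.
public class Mp4BoxOrder {

    // Append one box with the given type and payload.
    static void writeBox(ByteArrayOutputStream out, String type, byte[] payload) throws IOException {
        out.write(ByteBuffer.allocate(4).putInt(8 + payload.length).array());
        out.write(type.getBytes(StandardCharsets.US_ASCII));
        out.write(payload);
    }

    // Return the top-level box types in file order.
    static List<String> boxTypes(byte[] file) {
        List<String> types = new ArrayList<>();
        ByteBuffer buf = ByteBuffer.wrap(file);
        while (buf.remaining() >= 8) {
            int size = buf.getInt();
            byte[] type = new byte[4];
            buf.get(type);
            types.add(new String(type, StandardCharsets.US_ASCII));
            buf.position(buf.position() + size - 8); // skip the payload
        }
        return types;
    }

    public static void main(String[] args) throws IOException {
        // Synthetic stand-in for a finished recording: metadata comes last.
        ByteArrayOutputStream file = new ByteArrayOutputStream();
        writeBox(file, "ftyp", new byte[8]);   // file type
        writeBox(file, "mdat", new byte[32]);  // media data, written while recording
        writeBox(file, "moov", new byte[16]);  // metadata, written at the end

        System.out.println("finished:  " + boxTypes(file.toByteArray()));
        // A stream interrupted before the end never receives 'moov' (24 bytes here):
        byte[] truncated = new byte[file.size() - 24];
        System.arraycopy(file.toByteArray(), 0, truncated, 0, truncated.length);
        System.out.println("truncated: " + boxTypes(truncated));
        // → finished:  [ftyp, mdat, moov]
        // → truncated: [ftyp, mdat]
    }
}
```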
These platform restrictions and workarounds are described in a diploma thesis I found while researching. There are app providers that promote video-conferencing apps, but since the platform doesn't support streaming natively, my guess is that they had to implement it as a workaround.
I went with the approach of transmitting single snapshot images from the camera, which I can stream over a socket connection. On the server side I extended my JSF application with components from ICEfaces and PrimeFaces: the DynaImage component from PrimeFaces can handle streamed content, and for updating the image periodically I used the ICEfaces Ajax Push mechanism.
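The wire format for the snapshot stream isn't spelled out above, so here is a minimal sketch of the kind of framing I mean: each JPEG snapshot goes out as a 4-byte length prefix followed by the compressed bytes, so the server knows where one image ends and the next begins. The method names are mine, and the placeholder byte arrays stand in for real camera frames; on the device the bytes would come from the camera's JPEG callback and the streams from a Socket.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

// Sketch of a length-prefixed snapshot protocol. On Android the frame bytes
// would come from the camera's JPEG callback and the streams from a Socket;
// plain byte-array streams keep the example self-contained here.
public class SnapshotFraming {

    // Send one frame: 4-byte big-endian length, then the JPEG bytes.
    static void writeFrame(OutputStream out, byte[] jpeg) throws IOException {
        DataOutputStream data = new DataOutputStream(out);
        data.writeInt(jpeg.length);
        data.write(jpeg);
        data.flush();
    }

    // Read one frame, or null when the stream has ended cleanly.
    static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        int length;
        try {
            length = data.readInt();
        } catch (EOFException end) {
            return null; // no more frames
        }
        byte[] jpeg = new byte[length];
        data.readFully(jpeg);
        return jpeg;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream wire = new ByteArrayOutputStream();
        // Two placeholder "snapshots" standing in for compressed camera frames.
        writeFrame(wire, "frame-1".getBytes(StandardCharsets.US_ASCII));
        writeFrame(wire, "frame-2".getBytes(StandardCharsets.US_ASCII));

        InputStream in = new ByteArrayInputStream(wire.toByteArray());
        for (byte[] frame; (frame = readFrame(in)) != null; ) {
            System.out.println(new String(frame, StandardCharsets.US_ASCII));
        }
        // → frame-1
        // → frame-2
    }
}
```

On the server, a loop over `readFrame` could hand each image to the streamed content that DynaImage renders, with Ajax Push triggering the periodic refresh.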
The server and client applications still need some tweaking to avoid crashes and to speed up the image updates, but here is a first impression of the result. Yeah, I know the tab is oversized for the robot, but it's my only Android device :).