Monday 23 December 2013

Screenshots of Android app viewing Raspberry Pi Light in 3D

It took only a couple more hours of coding in my NetMash (currently still called "Cyrus") app to turn the command-line success of yesterday into the screenshots of today.

Here's what it looks like once the app has discovered a broadcasting object, in this case our light:

[Screenshot: the app listing the discovered "Light" object]

The name comes from the Raspberry Pi object whose link was discovered over BLE as I described yesterday.
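
As a rough illustration of that discovery step, here's a minimal Java sketch using the Android 4.3 BLE scanning API. It assumes, purely for illustration, that the Raspberry Pi puts the link in the advertisement's manufacturer-specific data field (AD type 0xFF); the class and method names are mine, not the actual NetMash code.

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import java.io.UnsupportedEncodingException;

    public class LinkScanner {

        private final BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();

        // Android 4.3-era scan callback: called for each advertisement seen,
        // with the signal strength (RSSI) and the raw advertisement bytes.
        private final BluetoothAdapter.LeScanCallback callback =
            new BluetoothAdapter.LeScanCallback() {
                @Override
                public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
                    String link = extractLink(scanRecord);
                    if (link != null) onLinkDiscovered(link, rssi);
                }
            };

        public void start() { adapter.startLeScan(callback); }
        public void stop()  { adapter.stopLeScan(callback); }

        // Walk the advertisement's length/type/value structures and return the
        // manufacturer-specific data (AD type 0xFF) as text, if present.
        private static String extractLink(byte[] record) {
            int i = 0;
            while (i < record.length) {
                int len = record[i] & 0xFF;               // length of type + data
                if (len == 0 || i + len >= record.length) break;
                int type = record[i + 1] & 0xFF;
                if (type == 0xFF) {
                    try { return new String(record, i + 2, len - 1, "UTF-8"); }
                    catch (UnsupportedEncodingException e) { return null; }
                }
                i += len + 1;                             // next AD structure
            }
            return null;
        }

        protected void onLinkDiscovered(String link, int rssi) {
            // e.g. add the link, with its RSSI, to the discovery list
        }
    }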

Here's what the actual Cyrus object behind that screen looks like in the raw:

[Screenshot: the raw Cyrus object, showing the BLE link and RSSI]

I've put the link from BLE into a list, and dropped the RSSI in there as well for debugging purposes; really, though, it needs to be associated with each entry in the link list.
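
To fix that, each list entry wants to carry its own link and signal strength as a pair; something like this sketch (the names here are illustrative, not NetMash's):

    // One entry per discovered object: the advertised link plus the RSSI
    // at which it was last seen.
    public class LinkEntry {
        public final String link;  // URL advertised over BLE
        public final int rssi;     // signal strength for this link, in dBm

        public LinkEntry(String link, int rssi) {
            this.link = link;
            this.rssi = rssi;
        }

        @Override
        public String toString() { return link + " (RSSI " + rssi + " dBm)"; }
    }

The discovery list then becomes a list of LinkEntry objects rather than bare links, so each row on screen can show its own signal strength.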

When I jump the link to the "Light" object from the first screen above, I see the object that the Raspberry Pi is advertising:

[Screenshot: the advertised "Light" object]

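Under the hood, assuming the advertised link is an ordinary HTTP URL serving the object as text (an assumption on my part; the real NetMash fetch and Cyrus parsing are more involved), the jump amounts to something like this sketch:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ObjectFetcher {

        // GET the object behind a discovered link and return its raw text.
        public static String fetch(String link) throws IOException {
            HttpURLConnection conn =
                (HttpURLConnection) new URL(link).openConnection();
            conn.setRequestMethod("GET");
            BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"));
            try {
                StringBuilder body = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) body.append(line).append('\n');
                return body.toString();
            } finally {
                in.close();
                conn.disconnect();
            }
        }
    }
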
The screenshot shows a simple cuboid representation of my light. It's purple. I can change its colour by touching it; the app takes the point touched and uses its (x,y,z) coordinates to set the RGB:

[Screenshot: the cuboid light recoloured by touch]

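That mapping is simple in principle. Here's a sketch of the idea, not the actual NetMash code: normalise the touched point within the cuboid's bounding box and read the three coordinates off as red, green and blue.

    public class TouchColour {

        // Map a point touched inside the cuboid, whose extent on each axis
        // is [min,max], to an RGB triple with components in [0,1].
        public static float[] touchToRGB(float x, float y, float z,
                                         float min, float max) {
            float range = max - min;
            return new float[]{
                clamp01((x - min) / range),   // red   from x
                clamp01((y - min) / range),   // green from y
                clamp01((z - min) / range)    // blue  from z
            };
        }

        private static float clamp01(float v) {
            return v < 0 ? 0 : (v > 1 ? 1 : v);
        }
    }

With a mapping like this, touching one corner gives black, the opposite corner gives white, and everywhere in between gives a colour mixed from the three axes.
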
And again, here's the raw Cyrus form of that light. It's a bit more complicated than my example from the other day, partly because I'm using a 3D object to represent it; but then, that's the way programming is.

[Screenshot: the raw Cyrus form of the light object]

You should at least recognise the R-Pi's URL at the top by now, plus the properties "Rules", "light", "within" and "light-sensor" in there.

If you look closely, you'll see that I had to turn off Bluetooth to get this to work over WiFi: apparently the Nexus 4 doesn't let you have both on at the same time, which is a bit of a drawback.

So those are the first signs of an AR view over an IoT object!
