Tuesday 31 December 2013

Alternative Approaches to the Object Network IoT

I recently mentioned a number of initiatives playing in the same space as my Object Network approach to unifying and animating the Internet of Things.

I've been reading up a bit on The Thing System, and have formed some early thoughts about the commonalities and differences between their approach and my own.

My general goals and philosophy are pretty much aligned with theirs: adapting to many different Thing technologies, an absolute minimum of configuration, open data and protocols, preferring automation over remote control, allowing people to write simple rules, and so on.

The main difference is that my approach uses REST: URLs to the state of Things, encoded in stable formats, with links between them, plus a programming model (and programming language) built on observing linked state and making state transitions from those observations.

These elements of the Object Net give a simplicity and mashability that the more imperative event- and agent-driven approaches lack. The documentation of the Thing System does reveal its complexity pretty quickly.

Looking around, here is another good list of IoT projects like the one I linked to before. Neither mentions the OpenHAB project, which seems at least as active as the Thing System. It is, however, based on an Event Bus and OSGi, both of which weigh against its chances of success in the real world. Maybe it'll attract Enterprise support, though.

In summary, I've yet to see another approach like the Object Network to unifying the IoT that is based on the original Web - with links between state. Most (all?) other approaches seem to be event-based.

Monday 30 December 2013

Collecting the bits of the beacon URL: getting the host's IP number, etc.

In order to broadcast the URL of the Thing object, NetMash needs to discover the IP of the Pi host to put into the URL.

My Raspberry Pi has only one network interface with only one IPv4 address. In general, neither of those 'one's holds, so in Java you have to run a nested enumeration to find the IP. Here's the algorithm I settled on today after some traditional Stack Overflow research:

    static public String findTheMainIP4AddressOfThisHost(){
        // needs java.net.NetworkInterface, java.net.InetAddress,
        // java.net.Inet4Address and java.util.Enumeration
        try{
            Enumeration<NetworkInterface> interfaces = NetworkInterface.getNetworkInterfaces();
            while(interfaces.hasMoreElements()){
                NetworkInterface ni=interfaces.nextElement();
                // skip interfaces that can't be carrying our LAN traffic
                if(!ni.isUp() || ni.isLoopback() || ni.isVirtual()) continue;
                Enumeration<InetAddress> addresses=ni.getInetAddresses();
                while(addresses.hasMoreElements()){
                    InetAddress ad=addresses.nextElement();
                    // skip loopback and IPv6 addresses
                    if(ad.isLoopbackAddress() || !(ad instanceof Inet4Address)) continue;
                    // first IPv4 address on a live interface wins
                    return ad.getHostAddress();
                }
            }
        } catch(Throwable t){ t.printStackTrace(); }
        return "127.0.0.1";
    }
       
I'll actually need to return the InetAddress itself, for when I break it down into bytes to broadcast.

Also today, I put the light rules up on netmash.net, as Cyrus resources: here, here and here. Saves creating new objects locally for every light.

I also moved the light object itself out of a text database file and into the Java code, so that each light is generated afresh with a new UID, which it didn't do before. This means I really do depend on that broadcast URL now, or I'll never be able to find my light object!

All this is heading towards the Java NetMash code itself setting up the BLE beacon. I don't think there's a Java API to the BLE stuff, so I'll have to call out to hciconfig and hcitool, which means battling with the input and output streams of Runtime.getRuntime().exec(), or using java.lang.ProcessBuilder.
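As a taster, here's the sort of ProcessBuilder wrapper I have in mind - only a sketch, with the advertising payload octets elided:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class BeaconSetup {

        // run an external command, merging stderr into stdout so there's
        // only one stream to drain - avoids the usual exec() deadlocks
        static String run(String... command) throws Exception {
            ProcessBuilder pb = new ProcessBuilder(command);
            pb.redirectErrorStream(true);
            Process p = pb.start();
            BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
            StringBuilder out = new StringBuilder();
            for(String line; (line=r.readLine())!=null; ) out.append(line).append('\n');
            p.waitFor();
            return out.toString();
        }

        public static void main(String[] args) throws Exception {
            // the payload octets would be built from the host IP, port
            // and object UID, as in the encoding from the other day
            run("hciconfig", "hci0", "up");
            run("hcitool", "-i", "hci0", "cmd", "0x08", "0x0008",
                "1f", "02", "01", "1a" /* ...rest of the payload octets... */);
            run("hciconfig", "hci0", "leadv", "3");
        }
    }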

Finally, I found this RGB LED globe bulb again today, after I lost it:


All the circuitry is on the top, there, even if covered in white paint. I can hack into the RGB LEDs with a PCB cutter tool I have, and some gentle soldering. Three quid from Maplin.

Sunday 29 December 2013

Cheap Remote Controlled Consumer Products for Hacking

As the picture below shows, I just bought an AuraGlow RGB LED bulb (left in the picture) and a set of remote-controlled power socket switches (right). Obviously, as you can see, I've started thinking about how to hack the controllers..


They're pretty cheap: the bulb was £20 from Maplin, but there's a cheaper GU10 one for £15; the power switches were three for a tenner from Clas Ohlson. That makes them appealing for the home hacker.

The bulb is driven by an IR signal, the switches by radio. I'm thinking I can just short the controller button pads with transistors driven by GPIO pins. Or if I can capture the bulb's IR protocol, it may be possible to send that directly. Hopefully there'll be separate R, G and B values sent, not just the preset colours on the controller.
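If the transistor trick works, "pressing" a button from Java should be as simple as pulsing a GPIO pin for a moment. A hypothetical sketch, assuming the pin is already exported as an output and wired to the transistor's base:

    import java.io.FileWriter;

    public class ButtonPresser {
        // simulate a button press by pulsing the GPIO pin that drives
        // the transistor shorting the controller's button pads
        static void press(int pin) throws Exception {
            FileWriter w = new FileWriter("/sys/class/gpio/gpio"+pin+"/value");
            w.write("1"); w.flush();  // transistor conducts: "button down"
            Thread.sleep(100);        // hold long enough to register
            w.write("0"); w.flush();  // "button up"
            w.close();
        }

        public static void main(String[] args) throws Exception {
            press(23);  // e.g. the controller's "on" pad wired via GPIO 23
        }
    }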

Both controllers are powered by a 3V battery, which makes it possible to power them from the Pi.

I'll let you know of any developments..

Meanwhile, here's what my Christmas Pi looks like now it's been assembled with the slot cut in the top:



Saturday 28 December 2013

Controlling RG(B) light from Android 3D view

Still no Blue, but I can set the LED light on the Pi from Red through Yellow to Green by touching the Android app's 3D object of the light.

To achieve this, the program has changed to allow the RGB light levels to be set from the JSON object's RGB numbers, by setting the mark-space ratios of R and G independently.
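Here's a simplified sketch of that change - the real code is tangled into NetMash, but the essence is one mark-space loop per colour channel, each taking its level from the object's RGB numbers:

    import java.io.FileWriter;

    public class RGLight {

        // RGB levels from the light object's JSON, each 0.0 - 1.0;
        // volatile so the network thread can update them at any time
        static volatile double red = 1.0, green = 0.5;

        // one mark-space loop per colour channel, on its own thread,
        // so the R and G ratios can be set independently
        static void pulse(final int pin, final boolean isRed){
            new Thread(){ public void run(){ try{
                FileWriter w = new FileWriter("/sys/class/gpio/gpio"+pin+"/value");
                int period = 16; // ms per mark-space cycle
                while(true){
                    int mark = (int)((isRed? red: green)*period);
                    if(mark>0)       { w.write("1"); w.flush(); Thread.sleep(mark); }
                    if(period-mark>0){ w.write("0"); w.flush(); Thread.sleep(period-mark); }
                }
            }catch(Exception e){ e.printStackTrace(); } }}.start();
        }

        public static void main(String[] args){
            pulse(23, true);   // Red on GPIO 23
            pulse(24, false);  // Green on GPIO 24
        }
    }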

Red:


Yellow:

Green:


You'll have to trust me that I can set the light colour by touching the 3D block. Until I figure out how to do videos anyway.

Also, spot the new black case and mini prototyping board fixed on top, there. I'll drill the holes soon, so that I can fit the top on..

Friday 27 December 2013

Mark-Space RGB LED Modulation through GPIO

Quick experiment to see if I can set a colour from NetMash on an RGB LED using mark-space modulation driven by the GPIO pins of my Christmas Pi.

I grabbed GPIO 23/24 from these pins:


and wired them into a Red-Green LED. Adding Blue should be easy once I have that going.

The core of the program to drive the LED is this:

        FileWriter l1 = new FileWriter("/sys/class/gpio/gpio23/value");
        FileWriter l2 = new FileWriter("/sys/class/gpio/gpio24/value");

        int mark = 0;     // current mark length, out of total
        int total = 1024; // mark + space per cycle
        int d = 8;        // ramp step; sign flips at each end

        while(true){
            int m=mark/64;         // mark time in ms
            int s=(total-mark)/64; // space time in ms

            l1.write("1"); l1.flush(); // red on, green off for the mark
            l2.write("0"); l2.flush();
            Thread.sleep(m);

            l1.write("0"); l1.flush(); // red off, green on for the space
            l2.write("1"); l2.flush();
            Thread.sleep(s);

            mark+=d;
            if(mark>=total || mark<=0) d= -d; // ramp back the other way
        }

This causes the LED I have to pulsate between red, yellow, green, yellow, red, ...

I caught it in its yellow state in this photo:


The earth wire takes the long way round and the battery is unused, in case you were suspicious.. Also I used HDR to take the picture, so that the light wouldn't wash out completely.

My main concern with this was the load on the CPU and the file buffers, constantly turning the light on and off at speed. Here's a run of vmstat 2, to ease my worried mind:

root@raspberrypi:/home/pi/Cyrus# vmstat 2
procs -----------memory---------- ---swap-- -----io---- -system-- ----cpu----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa
 0  0      0  37984   4848  41576    0    0     6     3  590  517  6  2 93  0
 0  0      0  37956   4856  41576    0    0     0    22  643  600  1  2 97  1
 0  0      0  37956   4856  41576    0    0     0    10  657  607  2  3 95  0
 0  0      0  37956   4856  41576    0    0     0     2  685  616  1  1 98  0
 0  0      0  37956   4864  41576    0    0     0    22  665  607  2  1 97  0
 0  0      0  37956   4864  41576    0    0     0     2  705  632  1  1 98  0
 0  0      0  37956   4864  41576    0    0     0     2  645  601  2  2 97  0

Everything has to run as root. Since these are small devices, not multi-user servers, that's not a big issue.
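One prerequisite I glossed over: the pins have to be exported and set as outputs before any of those value writes will work. That can be done from Java through the same sysfs interface - a sketch:

    import java.io.FileWriter;

    public class GpioSetup {
        // export a GPIO pin and set it as an output via sysfs; needs
        // root, like everything else here. Note that exporting a pin
        // that's already exported throws an IOException
        static void setupOutput(int pin) throws Exception {
            FileWriter export = new FileWriter("/sys/class/gpio/export");
            export.write(""+pin); export.flush(); export.close();

            FileWriter direction = new FileWriter("/sys/class/gpio/gpio"+pin+"/direction");
            direction.write("out"); direction.flush(); direction.close();
        }

        public static void main(String[] args) throws Exception {
            setupOutput(23); // Red
            setupOutput(24); // Green
        }
    }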

Thursday 26 December 2013

CoAP and FOREST

It looks like there could be a cleaner mapping from my FOREST architectural style to CoAP than there is to HTTP.

FOREST stands for "Functional Observer REST" - it's based on objects observing each other's states, either locally or over the network.

FOREST uses GET and POST to transfer object state between objects. GET is used to read or poll the state of a linked object, and POST to notify either a new object state of interest or an updated object state to a peer object already known to be interested.
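In plain HTTP terms, the observer side of that boils down to something like this - a bare java.net sketch, not the actual NetMash implementation, which has its own HTTP stack:

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class Forest {

        // GET: read or poll the current state of a linked object
        static String observe(String url) throws IOException {
            HttpURLConnection c = (HttpURLConnection)new URL(url).openConnection();
            BufferedReader r = new BufferedReader(new InputStreamReader(c.getInputStream()));
            StringBuilder json = new StringBuilder();
            for(String line; (line=r.readLine())!=null; ) json.append(line).append('\n');
            return json.toString();
        }

        // POST: push our own new or updated state to a peer object
        // that's known to be interested
        static void notifyPeer(String url, String json) throws IOException {
            HttpURLConnection c = (HttpURLConnection)new URL(url).openConnection();
            c.setRequestMethod("POST");
            c.setRequestProperty("Content-Type", "application/json");
            c.setDoOutput(true);
            OutputStream o = c.getOutputStream();
            o.write(json.getBytes("UTF-8")); o.close();
            c.getResponseCode(); // complete the exchange
        }
    }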

CoAP has observation built-in, unlike HTTP.

I implemented my own HTTP stack in NetMash, but CoAP seems a bit more complicated, even when I take out the bits I don't need. Still, I'll be keeping CoAP on my radar.

Wednesday 25 December 2013

Raspberry Pi for Christmas Dessert

Here's my main prezzie today:


It's a Model A, so no Ethernet, half the RAM and only one USB. But cheap as chips.

I also got the camera module, which I'm hoping I can use as a light sensor to trigger the light object in my other Raspberry Pi, which will be wired to an RGB LED.

I'm powering this with a cable back from the USB hub, which saves me a power socket.

Tuesday 24 December 2013

Links between Thing Objects

The other day I mentioned in my IoT Manifesto how things should "look the same if they are the same". In the Object Network, this means that any light Thing produces the same format of JSON on its network interface as any other, even if one is connected to its host by ZigBee and the other by Z-Wave.

That way, the JSON data from all the Things in your house, or in the world, can be linked up, because anything that understands Thing A's JSON can be expected to understand the format it fetches when it jumps a link from Thing A to Thing B.

Conversely, if you want links between Things, you'd better make sure all your Things are talking the same language.

The Object Network vision is to create a global "fabric" of sensors and actuators and other logically-abstract objects, all linked up. Like the original vision of the Web, but with data.

As obvious and beneficial as this may seem, there are no initiatives that I could find promoting such an approach towards interoperability.

The nearest contenders in this rough area were the closed-looking industry initiative AllSeen/AllJoyn, the "Open Source Internet of Things" (see also this slide deck) with its academic feel, and the "Thing System".

I'll dig further to see what I can find out about them, and look for any further examples. At first glance, though, standard linked-up data formats don't appear to be their main feature.


Monday 23 December 2013

Screenshots of Android app viewing Raspberry Pi Light in 3D

It was only a couple more hours of coding in my NetMash (currently still called "Cyrus") app, to turn the command-line success of yesterday into the screenshots of today.

Here's what it looks like once the app has discovered a broadcasting object - our light:


The name comes from the Raspberry Pi object whose link was discovered over BLE as I described yesterday.

Here's what the actual Cyrus object behind that screen looks like in the raw:


I've put the link from BLE into a list, and dropped the RSSI into there as well for debugging purposes, but it needs to be associated with each entry in the link list.

When I jump that link to the "Light" object in the first screen above, I see what object the Raspberry Pi is advertising:


This is a simple cuboid representation of my light. It's currently purple. I can change its colour by touching it: the app takes the point touched and uses its (x,y,z) to set the RGB.
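That mapping is about as direct as it sounds - something like this, with hypothetical names:

    public class TouchColour {
        // map the (x,y,z) of the point touched on the cuboid straight
        // onto an RGB triple, clamped into the 0.0 - 1.0 colour range
        static float[] touchToRGB(float x, float y, float z){
            return new float[]{ clamp(x), clamp(y), clamp(z) };
        }
        static float clamp(float v){ return Math.max(0f, Math.min(1f, v)); }
    }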


And again, here's the raw Cyrus form of that light. It's a bit more complicated than my example the other day, but that's the way programming is. It's partly because I'm using a 3D object to represent it.


You should at least recognise the R-Pi's URL at the top by now, plus the properties "Rules", "light", "within" and "light-sensor" in there.

You'll see if you look that I had to turn off Bluetooth to get this to work over WiFi. Apparently the Nexus 4 doesn't let you have both on at the same time, which is a bit of a drawback.

So that's the first signs of an AR view over an IoT object!

Sunday 22 December 2013

Successful advertising of a Thing URL over BLE

On Friday I proposed an encoding in BLE for advertising a Thing's URL. A little bit of coding later, and I can confirm that it seems to work fine.

Here's the incantation I made at the Raspberry Pi end, using the proposed encoding of an Object Net URL:

hcitool -i hci0 cmd 0x08 0x0008 1f 02 01 1a 1b ff 4c 00 02 16 c0 a8 00 12 1f 92 c0 93 a9 08 a9 d8 f1 c1 00 00 00 00 00 00 00 00

In that payload, the octets c0 a8 00 12 are the IP address in hex (192.168.0.18), the next two, 1f 92, are the port number (8082), and the following eight, c0 93 a9 08 a9 d8 f1 c1, are the UID.

Here's the Android log output, partly from my own code:

I/bt-hci  (12892): BLE HCI(id=62) event = 0x02)
I/bt-hci  (12892): btu_ble_process_adv_pkt
D/BtGatt.btif(12892): btif_gattc_upstreams_evt: Event 4096
D/BtGatt.GattService(12892): onScanResult() - address=00:02:72:C6:F1:B3, rssi=-61
D/BluetoothAdapter(18229): onScanResult() - Device=00:02:72:C6:F1:B3 RSSI=-61
I/System.out(18229): ---Cyrus---Thread[Binder_3,5,main]--- 172712,1285
I/System.out(18229): xxxxxxx BLE device: :: 00:02:72:C6:F1:B3 :: -61 :: 02 01 1a 1b ff 4c 00 02 16 c0 a8 00 12 1f 92 c0 93 a9 08 a9 d8 f1 c1 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 
I/System.out(18229): ---Cyrus---Thread[Binder_3,5,main]--- 172713,1
I/System.out(18229): xxxxxxx URL :: http://192.168.0.18:8082/o/uid-c093-a908-a9d8-f1c1.json

There's my URL reconstructed via a simple String.format. The -61 is the signal strength (RSSI) that we'll need later.
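For the curious, that reconstruction lives in the BLE scan callback, roughly like this - a trimmed-down sketch against the Android 4.3 API, with the byte offsets assuming the advertisement layout from Friday's post:

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;

    public class ThingScanner {

        // offsets into the advertisement: after the flags and
        // manufacturer-data headers come IP (4 octets), port (2), UID (8)
        static String urlFrom(byte[] ad){
            int port=((ad[13]&0xff)<<8)|(ad[14]&0xff);
            return String.format("http://%d.%d.%d.%d:%d/o/uid-%02x%02x-%02x%02x-%02x%02x-%02x%02x.json",
                ad[9]&0xff, ad[10]&0xff, ad[11]&0xff, ad[12]&0xff, port,
                ad[15],ad[16],ad[17],ad[18],ad[19],ad[20],ad[21],ad[22]);
        }

        static final BluetoothAdapter.LeScanCallback callback =
            new BluetoothAdapter.LeScanCallback(){
                public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord){
                    System.out.println("Thing: "+urlFrom(scanRecord)+" rssi="+rssi);
                }
            };

        // kicked off with BluetoothAdapter.getDefaultAdapter().startLeScan(callback)
    }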

Next step is to get the lifecycle, threading and polling of this BLE scanning all sorted out in the Android app so that I can get this to the user quickly and reliably, in the form of a list of URLs to nearby Things ordered by signal strength. They can then choose a link in order to view and interact with it in 3D.

You'll need screenshots, obviously.

Saturday 21 December 2013

Technology trends, making money and social progress

Here in ThoughtWorks we've just been challenged by our CTO, Rebecca Parsons, to suggest interesting trends in the software industry.

My response was an enumeration of topics that are heading towards the vision I offered in my IoT Manifesto:

To invisibly merge the real and the virtual, creating ambient and ubiquitous interaction, and to empower people over the control and sharing of their physical items, virtual data and the rules that animate them.
  1. Internet of Things (not a bad summary here: http://www.datainnovation.org/2013/11/the-internet-of-things/ )
  2. P2P redux - empowering the user, cutting out the middle (pushes against cloud-dependent IoT :-) - IPv6, whitespace
  3. Massively parallel processing - congruent to the IoT: vast arrays of simple processors
  4. Declarative programming languages that can be used to program these arrays of Things and processors
  5. End-User Declarative Programming - real-time manifestation of effects of user's animation rules: like HTML/CSS, Excel, IFTTT
  6. 3D (OpenGL ES2.0 specifically, or WebGL) as the normal interface, replacing 2D/document/desktop metaphors
  7. Minecraft-style interaction simplicity and power in the 3D interface, to create and explore
  8. Augmented Reality, via smartphones, glasses or projection, holographic 3D displays, wall-sized screens
  9. Gestural interaction by hands and body not just fingers - Wii, Leap Motion style, 3D sensing, Minority Report style
  10. 3D printing, robots, telepresence robots

ThoughtWorks' Three Pillars

As you'll find out within a couple of minutes if you jump the link under Rebecca's name up there, in ThoughtWorks we are guided by our "Three Pillars": Pillar 1 being the core business, Pillar 2 being promoting software excellence and innovation, and Pillar 3 being making a tangible contribution to socially-progressive projects. These are not just corporate woolly-speak: in TW they actually mean something.

Rebecca's challenge was coming from a Pillar 2, innovation perspective, but it is possible to see the application of these trends in the other two Pillars.

For the Pillar 1 angle on my list - business value - I suggested the following:

"Visualisation within Data Science could bring in those Minority Report interfaces.. Maybe Data Scientists would be more comfortable and productive using Excel-style formulae programming. Mapping formulae to parallel hardware sounds like a fun project. Imagine a Data Scientist interface allowing quick sketches of rules and reductions that manifest instantly in a 3D visualisation.. :-)  This could be built today pretty easily."

For Pillar 3 - social progress:

"All of my list is highly relevant to our P3 efforts. Cheap, empowering, sharing technology. From sensing forests being illegally hacked to 3D printed prosthetics. Lots of cheap hardware all wired up, bypassing the multinationals!"


Friday 20 December 2013

Advertising an Object URL with Bluetooth LE

In a post the other day, I described the magic incantation to set the advertising data in a BLE beacon:

hcitool -i hci0 cmd 0x08 0x0008 1e 02 01 1a 1a ff 4c 00 02 15 e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0 00 00 00 00 c5 00

Today I'd like to sketch out some thoughts about how to set a URL in that data, as I discussed in yesterday's post.

First step is to break it down into bits. Following the page I linked to before on The Register, plus extra information from this page on StackOverflow, it seems that this is what the various parts mean:

0x08 0x0008 - command: HCI_LE_Set_Advertising_Data
1e - 30 octets follow out of 31 max
02 01 1a - 2 octets follow, Bluetooth flags
1a - 26 octets follow
ff 4c 00 - manufacturer data: Apple specific
02 15 - iBeacon: 21 octets follow
e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0 - UUID
00 00 00 00 - Major/Minor
c5 - calibrated strength
00 - padding to reach the 31-octet maximum length

Now some of this is iBeacon-specific, and we're not trying to play inside Apple's walled garden here. But I don't yet know the magic codes for sending data that isn't manufacturer-specific, so at the cost of losing some valuable octets, we can pretend to be Apple for now. Then again, what is Apple's magic for manufacturer data that isn't an iBeacon? We'll have to lose two more octets and start our payload where the UUID was. We can at least (hopefully) regain that lost final octet by bumping all the lengths up by one to use the whole available space:

0x08 0x0008 - command: HCI_LE_Set_Advertising_Data
1f - 31 octets follow out of 31 max
02 01 1a - 2 octets follow, Bluetooth flags
1b - 27 octets follow
ff 4c 00 - manufacturer data: Apple specific
02 16 - faux-iBeacon type: 22 octets follow
68 74 74 70 3a 2f 2f 66 6f 6f 2e 63 6f 6d 2f 35 31 2e 6a 73 6f 6e
- "http://foo.com/51.json"

Here we've used all 22 characters to put in a string URL. Hm. 22 characters isn't very long, and this URL has a lot of redundancy: it requires us to find a short domain name, and it depends on DNS to expand that host part into what would be a much longer IP string. Also, there's no room for a port number, and it would be a bit odd using DNS to resolve to a NAT address, so this would only really be useful for public IoT installations.

How about using a link-shortening service? Here is the 21-char URL of my Raspberry Pi hosted Cyrus light object on my LAN, created by Bitly:

http://bit.ly/19i3iqr

So yes, we can put any URL we like into those 21 characters, but now we depend on both the DNS and the Bitly public-Web services!

So we're forced to go back to the octets. What about IPv6? Every Thing has its own IP address! We could put in either the direct IPv6 address of our JSON object, which is 16 octets, or come up with a convention to use IPv4 4-octet addresses plus a unique ID for the object - which is what I use in the Object Network anyway: I currently use 8-byte hex-rendered "UIDs", but there's no limit on their length.

So, not forgetting the port number, the Object Net URL:

http://192.168.0.18:8082/o/uid-c093-a908-a9d8-f1c1.json

Would be reduced down to:

192 168 0 18, 8082, c093 a908 a9d8 f1c1

Which would fit into just 14 octets! With 16-byte Object Net UIDs, we'd hit our 22-octet limit, so we'd hopefully have figured out the whole Bluetooth advertising thing by then, to get the extra space back.
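Packing those URL parts down into octets is simple enough. A sketch of the idea - the UID parsing here is my own hypothetical helper, not NetMash code:

    import java.io.ByteArrayOutputStream;
    import java.net.InetAddress;

    public class UrlPacker {
        // pack an IPv4 address, port and 8-byte hex UID into the
        // advertisement payload: 4 + 2 + 8 = 14 octets for the URL above
        static byte[] pack(String host, int port, String uid) throws Exception {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(InetAddress.getByName(host).getAddress()); // c0 a8 00 12
            out.write((port>>8)&0xff); out.write(port&0xff);     // 1f 92
            String hex = uid.replace("uid-","").replace("-","");
            for(int i=0; i<hex.length(); i+=2)                   // c0 93 a9 08 a9 d8 f1 c1
                out.write(Integer.parseInt(hex.substring(i,i+2),16));
            return out.toByteArray();
        }

        public static void main(String[] args) throws Exception {
            for(byte b: pack("192.168.0.18", 8082, "uid-c093-a908-a9d8-f1c1"))
                System.out.printf("%02x ", b);  // prints the 14 octets
            System.out.println();
        }
    }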

A couple more octets would need to be used to choose either IPv6 or this IPv4 scheme, and to switch on TLS, choose JSON or Cyrus formats, etc.

It does mean that the arbitrary convention I started using for paths to Object Net UIDs - prefixing them with "/o/" - would become baked in, but that's not too restrictive, I think.

It also prevents us using the indirection of DNS if we should need to, when broadcasting URLs from beacon Things. I haven't thought through how much of an issue that would be in the IoT and Object Network future. I'll let you know..

I'll try all this out and report back on how I got on.

Thursday 19 December 2013

Discovery and Set-up in the Object Network IoT

When we have bought our light and turned it on, what happens next?

In today's episode I'll sketch out how discovery and set-up would work for the Object Network Internet of Things.

This is the initial sequence of low-level events:
  1. The light scans for beacons around and picks the strongest one
  2. It gets connected to the LAN, either by getting the WiFi details from its neighbour..
  3. .. or through a BLE connection via its neighbour
  4. It gets the URL of the nearest neighbour Thing from its advertisement
  5. It advertises its own URL on its own beacon
These steps get the light connected to the local net and result in the URL of a nearest neighbour object to put into the light object to get it kicked off.

It can then go ahead and execute the following steps as simple Cyrus rules, which will trigger the underlying FOREST interactions over the LAN (basically just GETs and POSTs of JSON):
  1. It GETs the neighbouring Thing's JSON over the LAN from that URL
  2. It gets that Thing's "place" (room) URL - assuming it's in the same room
  3. It sets its own place/room to that URL, then POSTs its own JSON to that place
  4. The place adds the light as a new Thing in the room with a link back to it
  5. The light scans other Things linked to by the room to find any light sensor
  6. The light runs that rule I described yesterday, to decide whether or not to turn on
Steps 1-3 are encoded in just this one rule:

{ is: light rule
  within: # => @nearest-neighbour:within
  Notifying: => @. with @within
}

This grabs the "within" place URL off the nearest neighbour Thing that was set by the first steps, then sets up notifications to that place, so that it can see the light is now here.

The "#" ensures that this rule only fires if the within property hasn't already been set. The symbol "@." refers to the property being rewritten, here the "Notifying" property which is the current list of URLs to be notified of changes. The symbol "with" means "ensure that this list includes the following" - here the "@within" URL.

Step 4 is this rule:

{ is: place rule
  Alerted: { within: @ }
  sub-objects: => @. with { object: @Alerted }
}

The other end of a Notifying is an Alerted, which is where the place object gets to see the light object's state. It then adds an entry for it in its list of sub-objects.

Step 5 is this rule:

{ is: light rule
  within: { sub-objects: { object: { is: light-sensor } } }
  light-sensor: # => @=within:sub-objects:object
}

This match will jump the place/room link and then all the links of all the Things in the room, searching for a Thing that "is:" a "light-sensor". The "@=" syntax means only get the object links that have been matched by the rule, not all of them. There may be just one, or a list of them.

Additional steps would be needed for the light to triangulate an approximate location in the room for itself from other neighbours' location and signal strength.

Wednesday 18 December 2013

Cyrus rule to turn on a light when it gets dark

Here is what a Cyrus rule could look like, to turn on that light that I mentioned yesterday.

There are two Cyrus objects, the first representing the light, and the other representing the light sensor which is on a different host (e.g. another R-Pi):

{ Rules: uid-RR
  is: light
  light: 0 0 0
  light-sensor: http://X.X.X.X/o/uid-SS.json
}

{ is: light-sensor
  light-level: 200
}

The light object is currently off (RGB=0,0,0). It has a link to the light sensor (uid-SS) whose level is 200.

Here's the rule object that turns on the light when the light level falls below 100:

{ is: light rule
  light-sensor: { light-level: < 100 }
  light: => 1 1 1
}

The light object has a link to this rule (uid-RR). It's on the same host, so doesn't need a full URL.

This rule will attempt to pattern-match to the light and rewrite it if it matches. The "=>" symbol - pronounced "becomes" - is the place where the light is turned on (RGB=1,1,1) if the condition "< 100" matches the light level.

This rule transparently jumps the light-sensor link to see into the light sensor object as if it were inlined, which triggers an observation of the sensor over the network. Whenever the light level is updated, the rule will be re-triggered.


End User Programming

Now clearly I'm not expecting just anyone to be able to program like this, but it's possible to see how close this language is to something that anyone could access through a GUI, perhaps in the style of If This Then That.

People who are competent in Excel formulae or HTML and CSS will have no trouble at all programming their Things in Cyrus.

Tuesday 17 December 2013

Manifesto for the Internet of Things

My and others' vision for the Internet of, well, Everything is:

To invisibly merge the real and the virtual, creating ambient and ubiquitous interaction, and to empower people over the control and sharing of their physical items, virtual data and the rules that animate them.

This requires a low-friction, seamless, open, automated IoT ecosystem, so that people can feel in control of their Things and trust the IoT to work for them almost unconsciously.

Towards that goal, here is a Manifesto for the Internet of Things:
  1. I want adding a Thing to be as simple as switching it on
  2. I want all the Things I own to work together, to look the same if they are the same
  3. I want to be able to see and own all the data created by my Things
  4. I want to be in full and sole control of my Things
  5. I want to only see what I need to see and for everything else to be automatic
  6. I want to be able to see and easily modify the rules that are running my Things

Networked Light Example

Take the vanilla example of a networked light with a manual switch:
  1. I should be able to just plug in the light and it should be ready to go
  2. It should look exactly like all the other lights in my house on the network so that I can turn it off with all the others, and it should automatically detect the light level sensor
  3. I, and only I, should be able to see that it's been turned on or off manually, unless I share that information
  4. I, and only I, should be able to turn it on or off remotely, unless I share that ability
  5. The base rule could be "on if room occupied and it's dark"; and I don't need a notification if someone overrides the rule by hitting the manual switch
  6. I should be able to see that rule and modify it to suit my lifestyle
Being able to turn on a light using a dedicated app locked in to a proprietary cloud server after half an hour of configuration and logging in is low value, high cost.

If you are already in the app because it works for everything in the house, and it knows where you are so the light is just there in view a few seconds after plugging it in for the first time, and the interaction is direct over the local network, then the cost-value balance will tip. If the light just turns itself on and off automatically instead of you doing it, but still lets you manually override that, then it becomes compelling - more so if you can easily adjust the rules for that behaviour.


Monday 16 December 2013

The Plan: an AR view of the IoT

My goal is to build an Augmented Reality view over a house full of IoT sensors and actuators.

The approach is to start with some Raspberry Pies running sensors or actuators and exposing them with REST interfaces, then to turn them into BLE beacons and use that to drive and position an Android Augmented Reality app.

The beacons give distance information to place the AR view, as well as broadcasting the URLs linking to the R-Pies' objects being viewed, to be fetched over WiFi.

Those objects will carry coordinate information to anchor them in the combined view; the 3D objects of many R-Pies will be linked up into a virtual place overlaying your house. Then you will be able to see what's going on in the R-Pies virtually in the form of 3D objects at those URLs.

You spark up the Android app and, as you wander around the house, you would see an AR view of your IoT world: the state of all your sensors and actuators, which you could interact with.

You could see the temperature or occupancy in every room, check the doors and windows, read the weather centre in the garden, see if you have a phone message and check on the house electricity consumption. You could touch actuators to turn on the lights, set the temperature, set your home media centre to record a show, etc.

Then you would also be able to see other, completely virtual, objects such as 3D signs, whose text is some derivative of the state of the IoT world, such as "all doors and windows locked" or a big red button labelled "turn off all the lights".

You could switch to moving by on-screen controls, so that you could interact with your whole house from the comfort of your armchair or bed. By setting a bookmark, you could even visit your house when away, and wander around it from the poolside.

Ultimately, you'd have R-Pies all around your house, perhaps talking to devices directly from the I/O pins, perhaps through Bluetooth, ZigBee or Z-Wave. Each R-Pi would publish its objects into the shared, linked-up virtual world.

The interactions and mashups between your things and to and from the all-virtual objects will be user-programmable with simple rules that match conditions and set new states.

Sunday 15 December 2013

Bluetooth 4.0 LE Beacons on the Raspberry Pi

To make a Raspberry Pi act like an iBeacon, there are two great pages to read here and here. TL;DR: build bluez-5.11 or greater from source, plug in your USB dongle, then type some command line incantations into the Bluetooth subsystem (see below).

Below is a picture of the four BLE USB dongles that I discovered have a compatible chip in them, the Broadcom BCM20702A0:


The IOGear one (used in the linked articles above) is hard to track down here in the UK. However, the Trust and Belkin are available in the UK high street at PC World. The Plugable [sic!] also seems easy to get, by mail order.

You may need to tell the Bluetooth driver module to start and to recognise the device, like this:

modprobe btusb
echo 0a5c 21e8 >> /sys/bus/usb/drivers/btusb/new_id

The Belkin is the odd one out, having different Vendor/Product IDs. This requires the following incantation to get the btusb driver attached:

modprobe btusb
echo 050d 065a >> /sys/bus/usb/drivers/btusb/new_id

If you type usb-devices and look for the device, you'll see the Broadcom BCM20702A0 entry and those two codes up there. It may say Driver=(none) until you do one of the above commands, when it'll turn into Driver=btusb.

The magical incantations I mentioned are:

hciconfig hci0 up

hcitool -i hci0 cmd 0x08 0x0008 1e 02 01 1a 1a ff 4c 00 02 15 e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0 00 00 00 00 c5 00 00 00 00 00 00 00 00 00 00 00 00 00

hciconfig hci0 leadv 3

It probably goes without saying that plugging the dongle directly into the R-Pi will bring it down, due to the power surge, but I found that it rebooted eventually and carried on fine. Or just use a hub.

If, like me, you want to do this four or six times, you'll need the UUID or major/minor fields in the long string above to differ. See the linked articles above for details.

Check your beacons are working using this great Android app. It has a distance-to-beacon mode in big monochrome fonts that look impressive as you wander the house, waving your phone around.

Saturday 14 December 2013

60 Days of Things

This blog is a diary of my involvement in ThoughtWorks' "100 Days of Hardware" global theme. I started late, so for me it's about 60 Days of Hardware.

I intend to use it to document my experiments in Raspberry Pies, Bluetooth beacons, Augmented Reality and the Internet of All The Things.  Or not exactly all of them: maybe three LEDs and a light sensor.