Secure App Access to Raspberry Pi Camera

In this post we will show you how to set up a standard USB webcam with a Raspberry Pi (RPI) and the AppMyProduct service for easy access and high security. There are already many good guides out there that demonstrate how to set up a webcam with the RPI, but they all leave the reader with a combination of firewall hassle and very poor security: you have to set up port forwarding in the WAN router to access the RPI, and there is no encryption or access control. That means a lot of hassle and a real risk of strangers peeking through your webcam.

With a few additional, simple steps as compared to these existing guides, you can eliminate the firewall hassle, add strong encryption and a robust access control mechanism – Nabto’s AppMyProduct to the rescue!


RPI + Logitech C270

Once you are done, you will be able to securely access your RPI webcam from anywhere through the AppMyProduct Video App – without any firewall fiddling or messing with DynDNS and self-signed certificates. As you will see below, the apps are freely available from the app stores – and the source code is available so you can tweak it yourself. It is a hybrid app, so you can make many interesting changes and additions knowing just basic HTML.

What you need

  1. A Raspberry Pi board with a Raspbian image installed. We have tested with 2nd and 3rd generation boards; the 1st generation will most likely also work like a charm.
  2. A camera connected to the RPI; this guide assumes a USB webcam but you can use anything that works with the Motion software package, including Raspberry Pi camera modules.
  3. An iOS or Android device.
  4. An AppMyProduct account – create one for free on www.appmyproduct.com, no credit card necessary and you can add several devices completely free of charge for development, testing, home and educational use.

 

Step 1: Log on to the RPI

Log on to your RPI, either directly if you have it connected to a display with keyboard or through ssh from a computer on your network:

$ ssh pi@192.168.1.207
pi@192.168.1.207's password:
...
Last login: Sat Sep 9 09:14:12 2017 from mclappe2.home
pi@rpi:~ $

The default password for the pi user is raspberry.

 

Step 2: Install software

The following must be installed:

  • Motion – webcam software
  • CMake, Git and Ninja – to build the Nabto remote access software

Log on to the RPI as described above and perform the following steps:

$ sudo apt-get update
...
Fetched 9,707 kB in 22s (439 kB/s)
Reading package lists... Done
$ sudo apt-get install motion
...
$ sudo apt-get install git cmake ninja-build
...
Unpacking ninja-build (1.3.4-1.2) ...
Processing triggers for man-db (2.7.5-1~bpo8+1) ...
Setting up ninja-build (1.3.4-1.2) ...

If you already have a running webcam and just want to set up remote access, only the apt-get update and the git/cmake/ninja-build install commands are necessary.

Step 3: Configure the webcam software

In this step we must edit two configuration files on the RPI. If you have a favorite editor and know your way around, just do the following:

  • set “daemon on” and “stream_localhost on” in /etc/motion/motion.conf
  • set “start_motion_daemon=yes” in /etc/default/motion
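For reference, the relevant lines in the two files end up looking roughly like this (excerpts only; everything else can stay at its defaults):

# /etc/motion/motion.conf (excerpt)
daemon on
stream_localhost on

# /etc/default/motion
start_motion_daemon=yes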

For a bit more detail:

Log on to the RPI as described above and start the nano editor in the console to enable the webcam software to run as a background service:

pi@rpi:~ $ sudo nano /etc/motion/motion.conf

This opens the nano editor on the RPI (it is highly recommended to learn another editor like vi or emacs if you are going to edit a lot of files, but nano seems to be a popular choice in tutorials). Change “daemon off” to “daemon on”.

Next, as an extra security measure, you can disable insecure remote access to the camera. You can do this by setting “stream_localhost on” in the same file (when we install the secure remote access software in a moment, it will access the camera from localhost).

The tricky part is to exit the nano editor: press control-x (to exit), press “y” (to save) and press enter (to accept the suggested filename).

Next, we must start the webcam software when the RPI boots:

pi@rpi:~ $ sudo nano /etc/default/motion


Set “start_motion_daemon=yes” and exit as described above (control-x, “y”, enter).

Finally, start the webcam service:

pi@rpi:~ $ sudo service motion start

 

Step 4: Build the remote access software

Perform the following steps on the RPI (ie, log on to the device or continue in the shell from above).

First, retrieve the remote access source code from the github repo https://github.com/nabto/unabto.git:

$ mkdir git
$ cd git
$ git clone https://github.com/nabto/unabto.git
Cloning into 'unabto'...
remote: Counting objects: 5745, done.
remote: Total 5745 (delta 0), reused 0 (delta 0), pack-reused 5745
Receiving objects: 100% (5745/5745), 2.63 MiB | 1.55 MiB/s, done.
Resolving deltas: 100% (3388/3388), done.

Next, run cmake to prepare the build of the uNabto tunnel service. The cmake command on the fourth line below is quite long, so scroll a bit to see it all, or look further down where it is broken across multiple lines:

$ cd unabto/apps/tunnel
$ mkdir build
$ cd build
$ cmake -DCMAKE_BUILD_TYPE=Release -DUNABTO_CRYPTO_MODULE=openssl_armv4 -DUNABTO_RANDOM_MODULE=openssl_armv4 -GNinja ..
-- The C compiler identification is GNU 4.9.2
-- The ASM compiler identification is GNU
-- Found assembler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc
...
-- Generating done
-- Build files have been written to: /home/pi/git/unabto/apps/tunnel/build

Broken across multiple lines, the cmake invocation looks as follows:

cmake -DCMAKE_BUILD_TYPE=Release \
-DUNABTO_CRYPTO_MODULE=openssl_armv4 \
  -DUNABTO_RANDOM_MODULE=openssl_armv4 -GNinja ..

Just FYI – the Ninja tool used is an alternative to standard make tool; it builds much faster on the RPI than make, likely because Ninja has optimized filesystem access on the quite slow RPI.

Now you can build the tunnel:

pi@rpi:.../build $ ninja
[77/77] Linking C executable unabto_tunnel

Check that you can execute the unabto_tunnel binary (./unabto_tunnel) and then copy it into /usr/bin:

pi@rpi:.../build $ sudo cp unabto_tunnel /usr/bin

 

Step 5: Create an AppMyProduct device

For this step, you need an AppMyProduct account – go to https://www.appmyproduct.com and sign up if you haven’t done so already.

Next, create a product. You don’t have to tick the “Free product” checkbox – you get a few regular licenses for free with your account (in this way you don’t have to see ads for the first devices you use).


Next, generate a license by clicking the “Generate license” button and entering 1 as quantity.


You now have everything you need to run the tunnel on the device – the device id and the key revealed when clicking “Show License Key” are used in the next step.


Step 6: Configure the tunnel

Log on to the RPI or continue your session above. Edit the file ./git/unabto/apps/tunnel/scripts/unabto_tunnel.conf:

pi@rpi:~ $ cd git/unabto/apps/tunnel/scripts
pi@rpi:.../scripts $ nano unabto_tunnel.conf

Replace the device id and key fields with what you retrieved from the AppMyProduct portal in step 5.


The port number 8081 matches the default configuration of the Motion webcam service; change this if you use a different port number.

Install the startup script and config file from the scripts directory:

$ sudo cp unabto_tunnel.conf /etc
$ sudo cp unabto_tunnel_initd /etc/init.d/unabto_tunnel
$ sudo chmod +x /etc/init.d/unabto_tunnel
$ sudo update-rc.d unabto_tunnel defaults
$ sudo /etc/init.d/unabto_tunnel start

Now everything should be working like a charm! Before actually firing up the app and enjoying remote access to your webcam, let’s check everything looks ok:

$ ps auwwx | grep unabto_tunnel | grep -v grep
pi  16300 0.1 0.2 18428 1932 ? 21:00 0:00 /usr/bin/unabto_tunnel /usr/bin/unabto_tunnel -d vtquiht9.xmnqf.appmyproduct.com -s -k

If nothing is seen as output of the ps command, go back to step 4 and make sure you can execute the binary after building it – and that you copied it into /usr/bin. If this does not help, write a comment below with your error message.

Also check the webcam service is running fine:

pi@rpi:.../scripts $ ps auwwx | grep motion | grep -v grep
motion 15762 1.4 1.1 53716 10020 ?    Sl  20:31  1:13 /usr/bin/motion

If nothing is seen here, make sure everything was installed correctly in step 2. Also, check Motion’s FAQ.

Step 7: Access the webcam

Visit Apple’s App Store or Google Play to download the AppMyProduct Video app. Start the app on the same network as the RPI to discover the device by tapping “Add new”, then pair on the subsequent screens.

After pairing, you can see the actual webcam feed from anywhere – your communication with the camera is encrypted and only paired clients are allowed access. You can change settings for the device to control remote access:

On the security settings page, you should disable “Open for pairing” when you don’t want to allow further clients access to the camera. In the Access Control List you can see all the paired clients, i.e. who has access. You can remove an individual user’s access by tapping the entry.

Note that the “player” is extremely simple – it can only be used to show the default MJPG stream from the webcam. In a later blog post in this series we will add support for more advanced players for better video quality and also for adding sound.

You can download the source code for the player app from the ionic-starter-nabto-video github repo – it is a hybrid app you can edit by knowing just basic HTML. We will get back to this in a later post, demonstrating how to add new features to the webcam app.

Further Reading on Security

Some of you have asked for a bit of elaboration on how access control works in more detail and how we make sure only the intended users can access the device. It is a bit out of scope to cover in full detail in this post – but the AppMyProduct Video app and the tunnel software installed on the RPI use the Nabto platform under the hood for communication.

The platform provides a few different ways to control access; in this project we use the “Paired Public Key Authentication” approach outlined in section 8.2 of TEN036 “Security in Nabto Solutions”. Basically, this ensures that only clients possessing the private key of an RSA keypair whose public key has been installed on the RPI are allowed to access the camera. In addition to the introduction above, you can read more about it in the Fingerprint ACL module documentation in the uNabto SDK.

For a more general introduction to security in the Nabto platform, the remaining sections of TEN036 “Security in Nabto Solutions” are a good read – and we also have a short introduction in this blog post.

 

The Roomba (Part 2 of 2)

Please read Part 1 to read about the preliminary steps in the Roomba saga.

This part will be the fun part where we turn the Roomba into a fully fledged remote controlled vehicle fitted with realtime video feed as well!

If at any time you feel like trying this yourself, you can sign up for a free developer account and/or read more at developer.nabto.com.

In detail, this part will deal with

  • Turn the Roomba into a remote controlled car
  • Add sound effects
  • Add sensor data
  • Add video streaming

The RC Roomba

Turning the Roomba into a remote controlled car is really easy once the groundwork is done correctly (see Part 1).

All we need are the new opcodes given to us in the documentation:

  • Drive straight (20 cm/s)
    • Using W
    • {137, 0, 255, 128, 0}
  • Reverse (20 cm/s)
    • Using S
    • {137, 255, 0, 128, 0}
  • Rotate counterclockwise
    • Using A
    • {137, 0, 255, 0, 1}
  • Rotate clockwise
    • Using D
    • {137, 0, 255, 255, 255}

This is all we need on the device side of things.
For the html part I simply chose to create a text input field which detects when a certain character is pressed in accordance with the keys given above. Say, for example, that you press W. The Roomba will now drive in a straight line until the user sends a new command.
If a key is pressed which does not have a predefined action, then the Roomba will stop driving. I usually use space to stop.
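On the device side this simply means picking one of the byte sequences above based on the character that arrives; a minimal sketch (reusing the serial fd from Part 1, and assuming the pressed key character is what arrives in the Nabto request – the stop bytes are also an assumption since the post does not list them):

// Send the drive command (opcode 137) matching a received key
void drive_from_key(int fd, char key)
{
    unsigned char forward[] = {137, 0, 255, 128, 0};    // W: straight ahead
    unsigned char reverse[] = {137, 255, 0, 128, 0};    // S: reverse
    unsigned char ccw[]     = {137, 0, 255, 0, 1};      // A: rotate counterclockwise
    unsigned char cw[]      = {137, 0, 255, 255, 255};  // D: rotate clockwise
    unsigned char stop[]    = {137, 0, 0, 0, 0};        // assumption: velocity 0 stops the Roomba

    unsigned char *cmd = stop;
    switch (key) {
        case 'w': cmd = forward; break;
        case 's': cmd = reverse; break;
        case 'a': cmd = ccw;     break;
        case 'd': cmd = cw;      break;
    }
    write(fd, cmd, 5);
    usleep((5 + 25) * 100);    // same pacing as the other serial writes
}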

Adding sound effects

It is possible to store melodies in the Roomba. On our specific model, 581, we can store 4 melodies with 16 notes each (that storage!). From the documentation we find the serial sequence to be

[140] [Song Number] [Song Length] [Note Number 1] [Note Duration 1] [Note Number 2] [Note Duration 2].etc.

I decided to add a reverse sound, the DSB sound and a Dixie Horn style, River Kwai March (warning: loud) sound. These were added with the following codes

  • Reverse sound
    • {140, 1, 6, 84, 32, 30, 32, 84, 32, 30, 32, 84, 32, 30, 32}
  • DSB sound
    • {140, 0, 3, 74, 40, 75, 20, 70, 32}
  • Dixie Horn sound
    • {140, 3, 12, 86,10, 83,10, 79,10, 79,10, 79,10, 81,10, 83,10, 84,10, 86,10, 86,10, 86,10, 83,10}

The songs can then be played back by this sequence [141][Song Number], like this

  • Reverse sound
    • Played when the Roomba goes in reverse
    • {141, 1}
  •  DSB
    • Using K
    • {141, 0}
  • Dixie Horn
    • Using L
    • {141, 3}
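Putting the two sequences together, a minimal sketch (again reusing the serial fd from Part 1) that stores the reverse beep as song 1 and then plays it:

// Store the reverse beep as song 1 (opcode 140), then play it (opcode 141)
unsigned char store[] = {140, 1, 6, 84, 32, 30, 32, 84, 32, 30, 32, 84, 32, 30, 32};
unsigned char play[]  = {141, 1};

write(fd, store, sizeof(store));
usleep((sizeof(store) + 25) * 100);

write(fd, play, sizeof(play));
usleep((sizeof(play) + 25) * 100);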

Sensor data

The Roomba has 58 different sensor data packets, which all return hex values. For this I want to query

  • Battery temperature
    • Packet ID 24
    • Data bytes: 1, signed.
    • Returns the temperature of the Roomba’s battery in degrees Celsius.
  • Battery charge
    • Packet ID 25
    • Data bytes: 2, unsigned
    • Returns the charge level in mAh
  • Battery capacity
    • Packet ID 26
    • Data bytes: 2, unsigned
    • Returns battery capacity in mAh
  • Charging state
    • Packet ID 34
    • Data bytes: 1, unsigned
    • Returns 1 if charging, 0 otherwise

We can query all these with a single sequence

[149][Number of Packets][Packet ID 1][Packet ID 2]...[Packet ID N]

which looks like this in our case

{149, 4, 24, 25, 26, 34}

Referencing the above data bytes gives us a total sensor response of 6 bytes. The battery temperature and charging state can be taken directly, since they only have 1 data byte each. Charge and capacity need some attention since we need to bitwise left shift the high (first) byte before we can get a decimal number. In C this is done like so

number = (data[0] << 8) | data[1];   /* high byte first, then low byte */

We then calculate the battery level by simply doing

bat_level = 100 * (charge/capacity)
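Putting the query and the parsing together, a minimal sketch (reusing the serial fd; it assumes all 6 response bytes arrive in a single read, real code should loop until it has them all):

// Query battery temperature, charge, capacity and charging state (packet IDs 24, 25, 26, 34)
unsigned char query[] = {149, 4, 24, 25, 26, 34};
unsigned char resp[6];

write(fd, query, sizeof(query));
usleep((sizeof(query) + 25) * 100);
read(fd, resp, sizeof(resp));                       // assumption: all 6 bytes in one read

int temperature = (signed char)resp[0];             // packet 24: 1 byte, signed, degrees Celsius
int charge      = (resp[1] << 8) | resp[2];         // packet 25: 2 bytes, unsigned, mAh
int capacity    = (resp[3] << 8) | resp[4];         // packet 26: 2 bytes, unsigned, mAh
int charging    = resp[5];                          // packet 34: 1 if charging, 0 otherwise

double bat_level = 100.0 * charge / capacity;       // floating point to avoid integer truncation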

I then created a button for our HTML code which outputs charging state, battery level and battery temperature when pressed.

Video streaming through Nabto tunnel

For this we selected to use a Raspberry Pi Camera Module since we already had one at the office. I then read about various ways of streaming the output and displaying it on a webserver. I tried many different approaches but eventually settled on RPi Cam Web Interface since it gave the best compromise between low latency, low bandwidth usage and “high” resolution. Furthermore, these settings are fully customisable so users can tune the values to their needs. An extra feature is the ability to run motion detection and to take 2592 x 1944 pixel still images or record video at 1080p @ 30fps, 720p @ 60fps and 640x480 @ 60/90fps.

Setting up a webserver on our Pi

To get the video webserver up and running on our Pi is fairly easy when following the instructions laid out here.

sudo apt-get update
sudo apt-get dist-upgrade
git clone https://github.com/silvanmelchior/RPi_Cam_Web_Interface.git
cd RPi_Cam_Web_Interface
chmod u+x *.sh
./install.sh

The installation prompt will have some adjustable settings such as whether to run an Apache or Nginx webserver. Choose whatever you prefer. I chose to change the default port from 80 to 90 to avoid conflict with any other service running on our Pi. After installation a webserver should now be running. We can access it on our local network by going to pi_ip:port_chosen/html

If no video is running, try issuing ./start.sh or do an update by ./update.sh.

The default settings worked very well for me but if you happen to have a very low bandwidth internet connection (or are running this on one of the earlier Pi versions) you can try to adjust the bandwidth settings.

Tunneling

To make our video feed available from anywhere, we will be running a Nabto tunnel on our Pi as well. Compiling a Nabto tunnel is very easy; simply issue these commands one line at a time on the Pi

cd unabto/apps/tunnel
cmake .
make

which we can execute by doing

./unabto_tunnel -d "id".demo.nab.to -s -k "key"

where the id and key are generated at developer.nabto.com

Finally we need to run the other end of the tunnel on our client machine. I will present how to do this using our simpleclient_app solution which works on Linux, Windows and Mac OSX. Furthermore I will walk through how to use the special Tunnel manager tool for Windows. All this can be downloaded from developer.nabto.com where we also need the common Nabto Client SDK Resources.

Command line

On our client machine, combine the Simple Client and the Client SDK Resources such that the structure looks like this: ./bin/simpleclient_app and ./share/nabto/users/guest.crt. Then issue

./simpleclient_app -q "id".demo.nab.to --tunnel client_port:localhost:device_port

which should output a few lines ending with either
tunnelStatus LOCAL
or
tunnelStatus REMOTE_P2P
depending on whether the connection to the Pi is local or remote.
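For example, with the camera webserver from before listening on port 90, ./simpleclient_app -q "id".demo.nab.to --tunnel 8090:localhost:90 forwards local port 8090 on the client machine to port 90 on the Pi; the stream can then be opened at localhost:8090/html (8090 is just an arbitrary free local port).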

Tunnel Manager for Windows

Grab a copy of Tunnel Manager and input

"id".demo.nab.to into "Server"
127.0.0.1:client_port into "Local endpoint"
127.0.0.1:device_port into "Remote endpoint"

Which can look something like this


Tunnel Manager for Windows. Change the Server name, Local endpoint and Remote endpoint to whatever you chose earlier on.

The tunnel is now up and running on the Pi and we are connected to it from the client.

Let’s have some fun!

Everything is now set up. On the device side we are running the Roomba remote (./unabto_raspberry) and ./unabto_tunnel.

On the client side we are either running no tunnel (if we do not need video streaming), the command line ./simpleclient_app, or the Tunnel Manager for Windows.

We can now finally access our uNabto Roomba webpage as we did in Part 1 where we will be greeted with this.


uNabto Roomba remote control html device driver

As stated below the image, clicking it will open a new window showing us the tunneled stream from the webserver running on the Pi.
Just below the image is a button for reading sensor data and below that I placed the input field for controlling the uNabto Roomba.

All that is left is to test it out!

If you feel like doing the same thing, please feel free to check out the code on Github and sign up for a developer account at developer.nabto.com, it is all free and you can create and manage 10 devices. This is also where you can find the SDKs and other Nabto software!

 

The Roomba (Part 1 of 2)

As some of you might have seen on Twitter, we had a device lying around at the Nabto headquarters with Nabto written all over it.


Today is the grand reveal of our Roomba hack!


I would like one hacked Roomba, please!

Like the CoffeePi this project was rather substantial so the Roomba hack will be split in two parts.

In this part we will be dealing with a common use case, namely to start a clean cycle remotely. For this we will go through

  • Hardware hacking and wiring
  • Adding the Nabto framework

Part 2 will deal with the fun stuff

  • Turn the Roomba into a remote controlled car
  • Add sound effects
  • Add sensor data
  • Add video streaming

Hardware hacking and wiring

Roombas manufactured after October 2005 have a Serial Command Interface (SCI) for which some great documentation (here and here) is provided by the company itself. Reading these revealed that everything on the Roomba is configurable through this interface, and it is indeed awesome that the company itself encourages the hacking community. So why not get some Nabto running on our Roomba?
The model we have is from the 5xx series. For our specific model, 581, the serial port was hidden underneath a plastic shroud which we tore right off. Underneath we located the 7 pin SCI, which looks like this (from the documentation):

(SCI pinout diagram from the documentation)

We will be using pins 3, 4, 5 and 6 for communication. For this, we settled on using a 5V (!) USB serial cable. If you plan on using a 3.3V serial cable, you will need a logic level converter.

Furthermore, we want the Roomba to truly be standalone so we need some way of converting the unregulated battery voltage of 14.4V to 5V. My simple, cheap go-to solution for situations like this is a USB car charger, which can be had for around $1. They are built for variable voltage inputs of everything from 12V to 24V so they will work just fine for our Roomba project.


Our 1$ USB car charger all wired up

We then fit the + wire of our USB car charger to pin 1 or 2 on the Roomba and the – wire to pin 7 on the Roomba, for ground. In total, our wiring should look like this


Full Roomba wiring. Please note that the right hand side names and colours refer to the serial cable names. The left hand side refers to the matching colours and +, – of the USB car charger.

Adding the Nabto framework

To use the USB serial cable I wrote some small helper functions, which we will use for waking, initialising and writing/reading to/from the Roomba.

First, we need to initialise our serial connection (which we only need to do once)


char *portname = "/dev/ttyUSB0";
int fd;    // serial port file descriptor, used by all the later snippets

fd = open(portname, O_RDWR | O_NOCTTY | O_SYNC);
if (fd < 0)
{
    NABTO_LOG_INFO(("error %d opening %s: %s\n", errno, portname, strerror(errno)));
    return 0;
}
else
{
    NABTO_LOG_INFO(("Opening %s: %s\n", portname, strerror(errno)));
}

set_interface_attribs(fd, B115200, 0);   // set speed to 115,200 bps, 8n1 (no parity)
set_blocking(fd, 0);                     // set no blocking

We continue by waking up the Roomba from sleep, which we do by setting the Device Detect (DD) line low for, say, 100 ms (as stated in the documentation). This is done by controlling the RTS line of our serial cable.

// Wake Roomba
setRTS(fd, 0);
usleep(100000); //0.1 sec
setRTS(fd, 1);
sleep(2);
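The setRTS helper is not shown in the post; a minimal sketch of how it could be implemented with the standard termios modem-control ioctls (an assumption, the actual helper may differ):

#include <sys/ioctl.h>

// Set (level = 1) or clear (level = 0) the RTS line of an open serial fd
void setRTS(int fd, int level)
{
    int status;
    ioctl(fd, TIOCMGET, &status);    // read the current modem control bits
    if (level)
        status |= TIOCM_RTS;
    else
        status &= ~TIOCM_RTS;
    ioctl(fd, TIOCMSET, &status);    // write them back
}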

The Roomba is then ready for commands! The code for inputting a virtual clean button press is

// Start clean cycle
char clean[] = {135};
write(fd, &clean, sizeof(clean));
usleep ((sizeof(clean)+25) * 100);

The actual command, or opcode, is 135, taken from the documentation. To stop the cleaning process, we can either issue the clean command again or put the Roomba into sleep mode. To avoid ambiguity when sending commands, I chose the latter, which has the opcode 133.
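Stopping follows exactly the same pattern; a minimal sketch sending the sleep opcode:

// Put the Roomba into sleep mode (opcode 133), which also stops a running clean cycle
char power_down[] = {133};
write(fd, &power_down, sizeof(power_down));
usleep ((sizeof(power_down)+25) * 100);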

For now, this is all we need but we will explore many more (much more fun) commands in part 2 (we will update with a link here when posted).

Since we have written the helper functions in pure C, this code can be compiled for multiple devices. I tested it on my laptop running Linux Mint 17.3 Cinnamon 64-bit and on a Raspberry Pi 2 running Raspbian Jessie. It should compile just fine on at least all versions of Raspberry Pi and possibly anything else running Linux with the required libraries.

For our standalone Roomba, we of course settled on using the Pi, which we first set up for wireless network access (check an easy howto here) followed by getting the uNabto files and compiling for the Raspberry Pi. This is done by issuing the following commands one line at a time

sudo apt-get install git
sudo apt-get install cmake
git clone https://github.com/MarcusTherkildsen/unabto
cd unabto/apps/raspberry_pi_roomba
cmake .
make

We now have uNabto compiled on our Pi!
All that is left to do is to create a new device at developer.nabto.com. Simply Add Device and copy the newly created Key 

We now return to the Raspberry Pi and issue the following command for initiating the Nabto software

sudo ./unabto_raspberrypi -d "id".demo.nab.to -s -k "key"

You should see a couple of lines of output ending with

13:40:47:548 unabto_attach.c(575) State change from WAIT_GSP to ATTACHED

Which means uNabto is successfully up and running!
That’s all! We can now remotely start a cleaning cycle and stop it again by accessing id.demo.nab.to in a browser. You will be met with a login page; simply click Guest, and you will see a page like this


uNabto Roomba html device driver

Sliding the switch to either side will now trigger the clean cycle on our Roomba, check it out!

If you suddenly got the urge to create your own IoT device using Nabto feel free to check out nabto.com and sign up for a developer account at developer.nabto.com; it is all free and you can create and manage 10 devices. This is also where you can find the SDKs and other Nabto software!

The full code for this simple Roomba uNabto hack can be found on GitHub.

Stay tuned for part 2 where we’ll have some fun!

Snappy Ubuntu Core + uNabto

Today we will have a look at Snappy Ubuntu Core, why it is nice in principle, and why it still has some way to go before it is really useful in practice.

Snappy Ubuntu Core, the basics


Their logo represents the blocks (snaps, kernels, daemons, etc.) that make up a full system.

Snappy Ubuntu Core is “a new, transactionally-updated Ubuntu for IoT devices, clouds and more”. In other words, it’s a new rendition of Ubuntu with some new features as well as a lot of features removed, which we will get back to. It should be noted that the system is also referred to as simply Ubuntu Core (not even the name is set in stone yet) so you will see both ways of writing it in this post.
The main selling point of this free product is that each app, hereafter referred to as a snap, is an independent block, ideally with no external dependencies. This has the huge advantage that a snap can be installed or removed without breaking anything else on the machine. The system can thus be seen as being built out of Lego bricks, individually separated but put together into a full system.

Installing Ubuntu Core

Getting a copy of Ubuntu Core is fairly straightforward if you just follow the instructions laid out here. As you will quickly notice, Ubuntu Core is available for quite a number of devices although only the Raspberry Pi 2 is supported out of the Pi family at the moment. For IoT projects it does not make a whole lot of sense to run a virtual Ubuntu Core image on a desktop system, but for developing I used it extensively.

So either run a virtual machine using kvm or get your Pi 2 up and running if you feel like following along!

Installing a snap

Once we have Ubuntu Core up and running we can either SSH into it by issuing ssh -p 8022 ubuntu@localhost (virtual kvm) or ssh ubuntu@webdm.local (Pi2) both with the standard password: ubuntu.
Once in, let’s start by getting an overview of the stuff already installed on the machine. We do this by issuing

$ snappy list

The listed items can be either kernels, daemon services or executable binaries, basically everything that makes up our environment.

You should see something like

Name         Date        Version  Developer
ubuntu-core  2016-01-28  7        ubuntu
webdm        2016-01-28  0.11     canonical
pi2          2016-01-28  0.16     canonical

First off, version 7 of ubuntu-core is quite old, so we run

$ sudo snappy update

to update the system. If webdm is not installed already, then we also install it by issuing

$ sudo snappy install webdm

This service basically gives us a graphical user interface to the Snappy Store which, if we are on the Pi2, we can access by opening a browser (on a desktop machine) and going to webdm.local:4200.


The Snappy store

Here we basically see the graphical presentation of what we just saw in the command line. We can also see the available snaps for this architecture.
Going back to the command line we can also get a list of all the available snaps for the current architecture by issuing

$ snappy search '*'

At the time of writing this (10th of May 2016) there are exactly 100 snaps available for the Pi2 and 92 for amd64 so the framework is still in its infancy. To search for a specific app we try searching for the unabto snap

$ snappy search unabto

if successful we can install it as before, by issuing

$ sudo snappy install unabto

After it installs, we can try issuing

$ unabto.unabto

which will work perfectly on a 64 bit machine but on the Pi2 we will see this error

$ wiringPiSetup: Must be root. (Did you forget sudo?)

trying the same command with sudo will result in

$ wiringPiSetup: Unable to open /dev/mem: Operation not permitted

To find out why, we need to have a look at how snaps are created.

Creating a snap

As mentioned, the number of available snaps to install is still fairly limited and so is the documentation for creating snaps. Through the iterations of the build tool and the Ubuntu Core itself, the specific commands and instructions have changed and there is thus no consensus on how to actually do essential things. Many of the examples given cannot compile with my version of their build tool, called snapcraft.
But still, we march on and we will begin with the way Ubuntu Core wants you to use it.
On my amd64 machine I installed snapcraft by following the instructions here

$ sudo add-apt-repository ppa:snappy-dev/tools
$ sudo apt-get update
$ sudo apt-get install snapcraft

To get a feel of how snapcraft “crafts” programs into snaps we issue

$ snapcraft help plugins

to see all the steps that snapcraft does behind the scenes. For future reference the version I used was 1.1.0 on Linux Mint 17.3 Cinnamon 64-bit.
The only thing we really need to get something going is a snapcraft.yaml file as well as an icon. snapcraft.yaml contains instructions for snapcraft so that it knows what to do. In our simple example we write (you can find this code on GitHub as well)

# Notice! Name cannot contain underscore
name: unabto-test
version: 0.1
summary: some summary
description: some description
vendor: mt@nabto.com
icon: icon.png

# Needed packages. If not found on system, snapcraft will install them
build-packages: [openssl, cmake, git]

# To be able to call the binaries in our snap directly from the command line we define binaries
binaries:
  # note, the name to write for
  # executing the binary below it. These two names cannot
  # be the same !
  unabto:
    exec: bin/unabto_raspberrypi
    # We have to allow the bin/binary to open
    # sockets and the like. For this, we set permissions using the plugs term
    plugs: [network-bind]

# The parts the snap will consist of
parts:
  unabto:
    plugin: cmake
    source: git://github.com/nabto/unabto.git

Note that the above example is NOT exactly how most of the examples found in the Ubuntu Core GitHub are written at the moment (most of them cannot compile with my version of snapcraft). I suspect this is due to differences between releases and will probably be resolved in time.
Having the snapcraft.yaml file and icon.png in the same folder, now comes the magic!
Simply issue

$ snapcraft

which will go through the stages laid out in $ snapcraft help plugins. You will notice that the directory now contains
unabto, parts, stage and snap as well as what we are actually interested in, unabto_0.1_amd64.snap, which is our snap! So yeah, that was really all we needed to create a standalone snap, wow!

Now we would like to check out how it works. You will no doubt have noticed the _amd64 in the filename, meaning that this snap is for an Ubuntu Core 64 bit machine. The reason for this is that snapcraft does not support cross compiling (I could not get it to work, at least) and we will later see how to circumvent this.
For now, we transfer and install our newly created snap on our virtual machine by issuing

$ snappy-remote --url=ssh://localhost:8022 install unabto_0.1_amd64.snap

This should run without any problems. We can check if the snap was successfully installed by sshing into the virtual machine and issuing $ snappy list as before. unabto should now appear on the list and we can run the binary we specified in the snapcraft.yaml file by issuing

$ unabto-test.unabto -d id -s -k key

where id and key are created at developer.nabto.com.
If everything went as it should you will see a few lines of output ending with

$ 00:04:32:160 unabto_attach.c(575) State change from WAIT_GSP to ATTACHED

which is a sign of success!
We can now access the id in a browser and thus remotely control whatever unabto is running on. Great success!
To see some real life uses please check out The CoffeePi (Part 1 of 2), The CoffeePi (Part 2 of 2) and The SunPi control center.

Creating a multi architecture snap and adding some actual value to it

Now, I already said snapcraft can’t do cross compiling so we need to do something else. The easiest way is to create a binary the old-fashioned way for each architecture. For this we simply need a compiler for that architecture, as the name suggests. I used this to compile for the Raspberry Pi. Simply download it and go to the unabto/apps/raspberry_pi folder. If you don’t already have a copy of the uNabto SDK, you can grab one here. In this folder we issue

$ export CC=path/to/raspi_gcc

before issuing the usual

$ cmake .

and

$ make

We can check if the resulting binary really was compiled for an ARM board by issuing

$ file raspberry_binary

which should return something like

$ unabto: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.26, BuildID[sha1]=cbb7dfe8560ba3e63dfffa621cad56f301fff1e7, not stripped

Make sure your main snap folder contains a bin, lib and meta folder, as can be seen in the uNabto Ubuntu Core GitHub repository. In bin we create a folder for each architecture as well as a small wrapper executable that picks the right binary depending on the device. You can check that out on GitHub as well. In meta we need an icon, a readme.md and a file almost like before, but this time called package.yaml

name: unabto
vendor: Marcus Therkildsen
architecture: [amd64, armhf]
icon: meta/icon.png
version: 0.1

binaries:
- name: bin/unabto

This time we notice that we defined “architecture: [amd64, armhf]” which should list all the archs we plan on incorporating in our multisnap. Now we can simply issue

$ snappy build

which will create unabto_0.1_multi.snap. This snap can now be installed on all the architectures we compiled each package for. This is done in the same way as before using the snappy-remote command. Again we can run it by issuing

$ unabto.unabto -d id -s -k key

and voila, we now have a working multi arch snap.
Still, the snap doesn’t really do anything other than controlling a dummy switch in the unabto code.

Adding functionality

So what if we wanted to use the gpio pins on the RPi2?
Well, all functionality that a snap should have should be contained in the snap itself. That is, after all, the main selling point of Ubuntu Snappy Core. If you had the audacity to try to do anything on Ubuntu Snappy Core anyway you would quickly realise that

curl
wget
apt-get
make
cmake

and many other commands we are used to from Linux are missing. Ubuntu Snappy Core is truly not made for creating anything ON; it is made for executing things built for it on a different, ordinary Linux distro. This gets really frustrating fast until you accept that this is the premise of Ubuntu Snappy Core.

The library I usually use for controlling GPIO pins on any Raspberry Pi is called wiringPi, so we are going to compile that for our Pi2 and add it as a library. I ended up busting out my old Pi1 to compile the wiringPi library, then transferred it to my 64 bit machine and put it into lib/arm-linux-gnueabihf, as can be seen here. For future reference, these binaries are wiringPi 2.29.
Again, I compiled using snappy build and got a nice multi arch snap which we install as before.

Back on the Pi2 we issue

$ unabto.unabto

and are met with

$ wiringPiSetup: Must be root. (Did you forget sudo?)

and with sudo

$ wiringPiSetup: Unable to open /dev/mem: Permission denied

which leaves us exactly back at where we started.
So I tried many things to get this working. Since the newest version of wiringPi has a workaround for running without sudo (which I actually got working on my Pi1 running Raspbian) I tried to do the same thing here.
This involved remounting the read-only filesystem to make it writable, move around some libraries, add some paths and so forth. This totally goes against the Ubuntu Snappy Core principles and ultimately did not work anyway.
Now, if we run snappy -h on our Pi2 we see there is a way of assigning a hardware device to a package. We try that by issuing

$ sudo snappy hw-assign unabto.nabto /dev/mem

but it still did not work.
Thus, we saw that we cannot add hardware support the way Ubuntu Snappy Core states at their website and we really should not be messing around with remounting the filesystem.

This leaves us with the compromise of creating a custom wrapper, setting the library paths and calling the correct executable. This is already in the package (run_unabto), so what we ultimately have to do to finally get something working is this:
Just after sshing into your Pi2 Ubuntu Snappy Core, issue this

$ echo "alias run_unabto=/apps/unabto.nabto/current/bin/arm-linux-gnueabihf/run_unabto" >> .bashrc

and then reboot. After this, we are now able to write

$ run_unabto -d id -s -k key

where id and key are found at developer.nabto.com, and unabto should finally be running, wopeeh! But what is perhaps even more interesting is the fact that if we hook up an LED to wiringPi pin number 0 we now actually control something instead of just the dummy switch we saw earlier. Plus, we are as close as can be to the philosophy of Ubuntu Snappy Core: snaps should be self-contained.
If we decide to remove unabto by issuing

$ sudo snappy remove unabto

and

$ sudo snappy purge unabto

the only remnant of unabto will be the last line in our .bashrc file and I can live with that.

Whuu, that was a bit of a mouthful!

As seen, it is definitely possible to create snaps for Ubuntu Snappy Core. With time, glitches like examples that won’t compile and outdated tutorials will hopefully be fixed.

If nothing else, it is now possible to run unabto on Ubuntu Snappy Core and actually do something useful with it !

To finish things off, you can publish your snap to the Ubuntu store very easily by following the instructions here. If you would like to check out the code we used for creating the uNabto snap, please check out the relevant GitHub repository.
If you would like to see how to add some more functionality to your own uNabto snap, please check out The CoffeePi (Part 1 of 2), The CoffeePi (Part 2 of 2) and The SunPi control center.

The CoffeePi (Part 2 of 2)

Breaking news!

The CoffeePi will be appearing at Smart IoT London 2016!

Feel free to drop by our booth and get a hot beverage from the CoffeePi served using Nabto!

Back to the story

Please read Part 1 to read about the preliminary steps in the coffee saga.

This part will be dealing with the following points

  • Figuring out how to wire up and emulate the rotary knob
  • Adding the Nabto framework
  • Making the full menu available
  • Order a cup of coffee in London, trigger instant brewing in Denmark

The rotary knob

The rotary knob has three pins as seen in the image below


The backside of the rotary knob. We see the three pins, the center being common high and the two outer pins which go high or low according to the rotation of the knob

The center pin is common high and the outer pins output the necessary encoder pulse needed for the main board. To get an idea of how these pulses look we checked it out using a logic tester


The logic tester in use

which gave an output like this


The output from the logic tester. The upper section is the output for moving to the left and the bottom section is the output for going right.

The upper section in the image above shows the output from the rotary knob when turning the knob counterclockwise, which is equivalent to going left on the main board display. Likewise the bottom section shows the clockwise rotation, which results in a right movement on the main board display.

From this we see that the direction the knob is turned can be directly translated into which direction we are moving on the main board display, simply by looking at which pin changes state first. In other words, to move left/right we need to emulate an output like the top/bottom one seen in the image above.
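As a side note, “which pin changes state first” is the classic quadrature decoding rule; a small sketch of how such a decision could be expressed in code (not part of the CoffeePi code, which only ever generates pulses, and the sign convention depends on the wiring):

// Decode one transition of a quadrature signal.
// prev_a/prev_b are the previous pin levels, a/b the current ones.
// Returns +1 for one rotation direction, -1 for the other, 0 if nothing changed.
int decode_step(int prev_a, int prev_b, int a, int b)
{
    if (a != prev_a && b == prev_b)      // pin A changed first
        return (a == b) ? -1 : +1;
    if (b != prev_b && a == prev_a)      // pin B changed first
        return (a == b) ? +1 : -1;
    return 0;
}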

Before doing that, we need a common starting point, meaning that we need to know if the pins are high or low to begin with. The simplest way to control this was to cut the pins like this


The two switches represent two optocouplers. This means that we can disable the manual rotary knob controls by switching a pin on our Raspberry Pi. Doing that results in the receiving end of the pins being pulled low. We thus have our starting point.

In total, our wiring should now look something like this


Wiring diagram for the CoffeePi. In total 10 optocouplers were needed, 6 for the buttons and 4 for the rotary knob. Common high, H, from the CoffeePi and common ground from the Raspberry Pi.

As can be seen from the wiring diagram above, the 6 OCs on the left side control the 6 buttons and the 4 on the right hand side control the rotary knob. We input a digital high/low value at the bottom, which produces a certain output at the top, towards the coffee machine.
The resistors are 270Ω, common ground is from the Raspberry Pi and H is common high from the CoffeePi (the blue markings in this image from The CoffeePi (Part 1 of 2)).

When the dis. pin is set high the rotary knob works in its normal manual way. When we set it low, the black inputs (generated from manual rotation of the knob) are disabled and we can instead send our own pulses using Rot.1 & 2.

For reference, the wiring looks like this in real life


Wiring for the CoffeePi in real life. Notice the two extra optocouplers in the middle which are not hooked up for any output.

As always, the real projects end up being messier than on paper but both images have the same layout: 6 OCs for the buttons on the left and 4 OCs for the rotary knob on the right.

Adding the Nabto framework

Since our platform is the Raspberry Pi (Linux) I suggest checking out Raspberry Pi 3 IoT, perfect for Nabto for instructions on how to get the uNabto software up and running in no time. After that please check out The SunPi control center to get an idea of how we get a browser to communicate with our uNabto software.
For now we only need to worry about unabto_application.c in the src folder.

Making the full menu available

In The CoffeePi (Part 1 of 2) we were only able to programmatically push a button after we manually turned the rotary knob to the correct position. Since we now have an understanding of how the rotary knob creates pulses, we can make the full menu available. This is done in the following few steps (code snippets taken directly from unabto_application.c).

Disable manual control

// Shut off manual control
digitalWrite(11, LOW);
delay(wait_msec);

As well as disabling manual control from the rotary knob, the two pins are pulled low such that we always have the same initial setting.

Get a starting point

Since we have 9 available items on our menu we begin by going 8 steps to the left

// Go all the way to the left
for (i = 0; i < 5; i++){

    digitalWrite(13, HIGH);
    delay(wait_msec);
    digitalWrite(14, HIGH);
    delay(wait_msec);
    digitalWrite(13, LOW);
    delay(wait_msec);
    digitalWrite(14, LOW);

    delay(wait_msec);
}

This is done to ensure that we always have the same starting point when getting to the next step.

Selecting desired product

All our pins are now low and it is finally time to go to the desired product. Each item has a certain number associated with it (from the radio buttons) which directly corresponds to how many steps we need to move to the right.


The CoffeePi is up and running!

The most popular choice is, of course, Coffee and thus it is pre-checked when accessing the CoffeePi online. This button has a value, or id, of 3, so when we click Brew the number/id is sent using a Nabto request.

// Go id amount of steps to the right to the desired item
for (i = 0; i < id; i++){
    // If odd
    if (i % 2 != 0){
        digitalWrite(14, LOW);
        delay(wait_msec);
        digitalWrite(13, LOW);
    }
    else{
        digitalWrite(14, HIGH);
        delay(wait_msec);
        digitalWrite(13, HIGH);
    }
    delay(wait_msec);
}

After that bit we should now be at the desired item, and we can now emulate a button press

// Push button
digitalWrite(10, LOW);
digitalWrite(10, HIGH);
delay(100); // 0.1 sec
digitalWrite(10, LOW);

The item should now be brewing!

Clean up

While our lovely hot beverage is being brewed we send a few extra commands. First, we ensure that the two pins controlling the pulses are being set low again and finally, we turn on manual control of the rotary knob again.

// Set the steppers to low
digitalWrite(13, LOW);
digitalWrite(14, LOW);

// Turn on manual control
digitalWrite(11, HIGH);

Remarks

Since we are using a synchronous event handler, the Nabto framework expects to get a response within just 10 msec, so we need to run all this pin switching in a separate thread. The thread is created detached, such that it terminates and cleans up by itself when all code inside it has been executed. Please see the appropriate code. In a later post, we will show how to use an asynchronous approach instead and save this separate thread.
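A minimal sketch of that detached-thread pattern (function and variable names here are hypothetical, the real version lives in unabto_application.c):

#include <pthread.h>
#include <stdlib.h>

// Run the slow pin switching outside the Nabto event handler so the
// handler can return its response in time.
static void *brew_worker(void *arg)
{
    int id = *(int *)arg;
    free(arg);
    select_and_brew(id);     // hypothetical name for the digitalWrite/delay sequence above
    return NULL;
}

static void start_brew_thread(int id)
{
    pthread_t t;
    int *arg = malloc(sizeof *arg);
    *arg = id;
    pthread_create(&t, NULL, brew_worker, arg);
    pthread_detach(t);       // detached: the thread cleans up after itself when done
}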

From London with love

What better way to demonstrate the true IoT device this coffee machine has now become than by ordering a cup of coffee from London and then seeing it being brewed in Denmark a few seconds later?

Notice how my colleague in London is laughing through the process!

If you suddenly got the urge to create your own IoT device using Nabto feel free to check out nabto.com and sign up for a developer account at portal.nabto.com; it is all free and you can create and manage 10 devices. This is also where you can find the SDKs and other Nabto software!

The full code for the CoffeePi can be found right here.

Raspberry Pi 3 IoT, perfect for Nabto

A few days ago (29/2-2016) the new Raspberry Pi 3 was announced. Of course we were all excited here at the Nabto headquarters and quickly bought a few.

Since they are reviewed as the perfect platform for IoT we wanted to check out if it is perfect for Nabto as well. Spoiler alert: it is!

Setting up the Pi 3

The fastest way to get wifi and Nabto up and running on your Pi 3 is to burn an image to your SD card (for this post we are using Raspbian Jessie Lite). After doing that, hook up your Pi3 by wire to your local network. You can now access the Pi, either directly by HDMI and a keyboard or over SSH.

One of the main features of the new Pi3 is the onboard wifi module so the first thing to do is to search for available networks and make the Pi autoconnect to the one we want. This is most easily done by issuing the following commands one line at a time

wpa_cli
scan
scan_results
add_network
set_network 0 ssid "ssid_name"
set_network 0 psk "password_stuff"
enable_network 0
save_config
quit

Where you should, of course, replace ssid_name and password_stuff with the SSID and password of the network you are trying to connect to.
After that you can reboot the Pi and remove the wire and you have a Raspberry Pi 3 ready for IoT!

Setting up Nabto

Setting up Nabto is as easy as setting up the wifi. First off we need to get the necessary tools for getting the uNabto files and compiling for the Raspberry Pi. This is done by issuing the following commands one line at a time

sudo apt-get install git
sudo apt-get install cmake
git clone https://github.com/nabto/unabto.git
cd unabto/apps/raspberry_pi
cmake .
make

We now have Nabto compiled on our Pi!
All that is left to do is to create a new device at portal.nabto.com. Simply Add Device and copy the newly created Key 

We now return to the Raspberry Pi and issue the following command for initiating the Nabto software

sudo ./unabto_raspberrypi -d "id".demo.nab.to -s -k "key"

You should see a couple of lines of output ending with

13:40:47:548 unabto_attach.c(575) State change from WAIT_GSP to ATTACHED

Which means Nabto is successfully up and running!

Trying out Nabto

Now that everything is up and running the final thing to do is to access id.demo.nab.to in your browser. You will be met with a login page; simply click Guest, and you will see a page like this


The uNabto demo is up and running!

Sliding the switch to either side will now trigger the onboard activity light on your Raspberry Pi, check it out!

For a more in depth introduction on how to write your own functionality into the Nabto framework, please refer to this blog post The SunPi control center.

The CoffeePi (Part 1 of 2)

When I asked if I could hack the 1300$ (8800 dkk) coffee machine at the Nabto headquarters the responses varied from “no way” to “I’ll give it 30 % chance of success”. Eventually they surrendered to my request, so let’s see if it all worked out!

Since this project was rather substantial, the coffee machine saga will be split in two (possibly three later on) parts.

In this part we will be dealing with

  • Disassembling the coffee machine and finding points of interest
  • Figure out what is needed to control the buttons
  • Brew an actual cup of coffee using SSH.

Part 2 will deal with

  • Figure out how to wire up and emulate the rotary knob
  • Adding the Nabto framework
  • Making the full menu available
  • Order a cup of coffee in London, get it seconds later in Denmark

Disassembling the coffee machine


The starting point, a shiny functioning Siemens EQ 7 Plus coffee machine

The starting point was the Siemens EQ 7 (TE712201RW) coffee machine, as seen in its standard form in the image above.

First off I had to decide whether to try to access the three pin service port or to rip the machine apart and do some proper hacking. Since I found no information about this service port online I decided on the latter (plus it sounded like a lot more fun).

Machines like these are built with as few screws as possible and thus all side panels are held in place by small plastic pins. Furthermore the machine is made such that to get to the main board you have to remove the back panel, then the side panels and finally the front panel. In total, dismantling the coffee machine was not easy but I did it MANY times in the process because of broken soldering or crushed wires.


E-waste or successful hack?

With the front finally off, it was time to have a look at the main board.


The main board enclosure

We see 6 buttons and a rotary knob in the middle. Now we just need to check out what is underneath.

Controlling the buttons


The main board of the coffee machine.

Using a multimeter I began by figuring out any common connections indicated by the blue lines in the above image.

When one of the buttons is pushed a connection is made between the blue and yellow side of the button. We want to be able to create artificial button pushes and thus we need to solder on wires at every yellow line in the above picture and a common blue wire.


Strapped up for the party

We can now imitate a button push by simply shorting common to any of the buttons.
To control this programmatically I went with optocouplers, specifically the Fairchild 817, chosen because it has a high minimum CTR, meaning that we can drive them directly from the 3.3V digital pins of a Raspberry Pi, as seen below.


Using an optocoupler directly from the 3.3V pins from the Raspberry Pi.

I chose optocouplers since they are basically “light relays” as my colleague said.


Cross-section through an optocoupler.[wiki] As can be seen the red LED is electronically isolated from the green sensor. Thus the two sides can have very different voltage levels without damaging anything.

This turned out to be a good choice since the logic of the coffee machine runs at 5V and the Raspberry Pi's digital pins run at 3.3V, as mentioned earlier.

All that is left is then to hook up the main board again and follow the on screen instructions:


Please close the door!

After “closing the door” we need to mark the wires such that we know what they do.


Almost assembled and soon labeled.

I simply shorted each wire with the common wire (see above) to figure out the function of each wire.


Wires from coffee machine ready to rock

As seen above the wires are now coming out of the coffee machine in a nice little bundle. We now simply need to wire up all the optocouplers to the Raspberry Pi and, when that is done, connect everything together.


Heating the soldering iron.

Since I did not have any connectors lying around I had to create some using a bit of soldering.


The optocouplers all wired up.

To avoid damaging the optocouplers (and RPi) I inserted a 270Ω resistor in series with each. The “red” wire above is common ground and the numbered “green” wires are the wires we can digitally set high/low. When we set an optocoupler high it means that power is allowed to flow through on the right hand side in the image above, i.e. a button on the coffee machine is pushed.
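As a sanity check on that resistor value: assuming a forward voltage of roughly 1.2 V across the optocoupler's internal LED (a typical figure for this kind of part), the 270Ω resistor limits the current to about (3.3 V - 1.2 V) / 270 Ω ≈ 8 mA, which is comfortably within what a Raspberry Pi GPIO pin can source.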

Have a cuppa

We can now finally wire everything together such that we can click any of the six buttons by issuing a simple command over SSH:

gpio write 10 1
gpio write 10 0

This basically imitates pushing the button in (first line) and then releasing it again (second line).

So far we still have to manually scroll to the desired item in the menu, but when we are there we can programmatically push the button, as seen below:


The first cup of coffee ordered remotely (i.e. 5 meters away at my computer)

Great success! 

Issues

As mentioned, we still have to manually scroll to the desired menu item before we can programmatically push a button.

The next part will be dealing with this problem as well as a few others:

  • Figure out how to wire up and emulate the rotary knob
  • Adding the Nabto framework
  • Making the full menu available
  • Order a cup of coffee in London, get it seconds later in Denmark

 

The SunPi control center

Ever since I first built my solar powered Raspberry Pi system, called the SunPi, I have wanted to be able to remote control it.

More specifically, in the summertime, when the sun shines heavily the battery would be filled within an hour after sunrise. The rest of the day, the charge controller would simply waste/curtail excess energy by dumping it as heat.
Instead of doing that I wanted to use this energy for something. Specifically I wanted to be able to remote control a relay such that I could activate whatever device I had lying around.


Overview of the SunPi system

To determine whether or not to activate the relay I had to be able to read the voltage level, which indicates how full the battery is. I did this using an INA219 current sensor and some Python, and plotted it here. But I really wanted to have everything on a single page such that I could quickly determine whether or not to activate the relay. I wanted to be able to access this control center from anywhere and in a secure way. With Nabto I found the solution.

How Nabto works


Overview of the Nabto framework

As seen in the above figure the client side (e.g. the device you are reading this on) has HTML, CSS and JavaScript running when you visit the SunPi control center. Altering a slider in the HTML calls a JavaScript function which in turn sends a Nabto request to the device side of things.
In this case the device is a Raspberry Pi running some compiled C code. For our power and voltage readings (more below) a Python script is called from within the C code.
That is basically all there is to the framework.
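As an illustration of that last point, the C code can shell out to a Python reader script and parse its output; a minimal sketch (the script path and its output format are hypothetical):

#include <stdio.h>

// Run a small Python script that prints the INA219 voltage reading and parse the result
double read_voltage(void)
{
    double volts = 0.0;
    FILE *p = popen("python /home/pi/read_ina219.py", "r");   // hypothetical script path
    if (p != NULL) {
        fscanf(p, "%lf", &volts);     // assumes the script prints a single number
        pclose(p);
    }
    return volts;
}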

To get a better feel of how to add functionality I will walk through

  • Controlling a GPIO pin
  • Retrieving data

Note that these examples build on the default html_dd (Client side) and this RPi demo (Device side).

Controlling a GPIO pin

HTML

We start by creating a slider with a unique id (rgb-r in this case) containing the OFF and ON options.

<li>
  <h2>Red</h2>
  <div class="ui-li-aside">
    <select id="rgb-r" data-role="slider">
      <option value="off">OFF</option>
      <option value="on">ON</option>
    </select> 
  </div>
</li>

JavaScript

When the slider with the rgb-r id is toggled, the accompanying JavaScript handler is activated.

$("#rgb-r").change(function(){
  var state = $(this).val() === "off"?0:1;
  jNabto.request("light_write.json?light_id=1&light_on="+state, setLight_r); 
}); 

Here the setLight_r callback updates the actual slider to reflect the new state.
What is perhaps more interesting is the jNabto.request part. The jNabto requests are defined in the unabto_queries.xml file, and the light_write.json query used above looks like this:

<query name="light_write.json" description="Turn light on and off" id="1">
  <request>
    <parameter name="light_id" type="uint8"/>
    <parameter name="light_on" type="uint8"/>
  </request>
  <response format="json">
    <parameter name="light_state" type="uint8"/>
  </response>
</query>

As seen, the request sends two parameters, light_id and light_on, telling the device which pin/light we are dealing with and which state we want it in. In return, we expect a JSON-formatted response, specifically the light_state the pin has been set to. Note that the query id is 1.

C code

On the device side we arrive at case 1 (in unabto_application.c), since light_write.json has id = 1.
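
That case sits inside the uNabto application event handler in unabto_application.c, which dispatches on the query id defined in unabto_queries.xml. Roughly sketched (the handler's exact signature depends on the uNabto version, so it is omitted here, and handle_light_write/handle_ina_data are hypothetical helpers standing in for the case bodies shown in this post):

// Sketch of the dispatch on the query id inside the application event handler.
// handle_light_write() and handle_ina_data() are hypothetical helpers.
switch (request->queryId) {
case 1:
    return handle_light_write(read_buffer, write_buffer);   // light_write.json
case 3:
    return handle_ina_data(write_buffer);                    // ina_data.json
default:
    return AER_REQ_INV_QUERY_ID;                             // unknown query id
}

The case 1 body itself looks like this: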

case 1: {
    uint8_t light_id;
    uint8_t light_on;
    uint8_t light_state;

    // Read parameters in request
    if (!buffer_read_uint8(read_buffer, &light_id)) {
        NABTO_LOG_ERROR(("Can't read light_id, the buffer is too small"));
        return AER_REQ_TOO_SMALL;
    }
    if (!buffer_read_uint8(read_buffer, &light_on)) {
        NABTO_LOG_ERROR(("Can't read light_state, the buffer is too small"));
        return AER_REQ_TOO_SMALL;
    }

    // Set light according to request
    light_state = setLight(light_id, light_on);

    // Write back pin state
    if (!buffer_write_uint8(write_buffer, light_state)) {
        return AER_REQ_RSP_TOO_LARGE;
    }
    return AER_REQ_RESPONSE_READY;
}

Most of this is error handling; we mainly need to focus on the setLight function and on how the result is sent back to the client afterwards.
The setLight function looks like this:

// Set GPIO pin and return its state
uint8_t setLight(uint8_t id, uint8_t onOff) {
    theLight[id] = onOff;

    NABTO_LOG_INFO(("Nabto: %s turned %s!\n", pin_name[id], on_off[theLight[id]]));

#ifdef __arm__
    /* Toggle GPIO pins on Raspberry Pi	*/
    //Change pin output according to id and theLight state

    if (theLight[id]){
        // Activate pin (active-low wiring: LOW switches it on)
        digitalWrite(pin_id[id], LOW);
    }
    else if (theLight[id] == 0){
        // Deactivate pin
        digitalWrite(pin_id[id], HIGH);
    }
#endif

    return theLight[id];
}

Here we toggle the chosen pin on or off depending on the light_on value passed from the client.

Once the pin has been set, we write the pin state back to the client using the buffer_write_uint8 function. And that is basically all you need to control a GPIO pin on the Raspberry Pi using the Nabto framework.
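
One detail the snippet glosses over is that setLight assumes the GPIO pins have already been configured as outputs. A minimal sketch of such a setup with wiringPi (the pin numbers, their count and the initLights name are placeholders, not taken from the demo code):

// Minimal sketch of the wiringPi setup assumed by setLight().
// The pin_id values, the pin count and initLights() are placeholders.
#include <wiringPi.h>

static const int pin_id[] = { 0, 1, 2 };   // example wiringPi pin numbers

void initLights(void) {
    wiringPiSetup();                       // wiringPi pin numbering
    for (int i = 0; i < 3; i++) {
        pinMode(pin_id[i], OUTPUT);        // configure as output
        digitalWrite(pin_id[i], HIGH);     // start HIGH = off, since setLight() drives LOW to activate
    }
}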

Retrieving data

HTML

The HTML code for retrieving data is basically just a button that signals the JavaScript to perform a certain task; in this case we are dealing with id="ina_update" (remember, we are using an INA219 current sensor).

<li>
  <input type="button" id="ina_update" data-icon="refresh" value="Read Voltage and Power"/>
</li>

JavaScript

The accompanying JavaScript function looks like this:

$("#ina_update").click(function() {
  queryINA219($(this));
});

where

function queryINA219(input) {
  jNabto.request("ina_data.json?", function(err, data) {
    if (!err) {
      var temp_power = data.power_w/10000;

      // Check if negative (the device added 200 to negative readings)
      if (temp_power > 100) {
        temp_power = temp_power - 200;
      }

      input.val("Voltage: " + (data.voltage_v/10000).toFixed(2) + "V, Power: " + temp_power.toFixed(2) + "W").button("refresh");
    }
  });
}

contains the ina_data.json jNabto request. Again we check the unabto_queries.xml file and confirm that

<query name="ina_data.json" description="Read voltage and power status" id="3">
  <request>
  </request>
  <response format="json">
    <parameter name="voltage_v" type="uint32"/>
    <parameter name="power_w" type="uint32"/>
  </response>
</query>

the ina_data.json query has id = 3.

C code

The request is sent to the device side where we end up in case 3 because the ina_data.json request has id = 3.

case 3: {
    float voltage;
    float power;

    // Get voltage and power readings from the INA219
    getINA219_data(&voltage, &power);

    NABTO_LOG_INFO(("Nabto: Read voltage=%f and power=%f\n", voltage, power));

    // Write back data
    if (!buffer_write_uint32(write_buffer, voltage*10000)) 
    {
        return AER_REQ_RSP_TOO_LARGE;
    }

    // Prepare for negative numbers. This will be converted back in the html_dd app.js file
    if (power<0)
    {
        power = power + 200;
    }
            
    // Write back data
    if (!buffer_write_uint32(write_buffer, power*10000))
    {
        return AER_REQ_RSP_TOO_LARGE;
    }
    return AER_REQ_RESPONSE_READY;
}

As seen above, the power reading needs special treatment since we write it as a uint32 (unsigned, i.e. always positive). When the battery charges, the power is defined as positive; when it discharges, it is of course negative. Since my solar panel has a maximum rated power of 100 W (which it never reaches) and my maximum consumption is about 10 W at any given time, I simply add 200 to negative power readings before sending them. The addition is undone in the accompanying JavaScript, as seen above (// Check if negative).
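
To make the offset trick concrete, here is a small standalone round trip (the -3.5 W value is just an example, not a measurement from the SunPi):

// Standalone illustration of the +200 offset trick for negative power readings.
// The -3.5 W value is an arbitrary example.
#include <stdint.h>
#include <stdio.h>

int main(void) {
    float power = -3.5f;                           // discharging, so negative
    if (power < 0) power += 200;                   // device side: shift into positive range
    uint32_t on_wire = (uint32_t)(power * 10000);  // sent as uint32 (1965000)

    float decoded = on_wire / 10000.0f;            // client side (app.js)
    if (decoded > 100) decoded -= 200;             // undo the offset: back to -3.5 W

    printf("on wire: %u, decoded: %.2f W\n", on_wire, decoded);
    return 0;
}
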
Finally we take a look at the getINA219_data function,

// Get INA219 data
void getINA219_data(float* voltage, float* power){

    FILE *fp;
    char path[1035];
    /* Open the command for reading. */
    fp = popen("sudo python path/to/ina219_python_c.py", "r");

    if (fp == NULL) {
        NABTO_LOG_INFO(("Failed to run Python script\n"));
        exit(1);
    }
    /* Read the output a line at a time - output it. */
    int k = 0;
    while (fgets(path, sizeof(path)-1, fp) != NULL) {
        k = k + 1;
        if (k==1){
            *voltage = atof(path);
        }
        else if(k==2){
            *power = atof(path);
        }
    }
    /* close */
    pclose(fp);
}

The function calls a Python script which prints out two lines of data: the voltage and the power flowing into or out of the SunPi battery, respectively.
The data is then sent back to the client side using the buffer_write_uint32 calls shown above.
These are the building blocks for writing a simple Nabto application. After expanding on this a bit, I finally had my control center.

The SunPi control center

control_center

The control center enables the user to control an RGB LED as well as a single relay, and to retrieve the voltage and power flowing into (or out of) the SunPi battery. It is also possible to retrieve the internal temperature of the SunPi.

Thus I can now simply access the Nabto SunPi control center (just click on guest to try it out; NOTE: it is no longer available) and check the voltage. The battery is full when the voltage is above ~13 V. If that is the case, I can activate the relay or use the extra power for an RGB LED.

The full code is available on Github.

That’s all!