Spark Core

Logging Weather Data to the Cloud

In this post I want to report some of the findings from a four-month experiment with a Spark Core. The core uploads pressure measurements to the cloud every five minutes and logs them into a Google Spreadsheet. I use the Spark Temboo library and service, after I failed to reliably poll the core from Google.
Anyway, the spreadsheet now has more than 30,000 entries. Chart generation is a bit slow, but everything is still running stably to this day. A plot of the four months of data is shown below:
[Chart: four months of pressure measurements logged to the Google Spreadsheet via Temboo]
After a measurement is taken, the core goes into deep sleep to conserve energy. An internal hardware timer of the STM32F103 is used to wake it up in time for the next measurement.
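A minimal sketch of this measure/sleep cycle could look like the following. The five-minute interval and the placeholder reading are assumptions for illustration; the real application reads the pressure sensor and runs the Temboo upload before going back to sleep:

void setup() {
  // nothing to do here, the work happens once per wake-up in loop()
}
void loop() {
  float pressure = 1013.25; // placeholder for the real sensor reading
  // ... upload the value with the Temboo library here ...
  // Deep sleep for five minutes; the STM32F103 hardware timer wakes the
  // core up again, after which it reboots and runs setup() and loop() once more.
  Spark.sleep(SLEEP_MODE_DEEP, 300);
}
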
Over the four months I have seen only a few corrupted values. These are most likely transmission errors due to network connectivity issues.

Updated Temboo library available

I finally found some time to fix the problems with the Spark Temboo library. The following issues were fixed:

  • min(), max() definition clash between the STL library and the Spark Wiring math library.
  • Misc makefile fixes to build the library from a shell

The updated source code is available on GitHub at https://github.com/bentuino/temboo.git. Thanks for all the notes and hints I received from the community.

Temboo ignited with a Spark

This is a re-post of an article I wrote as a guest blogger for Temboo’s blog.
I am working with connected devices and was looking for a cloud service. While surveying the field, Temboo caught my eye because of the large number of supported premium web services and the promise to connect IoT devices to the Internet in a breeze.
The connected device I am using is a Spark Core. The Spark Core is a sleek little board that offers a powerful 32-bit ARM CPU paired with WiFi. The product grew out of a Kickstarter campaign and is rapidly gaining popularity. The Spark is nicely priced and everything is open source. The team supporting the Spark Core is smart and responsive, and they made a great choice in porting most of the main Arduino APIs to their platform.
As outlined in a blog post here, migrating Arduino libraries to the Spark Core often turns out to be pretty easy. With Temboo providing an open source library for Arduino, I was tempted to give it a try. However, I had no Temboo-Arduino setup, so I was not sure how hard it would be to get it all up and running.
Well, I am happy to report that it was easier than expected. Temboo’s code is well written. I only had to work around some AVR-specific optimizations that Temboo did to save program memory. As the Spark Core is built around an STM32F103 chip, resources are not as tight as on the AVR, so I simply bypassed these optimizations.
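The pattern for bypassing them is roughly the following; this is a simplified illustration with made-up macro names, not the actual Temboo code:

#ifdef __AVR__
  // On AVR, constant data is kept in flash to save RAM and read back with pgm_read_byte()
  #include <avr/pgmspace.h>
  #define TEMBOO_READ_BYTE(p) pgm_read_byte(p)
#else
  // On the STM32F103 there is plenty of RAM, so a plain pointer access is fine
  #define TEMBOO_READ_BYTE(p) (*(const unsigned char*)(p))
#endif
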
Here are some brief instructions on how to install the Temboo Arduino Library. The instructions use the Spark command line SDK setup:

  1. Download the modified Temboo Arduino Library source code from github:
    mkdir temboo
    cd temboo
    git clone https://github.com/bentuino/temboo.git
  2. Get the Spark Core firmware:
    git clone https://github.com/spark/core-firmware.git
    git clone https://github.com/spark/core-common-lib.git
    git clone https://github.com/spark/core-communication-lib.git
    # Merge the Spark source trees into the temboo directory
    cp -fr core-* temboo
    rm -fr core-*
  3. In older Spark firmware there is a small problem that the Spark team has already fixed. Open the file core-firmware/inc/spark_wiring_ipaddress.h with your favorite editor and uncomment line 54:
    // Overloaded cast operator to allow IPAddress objects to be used where a pointer
    // to a four-byte uint8_t array is expected
    //operator uint32_t() { return *((uint32_t*)_address); };
    bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
    bool operator==(const uint8_t* addr);
  4. Save the TembooAccount.h file you generated with DeviceCoder to temboo-arduino-library-1.2\Temboo
  5. Now it is time to build the Spark application:
    cd temboo/core-firmware/build
    make -f makefile.temboo clean all
  6. Connect your Spark Core to your computer via a USB cable
  7. Push both buttons, then release the Reset button and continue holding the other button until the RGB LED lights up yellow
  8. Download application into Spark Core
    make -f makefile.temboo program-dfu

Temboo Examples

Two simple Spark application examples are included:

  • core-firmware/src/application_gxls.cpp – Example demonstrates the Temboo library with Google Spreadsheet
  • core-firmware/src/application_gmail.cpp – Example demonstrates the Temboo library with Gmail
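
For orientation, the skeleton of such an application looks roughly like the sketch below. It is modeled on the stock Temboo Arduino API (setAccountName(), setChoreo(), addInput(), run()); the TCPClient wiring and the Gmail Choreo input are assumptions on my part, so treat application_gmail.cpp as the authoritative version:

#include "Temboo.h"
#include "TembooAccount.h" // TEMBOO_ACCOUNT, TEMBOO_APP_KEY_NAME, TEMBOO_APP_KEY from DeviceCoder

TCPClient client; // assumption: the Spark port runs the Choreo over the core's TCPClient

void setup() {
}
void loop() {
  TembooChoreo choreo(client);
  choreo.begin();
  choreo.setAccountName(TEMBOO_ACCOUNT);
  choreo.setAppKeyName(TEMBOO_APP_KEY_NAME);
  choreo.setAppKey(TEMBOO_APP_KEY);
  // pick a Choreo from the Temboo library and fill in its inputs
  choreo.setChoreo("/Library/Google/Gmail/SendEmail");
  choreo.addInput("Subject", "Hello from the Spark Core");
  choreo.run(); // executes the Choreo in the Temboo cloud
  choreo.close();
  delay(5 * 60 * 1000); // wait five minutes before running the Choreo again
}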

To change the example that is built, edit the first line in the core-firmware/src/build.mk file:

CPPSRC += $(TARGET_SRC_PATH)/application_gxls.cpp

or

CPPSRC += $(TARGET_SRC_PATH)/application_gmail.cpp

Building this code was tested under Windows 8.1 using cygwin and the MINGW version of the ARM GCC compiler toolchain. It should be easy to use this Temboo library with the Spark cloud-based SDK. To configure the library to support the Spark, all that is required is to define the following symbol:

CFLAGS += -DSPARK_PRODUCT_ID=$(SPARK_PRODUCT_ID)

or add a

#define SPARK_PRODUCT_ID SPARK_PRODUCT_ID

to the source code. Temboo support for the Spark Core is a lot of fun. It is easy to set up your own Temboo account and compile the Temboo Arduino Library that now supports the Spark Core platform. To learn more about similar projects please visit my blog at http://bentuino.com.

Google polls Spark Core

In a previous blog post I described an example of how a Spark Core can be used to read weather sensors. The setup was really no different from a simple Arduino Uno setup. It only demonstrated how easy it is to port Arduino sketches to a Spark Core.
With the integrated WLAN I was interested in connecting the Spark Core to the cloud. One of the simplest ways I found was using Google’s Spreadsheet service. I stumbled over this idea in this Spark forum post.
Here is how it works: a Google script periodically reads data from the Spark Core via the RESTful Spark API and then appends the data to a spreadsheet. The code below is a minimal Spark application to test such a setup:

int variable = 1634;
void setup() {
  Spark.variable("variable", &variable, INT);
}
void loop() {
  variable++;
  delay(5000);
}

It publishes a variable for cloud access and then increments it at regular intervals. Together with the following Google script I was able to quickly pull data from my core.

function collectData() {
  var sheet = SpreadsheetApp.getActiveSheet();
  // fetch the variable from the Spark Cloud REST API
  var response = UrlFetchApp.fetch("https://api.spark.io/v1/devices/<id>/variable?access_token=<token>");
  // parse the JSON the Core API created
  var sensor = JSON.parse(response.getContentText());
  // unescape the result string after parsing the JSON
  var sensor_result = unescape(sensor.result);
  // create a time stamp
  var d = new Date();
  // append data to spreadsheet
  sheet.appendRow([d, sensor_result]);
}

However, when I set up a time-driven trigger to run the script at regular intervals, I found the setup to be very unreliable. This has been discussed and documented by several Spark users and, as of this writing, I have not seen a fix for the problem.
One thing to note is that this approach pulls data from the Spark Core rather than having the core push it to the cloud. This has a significant flaw: we cannot put the core into standby between the measurement intervals, so this solution is not a good choice for low-power applications anyway.
So stay tuned, I am experimenting with a better solution that I will blog about in my next post.

Using Arduino code on Spark Core

In an earlier post, Spark Core Weather Station, I presented code that read weather sensors and sent the data over USB to a PC terminal. However, the approach I took there of combining all code into a single file is not very practical. Now that the Spark IDE supports multiple project files, we can simply include Arduino library files.
So here are some hints on how to get Arduino library code and sketches to compile in the Spark framework. I use a TMP102 I2C temperature sensor to demonstrate the porting. Below is an example sketch that reads the temperature values and sends them to the serial interface:

//////////////////////////////////////////////////////////////////
//Released under the MIT License - Please reuse change and share
//Simple code for the TMP102, simply prints ambient temperature via serial
//////////////////////////////////////////////////////////////////
#include <Wire.h>
int tmp102Address = 0x48;
void setup(){
  Serial.begin(9600);
  Wire.begin();
}
void loop(){
  float celsius = getTemperature();
  Serial.print("Celsius: ");
  Serial.println(celsius);
  float fahrenheit = (1.8 * celsius) + 32;
  Serial.print("Fahrenheit: ");
  Serial.println(fahrenheit);
  delay(200); //just here to slow down the output. You can remove this
}
float getTemperature(){
  Wire.requestFrom(tmp102Address,2);
  byte MSB = Wire.read();
  byte LSB = Wire.read();
  //it's a 12-bit value, using two's complement for negatives
  int TemperatureSum = ((MSB << 8) | LSB) >> 4;
  if (TemperatureSum & 0x800) TemperatureSum -= 4096; //sign-extend negative readings
  float celsius = TemperatureSum*0.0625;
  return celsius;
}

The main reason this code fails to compile is that the include of the Arduino Wire.h library fails in the cloud-based Spark IDE. The TMP102 sketch uses I2C, so it includes the Arduino Wire library with the following line:

#include <Wire.h>

This triggers an error in the Spark IDE. To solve the problem I created a new library file named Wire.h with the following content:

#include "spark_wiring.h"
#include "spark_wiring_interrupts.h"
#include "spark_wiring_usartserial.h"
#include "spark_wiring_spi.h"
#include "spark_wiring_i2c.h"
/* Includes ------------------------------------------------------------------*/
#include "main.h"
#include "debug.h"
#include "math.h"
#include "limits.h"
#include "stdint.h"
#include "spark_utilities.h"
extern "C" {
#include "usb_conf.h"
#include "usb_lib.h"
#include "usb_desc.h"
#include "usb_pwr.h"
#include "usb_prop.h"
#include "sst25vf_spi.h"
}
#include <sys/types.h>

With this library in place, all we have to do is comment out or delete the original include statement. When you add the Wire.h file, the IDE automatically adds a line like this to the source code:

// This #include statement was automatically added by the Spark IDE.
#include "Wire.h"

As long as the Arduino source follows basic guidelines for portable code, I did not face many difficulties using existing Arduino sources on the Spark Core.
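As an illustration of what I mean by portable (my own example, not taken from any particular library): stick to the Wiring API that both platforms implement and avoid AVR-specific register access:

const int ledPin = 7; // D7 on the Spark Core, digital pin 7 on an Arduino
void setup() {
  pinMode(ledPin, OUTPUT);
}
void loop() {
  digitalWrite(ledPin, HIGH); // portable: plain Wiring calls work on both platforms
  delay(500);
  digitalWrite(ledPin, LOW);
  delay(500);
  // not portable: PORTB |= _BV(PB5); writes an AVR I/O register
  // that does not exist on the STM32F103
}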

Spark Core compiler toolchain under cygwin

There are good instructions on how to install the local toolchain to compile the Spark Core firmware, so I don’t want to replicate them here. Go to the Spark Core GitHub and check the Readme.md, or search for a tutorial.
The purpose of this page is to show the steps and pitfalls of installing this toolchain under cygwin on a Windows 7 64-bit computer.
I assume you have at least the base cygwin installation on your machine. For instructions please head over to cygwin.org.
Make sure you install git and any other goodies you like under cygwin. In addition, download and install the following Windows tools:

  1. GCC for ARM Cortex processors – ARM cross compiler tool chain for Windows
  2. Make – Windows version of gmake
  3. Device Firmware Upgrade (DFU) Utilities – Utility to download the firmware to the Spark Core
  4. Zadig – USB driver for firmware downloads

Now install GCC for ARM and GNU make for Windows. Yes, cygwin also includes gmake, but there is a problem with it around dependency handling: the MINGW GCC compiler uses Windows path notation in the *.d dependency files, which gmake under cygwin chokes on. So if, the second time you try to compile, you get an error like this:

obj/src/application.o.d:1: *** multiple target patterns. Stop.

it is because we use MINGW GCC under cygwin instead of a native cygwin compiler. You can also check out a related post on the Spark community board. It is now time to install the firmware. Pull the following three repositories from GitHub:

git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git

These repositories contain all the Spark Core firmware. Once the source code is downloaded, go to the build directory in the firmware folder and start the compile:

cd core-firmware/build
make

If everything went smoothly you should now have a core-firmware.bin file. Run the Zadig program to install the USB driver. The moment has come to flash the firmware into the Spark Core. To do this, push the left button to reset the core while holding down the right button. Release the reset button and wait until the RGB LED is flashing yellow. You can now download the firmware with the following command:

make program-dfu

Note that the DFU utility always indicates an error, “Error during download get_status”. This is normal; as long as you see “File downloaded successfully”, everything is fine.