Temboo ignited with a Spark

This is a re-post of an article I wrote as a guest blogger for Temboo’s blog.
I am working with connected devices and was looking for a cloud service. While surveying the field, Temboo caught my eye because of the large number of premium web sites it supports and its promise to make connecting IoT devices to the Internet a breeze.
The connected device I am using is a Spark Core, a sleek little board that pairs a powerful 32-bit ARM CPU with WiFi. The product grew out of a Kickstarter campaign and is rapidly gaining popularity. The Spark is nicely priced and everything is open source. The team supporting the Spark Core is smart and responsive, and made a great choice in porting most of the main Arduino APIs to their platform.
As outlined in a blog post here, migrating Arduino libraries to the Spark Core often turns out to be pretty easy. With Temboo providing an open source library for Arduino, I was tempted to give it a try. However, I had no Temboo-Arduino setup, so I was not sure how hard it would be to get everything up and running.
Well, I am happy to report that it was easier than expected. Temboo's code is well written; I only had to work around some AVR-specific optimizations that Temboo uses to save program memory. As the Spark Core is built around an STM32F103 chip, resources are not as tight as on the AVR, so I simply bypassed these optimizations.
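To give a flavor of the change: on the AVR, constant data is typically kept in flash with PROGMEM and read back through the avr/pgmspace.h helpers, which do not exist on the STM32. Below is a minimal sketch of the kind of bypass involved; TEMBOO_READ_BYTE is a hypothetical name for illustration, not the identifier used in the actual library:

    #if defined(__AVR__)
      // AVR build: constant strings live in flash (PROGMEM) to save scarce RAM
      // and must be read back byte by byte with pgm_read_byte().
      #include <avr/pgmspace.h>
      #define TEMBOO_READ_BYTE(addr) pgm_read_byte(addr)
    #else
      // Spark Core build (STM32F103): RAM is plentiful, so the data stays in
      // ordinary memory and a plain dereference replaces the AVR helper.
      #define TEMBOO_READ_BYTE(addr) (*(const unsigned char *)(addr))
    #endif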
Here are some brief instructions on how to install the Temboo Arduino Library. They assume the Spark command line SDK setup:

  1. Download the modified Temboo Arduino Library source code from GitHub:
    mkdir temboo
    cd temboo
    git clone https://github.com/bentuino/temboo.git
  2. Get the Spark Core firmware:
    git clone https://github.com/spark/core-firmware.git
    git clone https://github.com/spark/core-common-lib.git
    git clone https://github.com/spark/core-communication-lib.git
    # merge the Spark source trees into the Temboo tree
    cp -fr core-* temboo
    rm -fr core-*
  3. In older Spark firmware there is a small problem that the Spark team has already fixed. Open the file core-firmware/inc/spark_wiring_ipaddress.h in your favorite editor and uncomment line 54 so the section reads:
    // Overloaded cast operator to allow IPAddress objects to be used where a pointer
    // to a four-byte uint8_t array is expected
    operator uint32_t() { return *((uint32_t*)_address); };
    bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
    bool operator==(const uint8_t* addr);
  4. Save the TembooAccount.h you generated with DeviceCoder to temboo-arduino-library-1.2/Temboo (a sketch of this header follows the list)
  5. Now it is time to build the Spark application:
    cd temboo/core-firmware/build
    make -f makefile.temboo clean all
  6. Connect your Spark Core to your computer via a USB cable
  7. Put the Core into DFU mode: press both buttons, release the RESET button, and keep holding the other (MODE) button until the RGB LED flashes yellow
  8. Download the application into the Spark Core:
    make -f makefile.temboo program-dfu
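
For reference, the TembooAccount.h header mentioned in step 4 is generated from your Temboo account settings and boils down to three defines along these lines (the values shown here are placeholders, not real credentials):

    #ifndef TEMBOOACCOUNT_H_
    #define TEMBOOACCOUNT_H_

    // Generated Temboo credentials (placeholder values).
    #define TEMBOO_ACCOUNT      "your-temboo-account-name"
    #define TEMBOO_APP_KEY_NAME "your-app-key-name"
    #define TEMBOO_APP_KEY      "your-app-key"

    #endif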

Temboo Examples

Two simple Spark application examples are included; a sketch of the general pattern follows the list:

  • core-firmware/src/application_gxls.cpp – demonstrates the Temboo library with Google Spreadsheets
  • core-firmware/src/application_gmail.cpp – demonstrates the Temboo library with Gmail
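
Both examples follow the standard Temboo Arduino pattern: create a TembooChoreo bound to a network client, set the account credentials, pick a Choreo, add its inputs, run it, and read back the response. The sketch below shows that general shape for the Gmail case; it is a minimal illustration rather than a copy of the shipped example, it assumes the Spark's TCPClient can stand in for the Arduino network client, and the input values are placeholders:

    #include "Temboo.h"
    #include "TembooAccount.h" // defines TEMBOO_ACCOUNT, TEMBOO_APP_KEY_NAME, TEMBOO_APP_KEY

    TCPClient client; // the Spark Core's TCP client, used where an Arduino client is expected

    void setup() {
        Serial.begin(9600);
    }

    void loop() {
        TembooChoreo SendEmailChoreo(client);

        // Identify this device to Temboo with the credentials from TembooAccount.h.
        SendEmailChoreo.begin();
        SendEmailChoreo.setAccountName(TEMBOO_ACCOUNT);
        SendEmailChoreo.setAppKeyName(TEMBOO_APP_KEY_NAME);
        SendEmailChoreo.setAppKey(TEMBOO_APP_KEY);

        // Choose the Choreo to run and supply its inputs.
        SendEmailChoreo.setChoreo("/Library/Google/Gmail/SendEmail");
        SendEmailChoreo.addInput("Username", "you@gmail.com");
        SendEmailChoreo.addInput("Password", "your-password");
        SendEmailChoreo.addInput("ToAddress", "someone@example.com");
        SendEmailChoreo.addInput("Subject", "Hello from my Spark Core");
        SendEmailChoreo.addInput("MessageBody", "Sent through Temboo");

        // Run the Choreo and stream its response to the serial port.
        SendEmailChoreo.run();
        while (SendEmailChoreo.available()) {
            Serial.print((char)SendEmailChoreo.read());
        }
        SendEmailChoreo.close();

        delay(60000); // wait a minute before sending again
    }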

To change which example is built, edit the first line of the core-firmware/src/build.mk file:

CPPSRC += $(TARGET_SRC_PATH)/application_gxls.cpp

or

CPPSRC += $(TARGET_SRC_PATH)/application_gmail.cpp

Building this code was tested under Windows 8.1 using Cygwin and the MinGW build of the ARM GCC toolchain. It should be easy to use this Temboo library with the cloud-based Spark SDK as well. To configure the library to support the Spark, all that is required is to define the following symbol in the makefile:

CFLAGS += -DSPARK_PRODUCT_ID=$(SPARK_PRODUCT_ID)

or add the equivalent definition directly to the source code:

#define SPARK_PRODUCT_ID SPARK_PRODUCT_ID
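
Either way, what matters is that the symbol is visible at compile time. As a rough sketch of how such a define can steer platform-specific code (the actual guards in the library may differ, and TembooNetworkClient is a hypothetical alias for illustration):

    #ifdef SPARK_PRODUCT_ID
      // Building for the Spark Core: use the TCP client from the Spark firmware.
      typedef TCPClient TembooNetworkClient;
    #else
      // Building for a classic Arduino: fall back to the Ethernet client.
      #include <Ethernet.h>
      typedef EthernetClient TembooNetworkClient;
    #endif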

Temboo support for the Spark Core is a lot of fun. It is easy to set up your own Temboo account and compile the Temboo Arduino Library, which now supports the Spark Core platform. To learn more about similar projects, please visit my blog at http://bentuino.com.

8 Responses

  1. Hi,
    Thanks for posting this. I’ve been trying to get the code to compile, but am running into a couple of errors at the very end. I’d appreciate your help. Thanks.
    Building file: ../src/coap.cpp
    Invoking: ARM GCC CPP Compiler
    mkdir -p obj/src/
    arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb -I../lib/tropicssl/include -I../src -I. -ffunction-sections -fmessage-length=0 -Wall -MD -MP -MF obj/src/coap.o.d -DSPARK_PRODUCT_ID=65535 -DPRODUCT_FIRMWARE_VERSION=65535 -DRELEASE_BUILD -fno-exceptions -fno-rtti -c -o obj/src/coap.o ../src/coap.cpp
    arm-none-eabi-gcc.exe: error: CreateProcess: No such file or directory
    makefile:103: recipe for target ‘obj/src/coap.o’ failed
    make[1]: *** [obj/src/coap.o] Error 1
    makefile.temboo:165: recipe for target ‘check_external_deps’ failed
    make: *** [check_external_deps] Error 2

    1. It is difficult to say from this alone. I advise wiping all ./obj folders and doing a clean build, then checking carefully that there is no compiler error earlier. Also, if you point me to your source I can give it a try.

    1. I got reports from several users about this problem and finally found some time to look into it.
      I tried to recompile my original Temboo examples and got the same error. These examples used to compile and are now broken. This seems to be a compiler/library problem that was introduced with updates to the Spark SDK.
      I don’t have a local Spark compiler installation handy to debug it further. Maybe someone from the Spark team has some insight into what could cause the problem.

      1. I’ve been trying to compile this for the Spark Core on OS X and find that it does not compile in that environment either. I’ve asked the Temboo folks to try to sort it out. I’d love to run Temboo on the Spark Core.

  2. Hi, I was trying to compile your code but am running into an issue with the standard C++ min/max functions:
    arm-none-eabi/include/c++/4.8.4/bits/stl_algobase.h:260:56: error: macro “max” passed 3 arguments, but takes just 2
    max(const _Tp& __a, const _Tp& __b, _Compare __comp)
    I also get the same problem when trying to compile the imported example from the Spark online IDE. Do you have any idea what this error is and how to fix it?
    Thanks a bunch.

  3. Awesome! Great work! Would this work with Spark Photon too? My understanding is that the Photon is backward compatible with the Core?

    1. I don’t have a Photon to confirm, but I expect it will work, as the Photon is promised to be backward compatible with the Spark Core.
