Temboo ignited with a Spark

This is a re-post of an article I wrote as a guest blogger for Temboo's blog.

I am working with connected devices and was looking for a cloud service. While surveying the field, Temboo caught my eye because of the large number of premium web sites it supports and its promise to connect IoT devices to the Internet in no time.

The connected device I am using is a Spark Core. The Spark Core is a sleek little board that pairs a powerful 32-bit ARM CPU with WiFi. The product grew out of a Kickstarter campaign and is rapidly gaining popularity. The Spark is nicely priced and everything is open source. The team behind the Spark Core is smart and supportive, and made a great choice in porting most of the main Arduino APIs to their platform.

As outlined in a blog post here, migrating Arduino libraries to the Spark Core often turns out to be pretty easy. Since Temboo provides an open source library for Arduino, I was tempted to give it a try. However, I had no Temboo-Arduino setup, so I was not sure how hard it would be to get it all up and running.

Well, I am happy to report that it was easier than expected. Temboo's code is well written. I only had to work around some AVR-specific optimizations that Temboo uses to save program memory. As the Spark Core is built around an STM32F103 chip, resources are not as tight as on the AVR, so I simply bypassed these optimizations.

Here are some brief instructions on how to install the Temboo Arduino Library. The instructions assume the Spark command line SDK setup:

  1. Download the modified Temboo Arduino Library source code from github:
  2. Get the Spark Core firmware:
  3. In older Spark firmware there is a small problem that the Spark team has already fixed. Open the file core-firmware/inc/spark_wiring_ipaddress.h with your favorite editor and uncomment line 54:
  4. Save your TembooAccount.h you generated with DeviceCoder to temboo-arduino-library-1.2\Temboo
  5. Now it is time to build the Spark application:
  6. Connect your Spark Core to your computer via a USB cable
  7. Press both buttons, release the RESET button, and keep holding the other button until the RGB LED lights up yellow
  8. Download the application into the Spark Core
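Taken together, the whole sequence looks roughly like this in a Cygwin shell. The repository URL of the modified Temboo library is a placeholder, and the DFU vendor/product ID and flash address are the usual values for the Spark Core tooling of that era, not copied from the snippets above:

```shell
# Steps 1-2: fetch the modified Temboo library and the Spark firmware
# (the Temboo fork URL below is a placeholder - use the one from the post;
#  the Spark firmware historically needed its two companion repositories too)
git clone https://github.com/<your-fork>/temboo-arduino-library-1.2.git
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git

# Step 5: build the Spark application
cd core-firmware/build
make

# Steps 6-8: with the Core connected via USB and in DFU mode (yellow LED),
# flash the freshly built binary; 1d50:607f and 0x08005000 are the
# standard Spark Core DFU device ID and application start address
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
```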

Temboo Examples

Two simple Spark application examples are included:

  • core-firmware/src/application_gxls.cpp - demonstrates the Temboo library with Google Spreadsheets
  • core-firmware/src/application_gmail.cpp - demonstrates the Temboo library with Gmail

To change which example is built, edit the first line of the core-firmware/src/build.mk file so that it names either application_gxls.cpp or application_gmail.cpp.
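The exact contents of build.mk vary with the firmware version, but the idea is a single source-selection line; a sketch of what it might look like (the `CPPSRC` variable name is an assumption based on typical Spark makefiles):

```make
# core-firmware/src/build.mk - the first line selects the application to build
CPPSRC += application_gxls.cpp
# or, for the Gmail example:
# CPPSRC += application_gmail.cpp
```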

Building this code was tested under Windows 8.1 using Cygwin and the MinGW version of the ARM GCC compiler tool chain. It should be easy to use this Temboo library with the Spark cloud-based SDK as well. To configure the library for the Spark, all that is required is to define the SPARK_PRODUCT_ID label, either on the compiler command line or with a #define in the source code.

Temboo support for the Spark Core is a lot of fun. It is easy to set up your own Temboo account and compile the Temboo Arduino Library that now supports the Spark Core platform. To learn more about similar projects, please visit my blog at http://bentuino.com.
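Concretely, either of the following works; only the presence of the symbol matters, the value itself is ignored by the library (the makefile variant mirrors the CFLAGS line discussed in the comments):

```cpp
// Option 1: on the compiler command line / in the makefile:
//   CFLAGS += -DSPARK_PRODUCT_ID=0
//
// Option 2: directly in the application source, before including
// the Temboo headers:
#define SPARK_PRODUCT_ID 0
```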

13 thoughts on “Temboo ignited with a Spark”

  1. Ed Ewing

    Hi,
    Thanks for posting this. I've been trying to get the code to compile, but am running into a couple of errors at the very end. I'd appreciate your help. Thanks.

    Building file: ../src/coap.cpp
    Invoking: ARM GCC CPP Compiler
    mkdir -p obj/src/
    arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb -I../lib/tropicssl/include -I../src -I. -ffunction-sections -fmessage-length=0 -Wall -MD -MP -MF obj/src/coap.o.d -DSPARK_PRODUCT_ID=65535 -DPRODUCT_FIRMWARE_VERSION=65535 -DRELEASE_BUILD -fno-exceptions -fno-rtti -c -o obj/src/coap.o ../src/coap.cpp
    arm-none-eabi-gcc.exe: error: CreateProcess: No such file or directory
    makefile:103: recipe for target 'obj/src/coap.o' failed
    make[1]: *** [obj/src/coap.o] Error 1
    makefile.temboo:165: recipe for target 'check_external_deps' failed
    make: *** [check_external_deps] Error 2

    1. kkaiser Post author

      It is difficult to determine. I advise wiping all ./obj folders and doing a clean build. Then check carefully that there is no compiler error earlier on. Also, if you point me to your source, I can give it a try.

  2. Ed Ewing

    Thanks. I got it to work and build. What is the reference you make later to the Spark product ID (assuming you don't mean the device ID):
    CFLAGS += -DSPARK_PRODUCT_ID=$(SPARK_PRODUCT_ID)
    I've never heard of it, and I see it's in the error msg I originally sent you. I just don't know where to find it.

    1. kkaiser Post author

      This variable was used in earlier Spark FW releases. The value is copied from an environment variable of your shell. Anyway, I use it to differentiate between Arduino and Spark. It just needs to be defined; the value does not matter. Some of the enhancements in the Spark FW broke the Temboo library, but it is fixed now. Just try the latest code and see if it works for you.

  3. Pingback: IoT Process – Week 9 | JinTemp

    1. kkaiser Post author

      I got reports from several users of this problem and finally found some time to look into it.

      I was trying to recompile my original Temboo examples and got the same error. These examples used to compile and are now broken. This seems to be a compiler/library problem that was introduced with updates to the Spark SDK.

      I don't have a local Spark compiler installation handy to debug it further. Maybe someone from the Spark team has some insight into what could cause the problem.

      1. Jim Schrempp

        I've been trying to compile this for the Spark Core using OS X and find that it does not compile in that environment either. I've asked the Temboo folks to try and sort it out. I'd love to run Temboo on the Spark Core.

  4. Hoang

    Hi, I was trying to compile your code but I am having an issue with the standard C++ min/max functions.
    arm-none-eabi/include/c++/4.8.4/bits/stl_algobase.h:260:56: error: macro "max" passed 3 arguments, but takes just 2
    max(const _Tp& __a, const _Tp& __b, _Compare __comp)

    I also got the same problem while trying to compile the imported example from Spark online IDE. Do you have any idea what this error is and how to fix it?
    Thanks a bunch.

    1. kkaiser Post author

      I have fixed the problem. You can now update to version 1.2.8 in the Spark online SDK and it will compile again.

  5. Stu

    Awesome! Great work! Would this work with Spark Photon too? My understanding is that the Photon is backward compatible with the Core?

    1. kkaiser Post author

      I don't have a Photon to confirm, but I expect it will work, as the Photon is promised to be backward compatible with the Spark Core.

Comments are closed.