Logging Weather Data to the Cloud

In this post I want to report some of the findings from a four-month experiment with a Spark Core. The core uploads pressure measurements to the cloud every five minutes and logs them into a Google Spreadsheet. I use the Temboo library and service, after I had failed to reliably poll the core from Google.

Anyway, the spreadsheet now has more than 30,000 entries. Chart generation is a bit slow, but everything is still running stably to this day. A plot of the four months is shown below:


After a measurement is taken, the core goes into deep sleep to conserve energy. An internal hardware timer interrupt of the STM32F103 is used to wake up in time for the next measurement.
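The original code is not reproduced here, but a minimal sketch of this measure-and-sleep cycle might look as follows. The readPressure() helper and the five-minute interval are placeholders of mine, not the actual project code:

```cpp
// Take a measurement, push it to the cloud, then deep-sleep until the
// next five-minute slot. readPressure() is a hypothetical sensor helper.
#define SLEEP_SECONDS (5 * 60)

float readPressure();  // placeholder for the actual sensor code

void setup() {
    float pressure = readPressure();
    // ... run the Temboo choreo to append the value to the spreadsheet ...

    // Deep sleep; the STM32 wakes up via its internal timer and the
    // sketch restarts from setup().
    Spark.sleep(SLEEP_MODE_DEEP, SLEEP_SECONDS);
}

void loop() {
    // never reached - the core resets out of deep sleep into setup()
}
```

Because deep sleep resets the core, all the work happens in setup(); there is no persistent loop() state to maintain.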

Over the four months I have seen only a few corrupted values. These are most likely transmission errors due to network connectivity issues.

PCB Design Workshop

Last weekend I attended a cool one-day workshop, "Designing a Circuit Board", given by Matt Berggren. Matt is a member of Supplyframe San Francisco with many years of experience as a PCB designer and instructor.

The class is positioned as "Learn to Build a PCB from the ground up...", targeting hackers and professionals who want to dip their toes into the waters of hardware design. This was the 4th installment of this class, which always fills up within a very short time.

Matt is an engaging presenter. He spent the morning explaining the fundamentals of PCBs and their technology. The class doesn't assume an EE degree, so Matt is careful to explain the fundamentals and terms relevant for PCB design. In fact, he is quite a master at catering to both the beginners and the more advanced members of his audience.

Here is an overview of the topics Matt covered in the morning:

  • Circuit board basics and terminology - layers, cores, finishes, objects (pads, vias, components, lands, land patterns, footprints), multilayer vs. single-layer, layer types, etc.
  • Some electrical basics related to boards - dielectrics, copper considerations, current carrying, impedance control, parasitic capacitance (don't freak out, we'll explain all of this)

The afternoon was mostly spent hands-on working with Eagle. Here is what we covered:

  • Build our first schematic + PCB + CAM (simple USB power supply that generates some very basic voltages we can use on the bench to do stuff - very simple USB->LDO->Connector)
  • Looked at Eagle and learned some important Eagle terminology, the basics of the menus, the command line interface, shortcuts, etc., plus the basic workflow (should prepare you for doing this on your own)

Kudos and thanks to Matt. He runs these classes for free on weekends. His audience's backgrounds are very diverse: some are complete electronics novices and some are pros. Matt masters this challenge by providing a lot of well-selected practical technical information, spiced with entertaining anecdotes. Great teaching job.

Updated Temboo library available

I finally found some time to fix the problems with the Spark Temboo library. The following issues were fixed:

  • min(), max() definition clash between the STL library and the Spark Wiring math library.
  • Misc makefile fixes to build the library from a shell

The updated source code is available on GitHub at https://github.com/bentuino/temboo.git. Thanks for all the notes and hints I received from the community.

Temboo ignited with a Spark

This is a re-post of an article I wrote as a guest blogger for Temboo's blog.

I am working with connected devices and was looking for a cloud service. While surveying the field, Temboo caught my eye because of the large number of supported premium web sites and the promise to connect IoT devices to the Internet in no time.

The connected device I am using is a Spark Core. The Spark Core is a sleek little board that offers a powerful 32-bit ARM CPU paired with WiFi. The product grew out of a Kickstarter campaign and is rapidly gaining popularity. The Spark is nicely priced and everything is open source. The team supporting the Spark Core is smart and supportive, and they made a great choice in porting most of the main Arduino APIs to their platform.

As outlined in an earlier blog post here, migrating Arduino libraries to the Spark Core often turns out to be pretty easy. With Temboo providing an open source library for Arduino, I was tempted to give it a try. However, I had no Temboo-Arduino setup, so I was not sure how hard it would be to get it all up and running.

Well, I am happy to report that it was easier than expected. Temboo's code is well written. I only had to work around some AVR-specific optimizations that Temboo did to save program memory. As the Spark Core is built around an STM32F103 chip, resources are not as tight as on the AVR, so I simply bypassed these optimizations.

Here are some brief instructions for installing the Temboo Arduino Library. The instructions use the Spark command line SDK setup:

  1. Download the modified Temboo Arduino Library source code from github:
  2. Get the Spark Core firmware:
  3. Older Spark firmware has a small problem that the Spark team has already fixed. Open the file core-firmware/inc/spark_wiring_ipaddress.h and uncomment line 54 with your favorite editor:
  4. Save your TembooAccount.h you generated with DeviceCoder to temboo-arduino-library-1.2\Temboo
  5. Now it is time to build the Spark application:
  6. Connect your Spark Core to your computer via a USB cable
  7. Push both buttons, then release the reset button and continue holding the other button until the RGB LED lights up yellow
  8. Download the application into the Spark Core
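Put together as shell commands, the steps above look roughly like this. Only the Temboo repository URL is taken from the post; the firmware repository name and the dfu-util parameters are from memory, so double-check them against the Spark documentation:

```shell
# 1./2. Get the Temboo library and the Spark Core firmware
git clone https://github.com/bentuino/temboo.git
git clone https://github.com/spark/core-firmware.git

# 5. Build the Spark application
cd core-firmware/build
make

# 8. With the core in DFU mode (steps 6/7), download the application
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
```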

Temboo Examples

Two simple Spark application examples are included:

  • core-firmware/src/application_gxls.cpp - Example demonstrates the Temboo library with Google Spreadsheet
  • core-firmware/src/application_gmail.cpp - Example demonstrates the Temboo library with Gmail

To change the example that is built, edit the first line of the core-firmware/src/build.mk file:
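As a hypothetical sketch of that edit (the variable names here are illustrative, so check your own build.mk), the file simply names the application source that gets compiled:

```make
# core-firmware/src/build.mk - pick the example application to build
CPPSRC += $(TARGET_SRC_PATH)/application_gxls.cpp
# CPPSRC += $(TARGET_SRC_PATH)/application_gmail.cpp
```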


Building this code was tested under Windows 8.1 using Cygwin and the MinGW version of the ARM GCC compiler toolchain. It should be easy to use this Temboo library with the Spark cloud-based SDK as well. To configure the library for the Spark, all that is required is to define a single preprocessor label, either on the compiler command line or with a #define in the source code.

Temboo support for the Spark Core is a lot of fun. It is easy to set up your own Temboo account and compile the Temboo Arduino Library, which now supports the Spark Core platform. To learn more about similar projects please visit my blog at http://bentuino.com.

Google polls Spark Core

In a previous blog post I described an example of how a Spark Core can be used to read weather sensors. The setup was really no different from any simple Arduino Uno setup. It only demonstrated how easy it is to port Arduino sketches to a Spark Core.

With the integrated WLAN, I was interested in connecting the Spark Core to the cloud. One of the simplest ways I found was using Google's Spreadsheet service. I stumbled upon this idea in a Spark forum post.
Here is how it works: a Google script periodically reads data from the Spark Core via the RESTful Spark API and then appends the data to a spreadsheet. The code below is a minimal Spark sketch to test such a setup:
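The original listing is not reproduced here; a minimal reconstruction (the ten-second interval is my choice) looks like this:

```cpp
// Publish an integer variable to the Spark Cloud and increment it
// periodically. The variable can then be read via the RESTful API:
//   GET https://api.spark.io/v1/devices/<device-id>/count?access_token=...
int count = 0;

void setup() {
    // expose "count" as a cloud-readable variable
    Spark.variable("count", &count, INT);
}

void loop() {
    count++;
    delay(10000);  // increment every 10 seconds
}
```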

It publishes a variable for cloud access and then increments it at regular intervals. Together with a short Google Apps Script I was able to quickly pull data from my core.

However, when I set up a time trigger to run the script at regular intervals, I found the setup to be very unreliable. This is discussed and documented by several Spark users, and as of this writing I have not seen a fix for the problem.

One thing to note is that this approach pulls data from the Spark Core rather than having the core push it to the cloud. This has a significant flaw: we cannot put the core into standby between the measurement intervals. Therefore this solution is not a good choice for low-power applications anyway.

So stay tuned, I am experimenting with a better solution that I will blog about in my next post.

Using Arduino code on Spark Core

In an earlier post, Spark Core Weather Station, I presented code that read weather sensors and sent the data over USB to a PC terminal. However, the approach I took there of combining all code into a single file is not very practical. Now that the Spark IDE supports multiple project files, we can simply include Arduino library files.


So here are some hints on how to get Arduino library code and sketches to compile in the Spark framework. I use a TMP102 I2C temperature sensor to demonstrate the porting. Below is an example sketch that reads the temperature values and sends them to the serial interface:
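The sketch is not reproduced verbatim here; a typical TMP102 read loop (I2C address and conversion per the TMP102 datasheet, negative temperatures ignored for brevity) looks like this:

```cpp
#include <Wire.h>

// TMP102 default I2C address (ADD0 pin tied to ground)
const int TMP102_ADDR = 0x48;

void setup() {
    Serial.begin(9600);
    Wire.begin();
}

void loop() {
    // the temperature register is 12 bits, left-justified in two bytes
    Wire.requestFrom(TMP102_ADDR, 2);
    byte msb = Wire.read();
    byte lsb = Wire.read();
    int raw = ((msb << 8) | lsb) >> 4;
    float celsius = raw * 0.0625;  // 0.0625 degC per LSB

    Serial.println(celsius);
    delay(1000);
}
```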

The main reason this code fails to compile is that the include statement for the Arduino Wire.h library fails in the cloud-based Spark IDE. The TMP102 sketch code uses I2C, so it includes the Arduino Wire library with the following line:

This triggers an error in the Spark IDE. To solve this problem I created a new library named Wire.h with the following content:
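A sketch of such a shim, assuming the Spark headers already provide the Wire object via application.h (the include guard name is mine):

```cpp
// Wire.h - compatibility shim for the Spark IDE.
// The Spark firmware already implements the Wire object; pulling in
// application.h makes it visible to Arduino code that expects Wire.h.
#ifndef WIRE_H_SHIM
#define WIRE_H_SHIM

#include "application.h"

#endif
```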

With this library in place, all we have to do is comment out or delete the original include statement. After adding the Wire.h file, the IDE automatically adds a matching include line to the source code.

As long as the Arduino source follows basic guidelines for portable code, I did not face many difficulties using existing Arduino sources on the Spark Core.

RedBear BLE Module

The RedBearLab BLE Mini is a Bluetooth Low Energy (BLE) transceiver board. BLE is a new protocol introduced in revision 4.0 of the Bluetooth standard. It is a wireless personal area network (PAN) technology aimed at novel applications in the healthcare, fitness, security, and home entertainment industries. BLE is not backward-compatible with the previous Bluetooth protocol; however, it uses the same 2.4 GHz frequency band with a simpler modulation scheme. The picture below shows the RedBearLab BLE Mini board.

The module is a combo of an RBT01 BLE module, featuring the TI CC2540 single-chip Bluetooth SoC, and a breakout board that offers a micro USB connector and a 3.3 V UART connector. The main reason this board caught my attention is that it also has solder points for additional GPIOs on the back of the board. These GPIOs can be custom programmed, so it should be possible to hook up I2C or SPI sensors to the board and, with a bit of software, monitor them.
Using such a BLE board is not exactly an IoT solution, as the connectivity to the cloud would have to be implemented with, for example, a wireless-enhanced Galileo. Also, BLE's range is rather limited. However, when it comes to power consumption BLE has a leg up, as it was specifically designed for very low power.

Spark Core compiler toolchain under cygwin

There are good instructions on how to install the local toolchain to compile the Spark Core firmware, and I don't want to replicate them here. Go to the Spark Core GitHub repository and check the Readme.md, or search for a tutorial.
The purpose of this page is to show the steps and pitfalls of installing this toolchain under Cygwin on a Windows 7 64-bit computer.

I assume you have at least the base Cygwin installation on your machine. For instructions please head over to cygwin.org.


Make sure you install git and any other goodies you like under Cygwin. You will also need the following tools:

  1. GCC for ARM Cortex processors - ARM cross compiler tool chain for Windows
  2. Make - Windows version of gmake
  3. Device Firmware Upgrade (DFU) Utilities - Utility to download the firmware to the Spark Core
  4. Zadig - USB driver installer for firmware downloads

Now install GCC for ARM and GNU make for Windows. Yes, Cygwin also includes gmake; however, there is a problem with its dependency handling. The MinGW GCC compiler uses Windows path notation in the *.d dependency files, which gmake under Cygwin chokes on. So if, the second time you try to compile, you get an error like this:

it is because we use MinGW GCC under Cygwin instead of a native Cygwin compiler. You can also check out a related post on the Spark community board. It is now time to install the firmware. Pull the following three repositories from GitHub:
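These are the historical Spark repository names, so double-check them against the current README before cloning:

```shell
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
```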

These repositories contain all the Spark Core firmware. Once the source code is downloaded, go to the build directory in the firmware folder and start the compile:
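Sketched as shell commands (assuming the three repositories sit side by side), the compile step is:

```shell
cd core-firmware/build
make
```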

If everything went smoothly, you should now have a core-firmware.bin file. Run the Zadig program to install the USB driver. The moment has come to flash the firmware into the Spark Core. For this, push the left button to reset the core while holding down the right button. Release the reset button and wait until the RGB LED is flashing yellow. You can now download the firmware with the following command:
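The download is a plain dfu-util invocation; the vendor/product ID and flash address below are the ones from the Spark documentation, so verify them against your setup:

```shell
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
```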


Note that the DFU utility always indicates an error, "Error during download get_status". This is normal; as long as you see "File downloaded successfully", everything is fine.

Windows for Galileo


According to HotHardware.com there are signs that Microsoft will start supporting the Galileo with a new "Windows on Devices" version that targets IoT and other smart devices. Why is this noteworthy? Well, it would mean that the PC-era "Wintel" team is entering the maker scene, with their newly paired product offerings supporting an Arduino maker platform.

This is certainly a welcome move, as it broadens the choice of platforms and products makers can use in their projects.

Spark Core Weather Station

In my previous blog post I described my first encounter with the Spark Core. Today I want to demonstrate a first simple code example. For this I connected the Spark Core to a Weather Shield from SparkFun. The shield offers sensors for light, humidity, temperature, and pressure. It can even be extended with rain and wind sensors as well as GPS.
The shield comes with a nice set of libraries and examples that I used as a starting point. To keep things really simple, I combined the entire Weather Shield source, the sensor library functions, and the setup() and loop() code into a single file. This did not take long and compiled quickly. I also removed the wind- and rain-related functionality, as I did not plan to use it. The source code below takes measurements every second and writes them to the USB serial port.
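The full listing is too long to reproduce here; stripped to its skeleton, it reads the shield's HTU21D humidity/temperature and MPL3115A2 pressure sensors once a second and prints a CSV line. The class and method names follow SparkFun's libraries, so treat them as approximate:

```cpp
#include "HTU21D.h"      // humidity / temperature sensor
#include "MPL3115A2.h"   // pressure sensor

HTU21D humiditySensor;
MPL3115A2 pressureSensor;

void setup() {
    Serial.begin(9600);
    humiditySensor.begin();
    pressureSensor.begin();
    pressureSensor.setModeBarometer();  // report pressure in Pascals
}

void loop() {
    float humidity = humiditySensor.readHumidity();
    float tempC = humiditySensor.readTemperature();
    float pascals = pressureSensor.readPressure();

    // one comma-separated line per second for capture on the PC side
    Serial.print(humidity);
    Serial.print(",");
    Serial.print(tempC);
    Serial.print(",");
    Serial.println(pascals);

    delay(1000);
}
```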

For debugging I used the serial communication link over USB. Windows users have to install a COM driver; however, Mac and Linux support the Spark Core USB functionality out of the box.

I was really pleased to see how well the Spark Core supports Arduino libraries and well written legacy code. With only a few code modifications I had the sensors up and running.

The setup is now streaming values over a USB cable to a PC. There I captured the values with Tera Term and created a weather graph from the comma-separated values (CSV). The example below shows the pressure curve of a Bay Area storm passing by at the end of February 2014.
This setup is a somewhat trivial example that a basic Arduino could also handle. The project does not really take advantage of the Spark Core's connectivity to the internet. So stay tuned for my next blog post, where I will add internet connectivity to the setup.