The TTL, or time-to-live, value is set to 60 seconds in all the published responses. To the best of my knowledge this is a “reserved for future expansion” field and is not used right now.

The stream should broadcast continuously (including keep-alives every 9 seconds), but it does die from time to time. There are three components involved: the core, the cloud, and the browser. Generally speaking it is the browser that fails for me, but it is sometimes hard to know. You can also use curl on a PC to log events to a file from the command line.

You can remove the connect button and just connect to the stream whenever the HTML page loads. That would give you the best chance of restarting, but it makes debugging harder.
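For example, a sketch of connecting automatically on page load and reconnecting when the stream dies might look like this. The event name `Uptime`, the element id `output`, and the 5-second retry delay are my assumptions, not anything fixed by the tutorial:

```javascript
// Placeholders you must fill in / adjust for your own setup.
var ACCESS_TOKEN = "your-access-token-here";
var EVENT_NAME = "Uptime";

// Build the public-event SSE URL for the Spark Cloud.
function buildEventUrl(eventName, token) {
  return "https://api.spark.io/v1/events/" + eventName + "?access_token=" + token;
}

var eventSource = null;

function connect() {
  eventSource = new EventSource(buildEventUrl(EVENT_NAME, ACCESS_TOKEN));

  // Each published event arrives with its name; e.data is the JSON payload string.
  eventSource.addEventListener(EVENT_NAME, function (e) {
    document.getElementById("output").textContent = e.data;
  }, false);

  // If the stream dies, close it and try again after a short delay.
  eventSource.onerror = function () {
    eventSource.close();
    setTimeout(connect, 5000);
  };
}

// Connect as soon as the page loads -- no button needed (browser only).
if (typeof window !== "undefined") {
  window.onload = connect;
}
```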

If you are worried about missed events, you should consider Spark.variable() instead. With Spark.publish(), as in this tutorial, you get push-type behavior, but with Spark.variable() you get pull-type behavior. Both are good; they just have different uses.
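To make the pull-type model concrete, here is a hedged sketch of polling a Spark.variable() from a web page over the cloud's REST API. The variable name `temperature`, the element id `output`, and the 10-second polling interval are assumptions for illustration:

```javascript
// Placeholders -- substitute your own values.
var DEVICE_ID = "your-24-hex-digit-device-id";
var VAR_NAME = "temperature";            // hypothetical variable name
var ACCESS_TOKEN = "your-access-token-here";

// Build the read-a-variable REST URL for the Spark Cloud.
function buildVariableUrl(deviceId, varName, token) {
  return "https://api.spark.io/v1/devices/" + deviceId + "/" + varName +
         "?access_token=" + token;
}

// The cloud answers with JSON; the reading itself is in the "result" field.
function parseVariableResponse(jsonText) {
  return JSON.parse(jsonText).result;
}

function pollVariable() {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", buildVariableUrl(DEVICE_ID, VAR_NAME, ACCESS_TOKEN));
  xhr.onload = function () {
    document.getElementById("output").textContent =
      parseVariableResponse(xhr.responseText);
  };
  xhr.send();
}

// Pull-type: the page asks for the value on its own schedule (browser only).
if (typeof window !== "undefined") {
  window.onload = function () {
    pollVariable();
    setInterval(pollVariable, 10000);
  };
}
```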

I am not sure what you mean by:[quote=“mwdll, post:47, topic:3469”]
It also doesn’t seem to be published, so I can’t leverage it in some downstream system.
[/quote]

So if you are having a specific problem, let me know and I will try to help.

Thanks @bko. That was exactly what I needed to know, that I’m not missing some great capability related to TTL. I was hoping TTL defined some short-term storage of events… maybe that is coming in the future. I want to capture events on a server, but I don’t want to listen constantly, so I will have to wait for webhooks and keep using a REST call to my server via TCPClient.

@bko, I’ve played with both of these tutorials with a great deal of success! They’re excellent write-ups, and the firmware and HTML are easy to follow!

I’ve searched at great length for some example HTML that just displays variables. That would seem an easier task than displaying an event (something that updates a web page as often as specified), but I’ve come up short trying to modify these Spark.publish() examples to show variables instead of events. Frankly, I think it’s just a matter of retrieving, parsing, and displaying the JSON, but again, comin’ up short… Any chance you could shed some light on how to do such a simple task?

If you have seen my Spark.publish() tutorials, you know that I like to have private web pages (since if it was public, your access token would be exposed) that read and even graph data from my Spark core. Well, you can do similar things with Spark.variables() and here’s how.
Declare Yourself
Let’s say you have a Spark.variable declared and you want to read it. In my case, it is a temperature variable from a DS18B20 temperature sensor and I have converted the value to a double-precision floati…

Hi @bko, I am at the #thingscon in Munich; I met @zach and solved my problem of consuming these events with a plain node.js program, e.g. from a server. I ripped the code out of the Spark CLI into a small standalone program and am posting it here for others in case they need it. The code uses the node request library.
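The code itself did not survive in this quote, but a minimal standalone sketch of the same idea might look like the following. It assumes the `request` library is installed (`npm install request`) and uses a made-up event name and token; the chunk parser below also ignores events split across chunk boundaries, which the real Spark CLI handles more carefully:

```javascript
// Pull the "data:" payload lines out of one SSE chunk.
function parseSseChunk(chunk) {
  return chunk
    .split("\n")
    .filter(function (line) { return line.indexOf("data:") === 0; })
    .map(function (line) { return line.slice(5).trim(); });
}

// Stream public events from the Spark Cloud and print each payload.
// Call main() to start; requires `npm install request` first.
function main() {
  var request = require("request");
  var url = "https://api.spark.io/v1/events/Uptime?access_token=your-token-here";

  request.get(url)
    .on("data", function (buf) {
      parseSseChunk(buf.toString()).forEach(function (payload) {
        // payload is a JSON string with data, ttl, published_at, coreid.
        console.log(payload);
      });
    })
    .on("error", function (err) {
      console.error("stream error:", err.message);
    });
}
```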

@bko, that new write-up was exactly what I was after! A simple write-up for displaying variables AND published streams gets people off on the right foot! I tried pulling apart Spark Helper, but as wonderful as that tool is, the code behind it was just too complex to try to simplify. Thanks so much for all that you do!

This is a mini-tutorial!
Let's say you have a sensor farm of Spark cores, all publishing their data to the Spark Cloud. You need a way to see all that data in near real-time, all on one dashboard. So, how can you do it? There are lots of ways, but here's a simple one that you can edit and change to meet your needs.
First, a couple of screenshots of it in action, starting with the event named "**Uptime**":
[image]
And here's another shot for the event named "**Temp**":
[image]
As you can see, …

The basic situation is a sensor farm periodically reporting events to a server, but the sensors' future activity is modified by the contents of a database. Thus I need to transmit and record the event, but I also need to send data back to the reporting sensor. I agree that these tutorials should be self-contained, so just echoing the sensor value back would be a great start. Thanks again.

Let's say you have a Spark core with both a Spark variable and a Spark function. In my case, I have the world's simplest servo control sketch, using the nice miniature servo in the Spark Maker Kit. Using these parts, we will build a servo that you can control from a web page anywhere and monitor the position of the servo too!
Here's a preview of the finished tutorial:
[image]
There is a coarse adjust slider, two fine adjust buttons, and the current position of the servo is displayed. T…

The web database part of this is tricky since without a dedicated server, which I can’t really use for a tutorial, you are limited in what you can show.

That way you can tell if it is the core side or the web side that is not working.

You want to keep your access token secret, but your device ID shows when you publish a public event, so we can see it. Right now, only three cores are publishing the Uptime event and only one is doing it in JSON format:

So if your core is 53ff…1587, then it is publishing JSON. Otherwise, I would start looking at the core side.
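One way to check from the web side whether an event's data field really is JSON is simply to try parsing it in your event handler. A small sketch (`looksLikeJson` is a hypothetical helper name, not part of any tutorial):

```javascript
// Quick check: does this event's data field contain a JSON object?
function looksLikeJson(text) {
  try {
    var parsed = JSON.parse(text);
    // A JSON payload parses to an object; a bare string or number does not.
    return typeof parsed === "object" && parsed !== null;
  } catch (e) {
    return false;
  }
}
```

In an EventSource handler you would call it as `looksLikeJson(e.data)`; if it returns false, the core side is probably not publishing JSON.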

With curl or with the HTML, you don’t need the << >> angle braces around the numbers; just put the numbers themselves there.

Also, you cannot use your device name; you must use the hexadecimal device ID in the HTML file for the deviceID variable.
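To illustrate the difference (the ID below is made up; use your own 24-character hex device ID from the build site, not the friendly name you gave the core):

```javascript
// Right: the 24-hex-digit device ID (this one is fabricated for illustration).
var deviceID = "0123456789abcdef01234567";

// Wrong: the friendly device name will not work in the REST URLs.
// var deviceID = "my_living_room_core";

// Optional sanity check for the value you pasted in.
function isValidDeviceId(id) {
  return /^[0-9a-fA-F]{24}$/.test(id);
}
```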

To get email from the forum, click on your avatar picture in the upper right corner and select Profile. Now click on the Preferences button with the little gear and scroll down to the email section. There are several options for you to choose there.