I’m going to start testing out live video streaming of open source projects. I’ve been a viewer on Twitch.tv for a while and have recently started checking out LiveCoding.tv.

To make it easy to find my stream, I figured I’d embed it directly on my WordPress blog. While there are a few plugins that purport to do that through WordPress widgets or shortcodes, they all share a major limitation: the embed always shows up. I only want the stream to appear while I’m broadcasting, not to waste space with a black box in the middle of the page.
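One way to get that behavior is to check the channel’s live status before injecting the player. This is a sketch rather than any plugin’s actual code: it assumes the Twitch v3 (“Kraken”) `/streams/{channel}` endpoint, which returned `"stream": null` for offline channels, and the `embedStreamIfLive` and `fetchJson` names are illustrative.

```javascript
// Decide whether the channel is live based on a Twitch v3 ("Kraken")
// /streams/{channel} response, which contains "stream": null when offline.
function isLive(streamsResponse) {
  return Boolean(streamsResponse && streamsResponse.stream);
}

// Build the embed IFRAME markup for a channel (the HTTP embed endpoint).
function buildEmbedHtml(channel) {
  return '<iframe src="http://www.twitch.tv/' + channel + '/embed"' +
         ' frameborder="0" scrolling="no" height="378" width="620"></iframe>';
}

// Inject the player only when the channel is broadcasting, so offline
// visitors never see an empty black box. fetchJson is any function that
// GETs a URL and resolves with parsed JSON.
function embedStreamIfLive(channel, container, fetchJson) {
  return fetchJson('https://api.twitch.tv/kraken/streams/' +
                   encodeURIComponent(channel)).then(function (data) {
    if (isLive(data)) {
      container.innerHTML = buildEmbedHtml(channel);
    }
  });
}
```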

Supporting HTTPS

The normal way to embed a stream is to create an IFRAME with a source of http://www.twitch.tv/{$channel}/embed (http://www.twitch.tv/bertjohnsondotcom/embed in my case). You’ll notice that’s plain HTTP, but I use CloudFlare for SSL offloading to ensure all connections are made over HTTPS.

Twitch doesn’t support HTTPS streaming yet. It’s been on the roadmap for a while, but any attempt to visit https://secure.twitch.tv is redirected back to the HTTP version. Because most browsers block mixed content, embedding the HTTP version of Twitch on an HTTPS page won’t show the content or even a warning. It will simply fail silently.

To work around that, I found an HTTPS Flash endpoint available on Justin.tv’s CDN (which was the precursor to Twitch.tv). There’s a definite tradeoff, as requiring Flash means this won’t work on many browsers (including the vast majority of mobile platforms), but I consider that better than allowing non-encrypted traffic.
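The Flash player can be injected with markup like the following. Note that the SWF URL is my assumption based on the historical Justin.tv live embed player path, not something confirmed above; substitute whatever HTTPS endpoint you verify.

```javascript
// Build <object> markup for the Flash player over HTTPS.
// ASSUMPTION: the SWF path below reflects the historical Justin.tv
// live_embed_player; verify the endpoint before relying on it.
function buildFlashEmbed(channel) {
  var swf = 'https://www-cdn.jtvnw.net/widgets/live_embed_player.swf' +
            '?channel=' + encodeURIComponent(channel);
  return '<object type="application/x-shockwave-flash" data="' + swf + '"' +
         ' height="378" width="620">' +
         '<param name="movie" value="' + swf + '" />' +
         '<param name="allowFullScreen" value="true" />' +
         '</object>';
}
```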

When I first created OpaqueMail, I faced the difficult choice between S/MIME and PGP as the standard for encryption.

The advantages for S/MIME were:

A lower barrier to entry due to supporting libraries pre-installed with Windows.

Greater familiarity and ease of use for developers used to public key infrastructure.

Lower complexity of managing, securing, and choosing keys.

S/MIME adoption has grown, partly thanks to the usability of OpaqueMail, but it remains prohibitively complex for many scenarios.

Email is increasingly being secured with PGP. I don’t have reliable data, but PGP seems to enjoy wider adoption and awareness among the general public. I’ve wanted to support both for a while now, but needed a good reason to embark on the PGP path.

A tipping point for me this month has been Facebook’s new support for publishing PGP keys. Finally, there is a public, (largely) trusted database where users can share keys. Instead of the traditional “web of trust”, I expect key databases (like keybase.io) to foster increased adoption.

With that background, I’ve started adding PGP support to OpaqueMail, now available to test in the 2.2.0 release. For the first time, OpaqueMail now has a dependency on another open source library: BouncyCastle. PGP is far too complex to implement from scratch and thankfully BouncyCastle provides secure, high-performance, and complete libraries for cryptography. The Legion of the Bouncy Castle dates back to the 1990s and their code has been scrutinized by tens of thousands of developers.

OpaqueMail 2.2.0 features PGP decryption and signature verification only. Encryption and signature creation are planned for a future release.

Beyond that, I may start another project to streamline public key discovery from Facebook and other federated sources.

WordPress Files

For your files (code, uploads, and other multimedia), it’s easy. Under your Web App settings, choose “Backup” and create a schedule. In my case, I went with daily backups (retained for 30 days), stored in the same resource group and region. With about 1 GB of files, backups complete in under 10 minutes.

WordPress Database

Now what about the database? Normally, I’d just set up a linked resource so the database would get backed up along with the website. Unfortunately, that doesn’t seem to be working in my subscription (and there are more than a dozen threads complaining about the same issue on TechNet). When I select a database to be backed up with my WordPress site, it fails instantly with a vague error in the operations log, leaving me with a 0 MB backup.

As a fallback, I figured I’d just set up a scheduled database backup saved to the filesystem, which would then be captured by the files backup. Well, it turns out that most of the mature WordPress backup plugins (e.g., WP-DB-Backup, BackWPup, BackUpWordPress) don’t support multisite installations and/or don’t support Project Nami. So that option’s out.

I recently launched a new intranet, focused on internal communications and marketing awareness, for a Fortune 500 client.

They use Tracx to mine and aggregate insights from social networks like Facebook and Twitter. Tracx enables multiple views for slicing and dicing tweets and analyzing the sentiment of social posts. Imagine being able to query across all of your brands, segmented by country, age group, or gender.

Our goal in this case was pretty simple: display the ten most recent social posts (from Facebook or Twitter) attached to a portfolio of nearly 100 brands.

Tracx provides a REST API, but no SDKs or guidance for integrating client-side. It seems like their API is targeted more towards .NET, Java, and open-source back-ends.

Here’s how I accomplished Tracx integration via JavaScript.

Getting Started: Your Key, Secret, and Query

In order to access Tracx’s OData endpoints, you’ll need an application key and secret. Both are 32-character alphanumeric strings.

You can log into the Tracx API portal at https://api.tra.cx/. From there, find the appropriate query to run. That took me a bit of trial and error before I found the following parameters to pass to the “/activity/posts” endpoint:

When assembling the parameters to pass in, we have to order and sign/hash them in a precise way.

Supporting IE8/9 and Providing a Shared Cache

In our case, we decided that making a web service call directly to Tracx was undesirable for two reasons:

Tracx requires requests to be submitted using the “POST” verb, but older browsers, including IE8 and IE9, only reliably support “GET” requests via XmlHttpRequest.

Tracx service calls have a cost, for both performance and licensing. Instead of having many users make dynamic calls simultaneously, we’d be better off having service calls cached for everybody for a period of time.

In order to accomplish both, we implemented my .NET Proxy open source solution, hosted in Azure. .NET Proxy supports shared caching, configurable via the web.config. It also supports verb transformation: by passing an “httpmethod” parameter with the value “POST” on the “GET” query string, it proxies its outbound calls as “POST”. That allows “GET”-only browsers to simulate POSTs.
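For example, a call that should reach Tracx as a POST can be issued from the browser as a GET against the proxy. The “httpmethod” parameter is the one .NET Proxy recognizes; the proxy hostname in the usage comment is a placeholder, not a real deployment.

```javascript
// Build a GET URL that .NET Proxy will replay to the target as a POST.
function buildProxyUrl(proxyBase, params) {
  var pairs = ['httpmethod=POST'];
  Object.keys(params).forEach(function (name) {
    pairs.push(encodeURIComponent(name) + '=' + encodeURIComponent(params[name]));
  });
  return proxyBase + '?' + pairs.join('&');
}

// Usage (placeholder host):
// buildProxyUrl('https://example-proxy.azurewebsites.net/proxy',
//               { url: 'https://api.tra.cx/activity/posts' });
```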

The source code below assumes .NET Proxy is being used as an intermediary.

Tracx in JavaScript: Source Code

Here’s the complete source code for embedding Tracx tweets and social posts on your website.

Make sure to include the sha1.js and oauth.js dependencies referenced above.

The “LoadSocial” function loads the data asynchronously. Once loaded, the second method outputs the posts to a DIV with ID “TracxContent”.
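The full listing isn’t reproduced here, but the two-step flow it describes can be sketched as follows. The response fields (`network`, `text`) and the `fetchJson` parameter are illustrative assumptions, not documented Tracx fields.

```javascript
// Render an array of posts into the "TracxContent" DIV (passed in as
// "container"). The post fields used here (network, text) are assumptions.
function RenderPosts(posts, container) {
  container.innerHTML = posts.map(function (post) {
    return '<div class="social-post" data-network="' + post.network + '">' +
           post.text + '</div>';
  }).join('');
}

// Fetch posts asynchronously (fetchJson resolves with parsed JSON), then
// hand them to RenderPosts once loaded.
function LoadSocial(fetchJson, container) {
  return fetchJson().then(function (posts) {
    RenderPosts(posts, container);
  });
}
```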