API Evangelist

I had the pleasure of finally meeting someone in person that I feel like I’ve known for years, while I was over in Australia for API Days Sydney: Keran McKenzie (@keranm), API Evangelist for MYOBapi. I got to drink and talk with Keran (finally), but more importantly I got to listen to him talk about his approach to evangelism at the accounting API.

I have given, and listened to, more talks on evangelism and advocacy than I care to remember. I always work hard to sit and listen to as many evangelism talks as I can, because there is always something to learn. Keran's talk was informative, thorough, and most importantly it was genuine. Keran discussed many of the proven areas of evangelism we all share, but not in a robotic or corporate way; he shared his, and MYOB’s, philosophy, and told an honest story about the challenges and triumphs.

Keran gives an energetic talk, but it wasn’t just the high energy that sold it for me. It was that he demonstrated he actually gives a shit about doing it right, admitted he doesn’t know everything or nail it in every category, and delivered an honest, friendly, approachable, and genuine story of what API evangelism is at MYOB.

This approach to API evangelism is critical to success. You won’t last long as an evangelist if you are full of shit, or the company behind you is. Sorry, I’ve been in this position too many times. Don’t get me wrong, every API evangelist role is difficult, and full of compromises, challenges, and things that are out of your control, even in my world. In the end, the only thing you can do is be honest, genuine, and as transparent as you can, and never be afraid to tell your story like Keran does.

I spend a lot of time wading through press releases at a number of the dominant aggregate news outlets, looking for API news. I also have a number of scripts running, keeping an automated eye on the press sections of some of the companies I track on. The number of press releases available in corporate press areas is much larger than the number submitted through the PR outlets, which is a missed opportunity in my book.

Each day I look through PRWeb, PR Newswire, Business Wire, and others, trying to find the latest API stories. This will always be a partially manual process, as you just can’t search for API news without also getting petroleum, education, pharmaceutical, and other API news, which has nothing to do with my beloved application programming interface. In short, these sites suck. Their UI is crap, and their search mechanisms are shit.
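To show the kind of filtering this takes, here is a rough sketch in Python of how I might weed out the releases where "API" means active pharmaceutical ingredient or API gravity instead of a web API. The keyword lists are my own guesses, not anything these PR sites offer:

```python
# Rough keyword heuristics for separating application programming
# interface news from pharmaceutical / petroleum "API" noise.
# These term lists are invented examples, not a proven taxonomy.

NOISE_TERMS = {"pharmaceutical", "ingredient", "petroleum", "crude", "gravity"}
SIGNAL_TERMS = {"developer", "endpoint", "rest", "json", "integration", "sdk"}

def looks_like_api_news(title, summary=""):
    """Return True when a press release is probably about web APIs."""
    text = (title + " " + summary).lower()
    if "api" not in text:
        return False
    noise = sum(term in text for term in NOISE_TERMS)
    signal = sum(term in text for term in SIGNAL_TERMS)
    return signal > noise

print(looks_like_api_news("Acme launches REST API for developers"))  # True
print(looks_like_api_news("API gravity of crude oil rises"))         # False
```

It is crude, which is exactly why this will always remain a partially manual process.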

The biggest problem is that these news outlets do not usually even have RSS, let alone an API, requiring me to go to each site, each day, and sift through the search results. The world of press releases is like many other industries I engage with: it needs a dead simple, API driven solution. I want a press release API that allows any company to submit for free, but the problem is nobody will in the beginning, so it also needs to go out and find the direct corporate press sources, and scrape this long tail of the PR world.

I’d pay a monthly fee to be able to search this vast PR database via an API, and integrate it directly into my API monitoring system. It is something I could build, but I just do not have the time, so as I do with my other ideas, I am putting it out there for someone else to do. The API would be a very simple design, but the legwork to make it a truly rich and valuable source of press would take a significant amount of time.
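To give a sense of how simple the design could be, here is a minimal sketch of the search side in Python. The field names and sample records are placeholders I made up; a real service would sit behind HTTP with paging, auth, and the scraped long tail of corporate press pages behind it:

```python
# Minimal in-memory sketch of the press release search I want.
# Schema (title, source, published, body) is a guess, not a spec.
from datetime import date

RELEASES = [
    {"title": "Acme launches payments API", "source": "acme.com",
     "published": date(2015, 3, 1), "body": "New REST endpoints..."},
    {"title": "Widgets Inc. quarterly earnings", "source": "widgets.com",
     "published": date(2015, 3, 2), "body": "Revenue grew..."},
]

def search(query, since=None):
    """Return releases matching a keyword, optionally after a date."""
    query = query.lower()
    hits = []
    for release in RELEASES:
        text = (release["title"] + " " + release["body"]).lower()
        if query in text and (since is None or release["published"] >= since):
            hits.append(release)
    return hits

print([r["title"] for r in search("api")])
# ['Acme launches payments API']
```

The design really is that simple; the hard part is the sourcing, not the endpoint.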

This is one of those ideas I hope someone does not run with thinking it is a VC-level startup idea. It really is something that a handful of folks could bootstrap, and do very well monetizing premium services, but if you go get VC money you will probably fuck it all up, and have to chase some unnecessary approaches to making money. If someone doesn’t do it soon, I’m going to have to build an API focused version, because I am spending about 1-2 hours a day on this.

I am a couple days late on this week's API.Report, after being sick last week and taking the weekend to recover, but at least I got it done. The process is proving to be very valuable to my understanding of what is going on, so I predict it will continue.

The Weekly API.Report represents the best of what I've read throughout the week, and is only what I personally felt should be showcased. Each news item comes with a link, and some thoughts I had after curating the piece of API related news. I'm trying to break stories down into buckets that are as coherent as I can make them. It remains something that is ever changing for me, but ultimately I will settle on a clear definition for each of the research areas.

I am a big fan of any API 101 work out there, helping on-board new users and industries:

That concludes my report on what I read last week across the API space. I'm still working on pulling together an e-newsletter version, allowing people to get an executive summary of what I thought was important from the week, and I am hoping it will be available next week. I'm also going to auto-generate some visualizations, and summary counts for each week. I'd like to see a tag cloud, and overall counts to help me understand the scope of news I cover each week.
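The summary counts and tag cloud data could come straight out of the curated stories. A quick sketch of what I have in mind, with invented story titles and tags standing in for a real week's curation:

```python
# Sketch of weekly summary counts for the API.Report; the stories
# and tags below are made-up examples, not real curated entries.
from collections import Counter

stories = [
    {"title": "New bank API launched", "tags": ["banking", "api-design"]},
    {"title": "Swagger tooling update", "tags": ["api-design", "tooling"]},
    {"title": "Gov open data release", "tags": ["government", "open-data"]},
]

# Count how often each tag shows up across the week's stories;
# these counts can drive both the tag cloud and the overall totals.
tag_counts = Counter(tag for story in stories for tag in story["tags"])

for tag, count in tag_counts.most_common():
    print(tag, count)
```

Feeding the counts into any off-the-shelf tag cloud renderer would give me the scope-of-coverage view I'm after.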

As I did last week, I'm walking away with a better awareness of what is happening across the space. It isn't enough for me to read all of this API news in isolation; it helps to see it side by side with other news, allowing me to see and understand patterns that I may have missed.

When you click on the APIs.json search icon in the address bar, you get a very cool visualization. It's pretty basic at the moment, just a visual catalog of the APIs available in the included collection of my stack, but when you connect it up with the Swagger visualization work he's already done, we could have a pretty cool API catalog for managing and exploring microservices.

I have almost 20 microservices listed, and Swagger.ed gives me the ability to navigate them in a very interactive way. What's next? We don’t know…it is about exploration, and finding out the most meaningful way of exploring the APIs I deploy and aggregate into APIs.json collections.

I can see having a visual catalog of all of the API designs that I collect, then deploying and evolving them as needed for various parts of my infrastructure, or for my clients. Into other sub-collections? Disconnected collections? Loosely coupled collections? I'm not sure how tight I want things; part of the microservice definition for me is the size of the network connecting the services, as well as the services themselves.
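For reference, here is a rough sketch of what one of these APIs.json collections might look like, loosely following the draft APIs.json format. The names and URLs are placeholders, and a real file would carry more metadata than this:

```json
{
  "name": "My Microservices Stack",
  "description": "A collection of the microservices I deploy.",
  "url": "http://example.com/apis.json",
  "apis": [
    {
      "name": "Blog API",
      "baseURL": "http://api.example.com/blog",
      "properties": [
        {"type": "Swagger", "url": "http://api.example.com/blog/swagger.json"}
      ]
    }
  ],
  "include": [
    {"name": "Image Services", "url": "http://images.example.com/apis.json"}
  ]
}
```

The `include` array is what makes sub-collections and loosely coupled collections possible, since one collection can point off to others.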

Rebecca Williams (@internetrebecca) pointed me to a recent discussion on this topic earlier today, to which I added a couple of suggestions, but ultimately I would like to see a more progressive solution emerge, something that can answer it in real-time, and change as the API inventory in the federal government changes and evolves. Keeping a list, like 18F is doing, is a start, but we need more.

One of the side projects I work on for a couple of hours each week is the profiling of federal government APIs using APIs.json and Swagger, for use in my federal API stack. I only have a handful of the 120+ APIs I’m profiling completed, but once done, you will be able to search for APIs based upon whether an API uses the HTTP verbs GET, POST, PUT, and DELETE, giving you a much more detailed picture of how government APIs function.
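Once an API has a Swagger definition, answering the verb question becomes mechanical. A sketch of that lookup, using a trimmed, made-up Swagger 2.0 definition rather than a real government API:

```python
# Sketch: given a Swagger 2.0 definition, list which HTTP verbs
# each path supports. The definition below is an invented example.

swagger = {
    "swagger": "2.0",
    "paths": {
        "/facilities": {"get": {}, "post": {}},
        "/facilities/{id}": {"get": {}, "put": {}, "delete": {}},
    },
}

def verbs_by_path(definition):
    """Map each path to the uppercase HTTP verbs it defines."""
    verbs = {"get", "post", "put", "delete", "patch"}
    return {
        path: sorted(op.upper() for op in ops if op in verbs)
        for path, ops in definition.get("paths", {}).items()
    }

print(verbs_by_path(swagger))
```

Run this across all 120+ profiled definitions and you get exactly the verb-level search index I'm describing.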

Generating machine readable API definitions in Swagger and API Blueprint is time consuming, but once you tie them together using a discovery format like APIs.json, it opens up a lot more opportunity to answer questions about government APIs. I’m doing a lot of the heavy lifting currently, to establish a critical mass of API definitions in the federal government; then I am hoping each agency will take ownership over maintaining their own definitions. If not, I think it can just as easily be done from the outside-in, by the community.

I’m excited for the potential, when the whole meta layer of APIs and open data in government is rich, well defined, and machine readable by default. The continuing data.json work out of the government, and our own APIs.json format, are looking to help with this over time. There is a lot of work ahead, as well as education to occur, before the metadata layer for government APIs is machine readable by default, but once it is, we’ll be able to answer questions like this in a much more efficient way.