I have just come back from APIstrat 2018 in Nashville, Tennessee. It was an exceptionally well-run event that provided an awesome opportunity to learn from the great and the good of the API community. If you have ever been to Nashville, you will know that it's a very cool city, and hosting the event in the Music City Center created a buzz and atmosphere that I think would be impossible to recreate anywhere else. The event was superbly organized, the content exceptional and the contacts I made so good that I look forward to attending the next one, wherever it lands.
The first major takeaway for me was that the open source community's collaboration on the OpenAPI Specification is the key to a successful API strategy. A well-designed and well-thought-out API specification helps drive every part of API lifecycle management, and OAS 3.0 is enabling new tools and resources that we will all benefit from.
The second takeaway was the importance of including security definitions and complete data specifications in your API spec. Doing so enables DevSecOps practices and thorough API security testing and auditing, which in turn support the API governance that most organizations strive for.
To keep things even more interesting, I thought I would use APIs to help me make sense of the three days of presentations I attended. To that end, I recorded a few presentations and set myself the challenge of using APIs to analyse the content. I didn't record every presentation, so I couldn't analyse everything, but I selected Mike's presentation on GraphQL as my guinea pig. Mike's presentations were as entertaining as they were informative; he is a fellow who shares my belief in APIs, microservices and Kubernetes.
So first, I sent his recorded presentation to the Google Speech API to have it transcribed from voice to text. As it was a twenty-minute recording, I had to use the LongRunningRecognize method, which starts an asynchronous operation that can take a while to complete; I kept track of progress by polling the speech operations API. As a result, I quickly turned Mike's entertaining talk into a JSON transcript, even though the transcription clearly wasn't perfect: some of Mike's jokes were lost along the way (I assure the reader this was a result of the asynchronous operation and not the quality of Mike's jokes).
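If you want to try something similar, here is a minimal sketch of flattening the JSON response that the long-running operation eventually returns into plain text. The field names follow the documented `results[].alternatives[].transcript` shape of the Speech API response; the sample response itself is invented for illustration:

```python
# Sketch: flatten a Google Speech LongRunningRecognize JSON response
# into a single plain-text transcript. The response shape follows the
# documented results[].alternatives[].transcript structure; the sample
# data below is invented for illustration.

def transcript_from_response(response: dict) -> str:
    """Join the top-ranked alternative of each result chunk."""
    chunks = []
    for result in response.get("results", []):
        alternatives = result.get("alternatives", [])
        if alternatives:
            # The first alternative is the most likely transcription.
            chunks.append(alternatives[0]["transcript"].strip())
    return " ".join(chunks)


sample_response = {
    "results": [
        {"alternatives": [{"transcript": "GraphQL is not a silver bullet"}]},
        {"alternatives": [{"transcript": "but it solves real problems"}]},
    ]
}

print(transcript_from_response(sample_response))
# GraphQL is not a silver bullet but it solves real problems
```

In the real run the response arrives only once the operation reports `done: true`, which is exactly why polling the operations endpoint was necessary.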
Throw the data into ELK
Next I had to do a bit of coding to break the JSON transcript up into text and then feed it into the Elasticsearch Bulk API to get it into an index I could analyse. I am a big fan of using the search and analysis power of Elasticsearch to quickly make sense of large volumes of data.
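The bulk request body is newline-delimited JSON: an action line followed by a document line for each item. Here is a sketch of building that payload with one document per word; the index name and document fields are my own illustrative choices, and depending on your Elasticsearch version you may also need a `_type` in the action line:

```python
import json


def words_to_bulk_payload(text: str, index: str = "apistrat-talk") -> str:
    """Build an Elasticsearch _bulk request body: one action line plus
    one document line per word, newline-delimited, ending in a newline.
    The index name and document fields here are illustrative."""
    lines = []
    for position, word in enumerate(text.split()):
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps({"word": word.lower().strip(".,!?"),
                                 "position": position}))
    return "\n".join(lines) + "\n"


payload = words_to_bulk_payload("GraphQL is not a silver bullet")
# POST this body to your cluster's /_bulk endpoint with the
# Content-Type: application/x-ndjson header.
```

Keeping the word's position in the talk made it possible to slice the index by time later, which is a nice bonus of doing the indexing yourself.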
I now had 1,875 words in an Elasticsearch index and needed to make some sense of them, so I filtered out word classes like pronouns and conjunctions to leave just the key words. Kibana makes it very easy to explore the data, so I created a tag cloud visualisation, which looked like this:
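Kibana did the counting for me, but the same filtering idea can be sketched in a few lines with a stop list and a counter. The stop list below is a tiny illustrative sample; a real run would use a much fuller list, or an Elasticsearch stop-token filter at index time:

```python
from collections import Counter

# A tiny illustrative stop list of pronouns, conjunctions and other
# filler words -- a real run would use a far fuller list.
STOP_WORDS = {"i", "you", "he", "she", "it", "we", "they",
              "and", "or", "but", "so", "the", "a", "an",
              "is", "are", "to", "of", "in", "that", "not"}


def key_word_counts(text: str, top_n: int = 5) -> list:
    """Lower-case, strip trailing punctuation, drop stop words,
    and return the most frequent remaining words."""
    words = (w.lower().strip(".,!?") for w in text.split())
    counts = Counter(w for w in words if w and w not in STOP_WORDS)
    return counts.most_common(top_n)


sample = "GraphQL is not a silver bullet but GraphQL solves real problems"
print(key_word_counts(sample))
# [('graphql', 2), ('silver', 1), ('bullet', 1), ('solves', 1), ('real', 1)]
```

The surviving words and their counts are exactly what a tag cloud visualises: the bigger the count, the bigger the word.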
This gives you a flavour of Mike's presentation, but I am not convinced it captures the entire essence of his message. In using APIs to analyse his talk, I worked with asynchronous APIs and had to fiddle with the API security of different providers, and those topics were definitely key areas at this year's conference. Another open source collaboration to watch is AsyncAPI: there is a lot of work going on in this space, and definitely scope for a lot more blogging on that topic.