
API Bots News

These are the news items I've curated in my monitoring of the API space that have some relevance to the bot conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is putting bots to work, going beyond just the hype and understanding the details of each implementation.

Bots, Voice, And Conversational APIs Are Your Next Generation Of API Clients

Around 2010, the world of APIs began picking up speed with the introduction of the iPhone, and then Android mobile platforms. Web APIs had been used for delivering data and content to websites for almost a decade at that point, but their potential for delivering resources to mobile phones is what pushed APIs into the spotlight. The API management providers pushed the notion of being multi-channel, and being able to deliver to web and mobile clients, using a common stack of APIs. Seven years later, web and mobile are still the dominant clients for API resources, but we are seeing a next generation of clients begin to get more traction, which includes voice, bot, and other conversational interfaces.

If you deliver data and content to your customers via your website and mobile applications, the chances are increasing that you will also be delivering it to conversational interfaces, and to the bots and assistants emerging via Alexa and Google Home, as well as on Slack, Facebook, Twitter, and other messaging platforms. I’m not saying that everything will be done with virtual assistants and voice commands in the near future, but as a client, conversational interfaces will continue to see mainstream user adoption, with voice being used in automobiles, and other Internet-connected devices emerging in our world. I am not a big fan of talking to devices, but I know many people who are.

I don’t think Siri, Alexa, and Google Home will live up to the hype, but there are enough resources being invested into these platforms, and the devices they are enabling, that some of it will stick. In the cracks, interesting things will happen, and some conversational interfaces will evolve and become useful. In other cases, as a consumer, you won’t be able to avoid the conversational interfaces, and you will be required to engage with bots, and use voice-enabled devices. This will push the need to have conversationally literate APIs that can deliver data to people in bite-size chunks. Sensors, cameras, drones, and other Internet-connected devices will increasingly be using APIs to do what they do, but voice, and other types of conversational interfaces, will continue to evolve to become a common API client.

I am hoping at this point we begin to stop counting the different channels we deliver API data and content to. Despite many of the Alexa skills, and Slack bots I encounter being pretty yawn-worthy, I’m still keeping an eye on how APIs are being used by these platforms. Even if I don’t agree with all the uses of APIs, I still find the technical, business, and political aspects behind them worth tuning into. I tend not to push my clients to work on voice or bot applications if they aren’t very far along in their API journey, but I do make sure they understand that one of the reasons they are doing APIs is to support a wide and evolving range of clients, and that at some point voice, bots, and other conversational approaches will be clients they have to consider a little more in their overall strategy.


Clearly Designate API Bot Automation Accounts

I’m continuing my research into bot platform observability, and how API platforms are handling (or not handling) bot automation on their platforms, as I try to make sense of each wave of the bot invasion on the shores of the API sector. It is pretty clear that Twitter and Facebook aren’t that interested in taming automation on their platforms, unless there is more pressure applied to them externally. I’m looking to make sure there is a wealth of ideas, materials, and examples of how any API-driven platform can (and does) control bot automation on their platform, as the pressure from lawmakers and the public intensifies.

Requiring users to clearly identify automation accounts is a good first place to start. Establishing a clear designation for bot users has its precedents, and requiring developers to provide an image, description, and some clear check or flag that identifies an account as automated just makes sense. Providing a clear definition of what a bot is, with articulate rules for what bots should and shouldn’t be doing, is next up on the checklist for API platforms. Sure, not all users will abide by this, but it is pretty easy to identify automated traffic versus human behavior, and having a clear separation allows accounts to be automatically turned off when they fit a particular fingerprint, until a user can pass a sort of platform Turing test, or provide some sort of human identification.
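
To make this concrete, here is a minimal sketch of what a machine-readable bot designation, and a naive behavioral fingerprint check, could look like. Every field name and threshold below is hypothetical, not any specific platform's schema.

```python
# A minimal sketch of a machine-readable bot designation on a platform user
# account. All field names here are hypothetical, not any platform's schema.
from datetime import datetime, timezone

bot_account = {
    "username": "weather-updates",
    "is_automated": True,                 # the clear check / flag discussed above
    "operator": "acme-weather-inc",       # the human or org accountable for the bot
    "description": "Posts hourly weather summaries for Portland, OR",
    "image": "https://example.com/bot-avatar.png",
    "registered_at": datetime(2017, 11, 1, tzinfo=timezone.utc).isoformat(),
}


def looks_automated(requests_last_hour: int, unique_minutes_active: int) -> bool:
    """Naive fingerprint: sustained, high-volume activity spread across nearly
    every minute of the hour is rarely human behavior."""
    return requests_last_hour > 500 and unique_minutes_active > 55


def should_suspend(account: dict, requests_last_hour: int, unique_minutes_active: int) -> bool:
    """Suspend accounts that behave like bots but have not designated themselves
    as automated, pending a human verification step."""
    if account.get("is_automated"):
        return False  # properly designated bots are governed by the bot rules instead
    return looks_automated(requests_last_hour, unique_minutes_active)


if __name__ == "__main__":
    print(should_suspend({"username": "totally-a-human"}, 800, 58))  # True
    print(should_suspend(bot_account, 800, 58))                      # False
```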

Automation on API platforms has its uses. However, unmanaged automation via APIs has proven to be a problem. Platforms need to step up and manage this problem, or the government eventually will. Then it will become yet another burdensome regulation on business, and there will be nobody to blame except for the bad actors in the space (cough Twitter & Facebook, cough, cough). Platforms tend to not see it as a problem because they aren’t the targets of harassment, and it tends to boost their metrics and bottom line when it comes to advertising and eyeballs. Platforms like Slack have a different business model, which dictates more control, otherwise it will run their paying customers off. The technology and practices for managing bot automation on API platforms effectively already exist; they just aren’t ubiquitous because of the variances in how platforms generate their revenue.

I am going to continue to put together a bot governance package based upon my bot API research. Regardless of the business model in place, ALL API platforms should have a public bot automation strategy in place, with clear guidelines for what is acceptable, and what is not. I’m looking to provide API platforms with a one-stop source for guidance on this journey. It isn’t rocket science, and it isn’t something that will take a huge investment if approached early on. Once it gets out of control, and you have congress crawling up your platform’s ass on this stuff, then it probably is going to get more expensive, and also bring down the regulatory hammer on everyone else. So, as API platform operators, let’s be proactive and take on the problem of bot automation directly, and learn from Twitter and Facebook’s pain.

Photo Credit: ShieldSquare


The Waves Of API Driven Bots Invading Our Shores

As each wave of technology comes crashing on the shores of the API space you’ll mostly find me silent, listening and watching what is happening. Occasionally you’ll hear me grumble about the aggressiveness of a single wave, or how unaware each wave is of the rest of the beach, or of the waves that came before them. Mostly I am just yelling back to the waves that claim, “we are going to change the beach forever”, and “we are the wave that matters, better than all the waves that came before us”. Mostly, it is the hype, and the unrealistic claims being made by each wave that bothers me, not the waves themselves.

I’m not saying that technology won’t have an impact on the beach. I just think that us technologists tend to over-hype, and over-believe in the power of each wave of technology, and that we do not consider the impact on the wider beach, and the amount of sand that ends up in everything. I don’t doubt that there will be some gems found in the sand, and that, geologically speaking, the ocean plays a significant role in how the coastline is shaped. I’m just choosing to sit back on the bluff and enjoy my time on the beach, and not choosing to be a three-year-old playing in each of the waves, super excited by the sound each crash makes on the beach. I’m not saying that playing in the waves is wrong, I’m just choosing to look at the bigger picture from up here on the bluff.

You can see one such canvas being painted over the last couple of years with what has come to be known as “bots”. Little automated nuggets of tech goodness, or evil, depending on your location on the beach. People love saying that bots will change everything. They’ll be your assistant. They’ll do everything for you. They’ll automate your life. Take care of your parking tickets. Buy your groceries. Raise your children. Feed hungry people in Africa. When in reality, they tend to be annoying, harassing, and can be mess-up-an-entire-election kind of bad. They can DDoS. They can threaten to kill and rape you. But, hey, let’s keep investing in them, and building platforms that support them, without ever acknowledging the negative consequences they have on our beautiful beach.

Some days when I’m swimming in the bot waves I’ll be completely consumed. The undertow grabs me, spins me around, and I don’t know which way is up, and I end up with a mouth and ass-crack full of sand before I can make my way to the beach. I felt this way over last Christmas as I tried to make sense of the fake news engine, and what was coming out of Russia. Other days I feel like I’m walking on the beach collecting agates, finding some polished glass, but occasionally also finding some really beautiful ones. Today is one of those days, and I seem to be finding more bots that are actually useful, and do one thing well, without all the bullshit, and hype. Showing me the potential of this technology, and the specks of usefulness it can bring to our silicon beach.

I’m finding useful bots that will convert a file for me. Transcribe an audio file. Notify me of changes within my domain(s), which happens to be beach front real estate. Amidst all the sand and gravel I am seeing meaningful bot implementations. They aren’t anywhere near the automation we have been promised, but they are providing some value. Today I have a handful of these agates in the palm of my hand. We’ll see if I’m actually able to use these in my day to day world. Maybe hang one from the window by a string. Mount one in a piece of jewelry that will be worn infrequently. Who knows, maybe one will become something I hold in my hand regularly, rubbing, soothing me as I do what I do as the API Evangelist each day.

Even with all of this (potential) usefulness I am finding in today’s bot waves, I’m still reminded of the power of the ocean. The dangers of sneaker waves while I’m heads down looking for agates. The power of the undertow while swimming on a sunny day. The ability for thousands of waves to come in and take away the beach, the bluff, and destroy the house I’ve built, as well as my neighbors’. I’m reminded that no matter how shiny each gem is that I find in the waves, or how much I love the sound of each wave crashing on the beach, I can never stop thinking about the power of the ocean at scale. I mean, as our president recently pointed out, we are all “Surrounded by water. Big water. Ocean water.” Let’s not be distracted by each wave, and make sure we are always paying attention to the bigger picture. I feel like we have to do this for the “kids” as well, so the waves don’t sneak up on them.


Machine Readable Definitions For All Things API, Including Your Bots

Every aspect of my business runs as either YAML or JSON. This blog post is YAML stored on Github, viewed as HTML using Jekyll. All the companies, services, tooling, building blocks, patents, and other components of my research all live as YAML on Github. Any API I design is born, and lives as an OpenAPI YAML document on Github. Sure, much of this will be imported and exported with a variety of other tools, but the YAML and JSON definition is key to every stop along the life cycle of my business, and the work that I do.

It isn’t just me. I’m seeing a big shift in how many platforms, services, and tooling operate, often with YAML, and still in many situations with JSON, XML, and CSV, at their core. Everything you do should have some sort of schema definition, providing you with a template that you can reuse, share, collaborate, and communicate around. Platforms should allow for the creation of these template schema, and enable the exporting and importing of them, opening up interoperability, and cross-platform functionality, much like APIs do in real-time using HTTP. This is what OpenAPI has done for the API lifecycle, and there should be many complementary, or even competing formats that accomplish the same, but for specific industries, and use cases.

You can see this in action over at AWS, with the ability to export your Lex bot schema for use in your Alexa skill. Sure, this is interoperability on the same platform, but it does provide one example of how YAML and JSON definitions can help us share, reuse, and develop common templates for not just APIs, but also the clients, tooling, and other platforms we are engaging with. You’ll see this expand to every aspect of tech as continuous integration and deployment takes root, and Github continues its expansion beyond startups, into the enterprise, government, and other institutions. Along the way there will be a lot of no-name schema finding success, but we will also need a lot more standardization and maturing, as we’ve seen with OpenAPI, for all of this to work.

I hear a lot of grumbling from folks when it comes to YAML. I get it, I had the same feeling. It also reminds me of how I felt about JSON when it first emerged. However, I find YAML to be very liberating, free of brackets, slashes, and other delimiters, but I also find it is just one format, and I should always be supporting JSON, XML, and CSV when it comes to one-dimensional schema. I don’t find it a challenge to convert between the formats, or keep some things one-dimensional to bridge to my spreadsheet-oriented users. I actually feel it helps me think outside of my bubble. I enjoy rifling through the YAML and JSON templates I find on Github from a variety of operations, defining their bots, conversational interfaces, visualizations, CI/CD, configuration, clients, and other aspects of operations. Even if I’m never using them, I find it interesting to learn how others define what they are up to.
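
As a small illustration of that round-tripping between formats, here is a sketch that loads a made-up bot definition from YAML, emits it as JSON, and dumps it back to YAML. It assumes the PyYAML package is installed; the definition itself is an invented example, not any platform's schema.

```python
# A minimal sketch of round-tripping a definition between YAML and JSON.
# Assumes the PyYAML package is installed (pip install pyyaml); the bot
# definition itself is a made-up example, not any platform's schema.
import json
import yaml

yaml_definition = """
name: api-evangelist-news-bot
description: Posts curated API news links to a channel once a day
triggers:
  - type: schedule
    cron: "0 14 * * *"
actions:
  - type: http
    method: GET
    url: https://example.com/api/news
"""

definition = yaml.safe_load(yaml_definition)                  # YAML -> Python dict
as_json = json.dumps(definition, indent=2)                    # dict -> JSON for platforms that want it
back_to_yaml = yaml.safe_dump(definition, sort_keys=False)    # and back again

print(as_json)
print(back_to_yaml)
```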


Which Platforms Have Control Over The Conversation Around Their Bots

I spend a lot of time monitoring API platforms, thinking about different ways of identifying which ones are taking control of the conversation around how their platforms operate. One example of this out in the wild can be found when it comes to bots, by doing a quick look at which of the major bot platforms own the conversation around this automation going on via their platforms.

First, take a look at Twitter, by doing a quick Google search for Twitter bots.

Then take a look at Facebook, by doing a quick Google search for Facebook bots.

Finally, take a look at Slack, by doing a quick Google search for Slack bots.

It is pretty clear who owns the conversation when it comes to bots on their platform. While Twitter and Facebook both have information and guidance about doing bots, they do not own the conversation like Slack does, something that is reflected in the search engine placement. It also sets the tone of the conversation that is going on within each community, and defines the types of bots that will emerge on the platform.

As I’ve said before, if you have a web or mobile property online today, you need to be owning the conversation around your API, or someone eventually will own it for you. The same goes for automation around your platform, and the introduction of bots, and automated users, traffic, advertising, and other aspects of doing business online today. Honestly, I wouldn’t want to be in the business of running a platform these days. It is why I work so hard to dominate and own my own presence, just so that I can beat back what is said about me, and own the conversation on at least Google, Twitter, LinkedIn, Facebook, and Github.

It seems to me that if you are going to enable automation on your platform via APIs, it should be something that you own completely, and you should make sure you provide some pretty strong guidance and direction.


Observability In Botnet Takedown By Government On Private Infrastructure

I'm looking into how to make [API security](http://security.apievangelist.com) more transparent and observable lately, and looking for examples of where companies, institutions, organizations, politicians, and the government are calling for observability into wherever APIs are impacting our world. Today's example comes out of [POLITICO's Morning Cybersecurity email newsletter](http://www.politico.com/tipsheets/morning-cybersecurity), which has become an amazing source of daily information for me, regarding transparency around the takedown of bot networks.

_"If private companies cooperate with government agencies - for example, in the takedown of botnets using the companies' infrastructure - they should do so as publicly as possible, [argued the Center for Democracy & Technology](https://www.ntia.doc.gov/files/ntia/publications/cdt-ntia-nistcommentsbotnetsfinal.pdf). "One upside to compulsory powers is that they presumptively become public eventually, and are usually overseen by judges or the legislative branch," CDT argued in its filing. "Voluntary efforts run the risk of operating in the dark and obscuring a level of coordination that would be offensive to the general public. It is imperative that private actors do not evolve into state actors without all the attendant oversight and accountability that comes with the latter."_

I've been tracking the transparency statements and initiatives of all the API platforms. At some point I'm going to assemble the common building blocks of what is needed for executing platform transparency, and I will be including these asks of the federal government. As the Center for Democracy & Technology states, this relationship between the public and private sector when it comes to platform surveillance needs to be more transparent and observable in all forms. Bots, IoT, and the negative impacts of API automation need to be included in the transparency disclosure stack. If the government is working with platforms to discover, surveil, or shut down bot networks, there should be some point at which operations are shared, including the details of what was done.

We need platform [transparency](http://transparency.apievangelist.com) and [observability](http://observability.apievangelist.com) at the public and private sector layer of engagement. Sure, this sharing of information would be time sensitive, respecting any investigations and laws, but if private sector infrastructure is being used to surveil and shut down U.S. citizens, there should be an accessible, audit-able log for this activity. Of course it should also have an API allowing auditors and researchers to get all relevant information. Bots are just one layer of the API security research I'm doing, and the overlap in the bot world when it comes to API transparency, observability, and security is an increasingly significant vector when it comes to policing and [surveillance](http://surveillance.apievangelist.com), but also when it comes to protecting the privacy and safety of platform people (citizens).
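
As a sketch of what that kind of audit-able disclosure could look like, here is a hypothetical takedown record. None of these fields exist today; they are just one way the details of a cooperative takedown could be published in a machine-readable way.

```python
# A sketch of what a machine-readable botnet takedown disclosure record could
# look like if platforms published one. Every field name here is hypothetical;
# no agency or platform publishes this today.
import json
from datetime import date

takedown_disclosure = {
    "id": "takedown-2017-000042",
    "disclosed_on": date(2017, 12, 1).isoformat(),
    "requesting_agency": "redacted-until-investigation-closes",
    "legal_basis": "voluntary cooperation",      # vs. warrant, court order, etc.
    "accounts_affected": 1250,
    "infrastructure": ["api.example-platform.com"],
    "summary": "Automated accounts amplifying spam links were suspended.",
    "oversight": {
        "judicial_review": False,
        "public_report_url": "https://example-platform.com/transparency/2017",
    },
}

print(json.dumps(takedown_disclosure, indent=2))
```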


100K View Of Bot Space From The API Evangelist Perspective

I had a friend ask me for my thoughts on bots. It is a space I tend to rant about frequently, but isn’t an area where I’m moving forward any meaningful research, yet it does seem to keep coming up and refuses to ever go away. I think bots are a great example of yet another thing that us technologists get all worked up about and think is the future, when in reality there will only be a handful of viable use cases, and bots will cause more harm than they will ever do good, or fully enjoy satisfactory mainstream adoption.

First, bots aren’t new. Second, bots are just automation. Sure, there will be some useful automation implementations, but more often than not, bots will wreak havoc and cause unnecessary noise. Conveniently though, no matter what happens, there will be money to be made deploying and defending against each wave of bot investment. Making bots is pretty representative of how technology is approached in today’s online environment. Lots of tech. Lots of investment. Regular waves. Not a lot of good sense.

Top Bot Platforms

Ok, where can you deploy and find bots today? These are the dominant platforms where I am seeing bots emerge:

  • Twitter - Building bots on the public social media platform using their API.
  • Facebook - Building Facebook messenger bots to unleash on the Facebook Graph.
  • Slack - Building more business and productivity focused bots on Slack.

There are other platforms like Telegram, and folks developing interesting Github bots, but these three platforms dominate the conversation when it comes to bots in 2017. Each platform brings its own tone when it comes to what bots are capable of doing, and who is developing the bots. Another important thing to note across these platforms is that Slack is really the only one working to own the bot conversation on their platform, while Facebook and Twitter allow the developer community to own the conversation about exactly what bots are.

Conversational Interfaces

When it comes to bots, and automation, I’m always left thinking more broadly about other conversational interfaces like Siri, or more specifically Amazon Alexa. The Amazon Alexa platform operates on a similar level to Slack when it comes to providing developers with a framework, and tooling to define and deliver conversational interfaces. Voice just happens to be the interface for Amazon, where the chat and messaging window is the interface for Slack, as well as Twitter and Facebook. Alexa is a bot, consuming API resources alongside the other popular definitions of what is a bot on messaging and social channels–expanding the surface area for how bots are deployed and engaged with in 2017.

Bots And APIs

To me, bots are just another client application for APIs. In the early days, APIs were about syndicating content on the web, then they were used to deliver resources to mobile applications, and now they are delivering content, data, and increasingly algorithms to devices, conversational interfaces, signage, automobiles, home appliances, and on and on. When any user asks a bot a question, the bot is making one or many API calls to get the sports statistic, news and weather report, or maybe make the purchase of a product. There will be many useful scenarios in which APIs will be able to deliver critical resources to conversational interfaces, but like many other client implementations, there will be many, many bad examples along the way.
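
Here is a rough sketch of that client relationship, with a bot matching a simple intent and then calling an API to get its answer. The weather endpoint and its response fields are made up for illustration, and it assumes the requests package is installed.

```python
# A sketch of a bot as just another API client: a message comes in from a chat
# platform, the bot calls an API, and it replies with a bite-size answer. The
# weather endpoint and its response shape are made up for illustration.
import requests


def handle_message(text: str) -> str:
    """Very naive intent matching followed by an API call."""
    if "weather" in text.lower():
        resp = requests.get(
            "https://example.com/api/weather",   # hypothetical API
            params={"city": "Portland"},
            timeout=5,
        )
        resp.raise_for_status()
        data = resp.json()
        return f"It is {data['temp_f']}F and {data['conditions']} in Portland."
    return "Sorry, I only know how to look up the weather."


if __name__ == "__main__":
    print(handle_message("what's the weather like?"))
```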

Algorithmic Shift

In 2017, the API space is shifting gears from primarily data and content based APIs, to a more algorithmic focus. Artificial intelligence, machine learning, deep learning, cognitive, and other algorithmically fueled interfaces are emerging, wrapped in APIs, intent on delivering “smart” resources to the web, mobile, and conversational interfaces. We will continue to see an overwhelming amount of discussion at the intersection of bots, APIs, and AI in coming years, with very little actual results delivered–regardless, there will be lots of money to be made by a few, along the way. Algorithms will play a central role in ensuring the “intelligence” behind bots stays a black box, and sufficiently passes as at least magic, if not entirely passed off as comparable to human intelligence.

Where Will The Bot Money Be?

When it comes to making money with bots, there will only be a couple of value creation centers. First, the platforms where bots operate will do well (most of them)–I am not sure they all will generate revenue directly from bots, but they will ensure bots are driving value that is in alignment with platform revenue goals. Next, defensive bot solutions will generate sufficient amounts of revenue identifying and protecting businesses, institutions, and government agencies from the bot threat. Beyond that, venture capital folks will also do well investing in both the bot disruption, and bot defensive layers of the conversation–although VCs who aren’t directly involved with bot investment will continue to be duped by fake users, customers, and other bot generated valuations, leaving bot blemishes on their portfolios.

Who Will Lose With Bots?

Ultimately it is the rest of us who will come out on the losing side of these “conversations”. Our already very noisy worlds will get even noisier, with more bot chatter in the channels we currently depend on daily. The number of humans we engage with on a daily basis will decrease, and the number of frustrating “conversations” we find ourselves stuck in will increase. Everything fake will continue to inflate, and find new ways to morph, duping many of us in new and exciting ways. Markets will be noisy, emotional, and always artificially inflated. Elections will continue to be just an outright bot assault on voters, leaving us exhausted, numb, and pretty moldable by those who have the biggest bot arsenals.

Some Final Thoughts On Bots

I am continuing to see interesting bots emerge on Twitter, Facebook, Slack, and other channels I depend on like Github. I have no doubts that bots and conversational solutions will continue to grow, evolve, and result in a viable ecosystem of users, service providers, and investors. However, I predict it will be very difficult for bots to ever reach an acceptable mainstream status. As we’ve seen in every important conversation we are having online today, some of the most badly behaved amongst us always seem to dominate any online conversation. Why is this? Bots. We will see this play out in almost every business sector.


Bot Observability For Every Platform

I lightly keep an eye on the world of bots, as APIs are used to create them. In my work I see a lot of noise about bots, usually in two main camps: 1) pro-bot - bots are the future, and 2) anti-bot - they are one of the biggest threats we face on the web. This is a magical marketing formula, which allows you to sell products to both sides of the equation, making money off of bot creation, as well as bot identification and defense–it is beautiful (if you live by disruption).

From my vantage point, I’m wondering why platforms do not provide more bot observability as a part of platform operations. There shouldn’t need to be third-party services that tell us which accounts are bots; the platform should tell us by default which users are real and which are automated (you know you know). Platforms should embrace automation, providing services and tooling to assist in its operation, including actual definitions of what is acceptable, and what is unacceptable, bot behavior. Then they should actually police this behavior, and be observable in their actions around bot management and enforcement.
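
A sketch of what default bot observability from a platform could look like, both per-account and in aggregate. The endpoint shape, field names, and numbers are all hypothetical, not any existing platform API.

```python
# A sketch of what default bot observability from a platform could look like:
# a public endpoint that reports, for any account, whether it is designated as
# automated and what enforcement has happened. Fields and numbers are invented.
example_response = {
    "account": "weather-updates",
    "automated": True,
    "designated_by_owner": True,            # disclosed voluntarily vs. detected
    "acceptable_use_policy": "https://example-platform.com/bot-rules",
    "enforcement_history": [
        {"date": "2017-10-02", "action": "rate_limited", "reason": "burst posting"}
    ],
}

# Aggregate numbers a platform could publish alongside per-account data.
platform_bot_stats = {
    "total_accounts": 250_000_000,
    "designated_bot_accounts": 3_100_000,
    "suspected_undesignated_bots": 9_400_000,
    "suspended_this_quarter": 1_200_000,
}

for field, value in platform_bot_stats.items():
    print(f"{field}: {value:,}")
```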

It feels like this is just another layer of technology that is being bastardized by the money that flows around technology so easily. Investment in lots of silly, useless bots. Investment in bot armies that inflate customer numbers, advertising, and other ways of generating attention (to get investment), and generating revenue. It feels like Slack is the only leading bot platform that has fully embraced the bot conversation. Facebook and Twitter lightly reference the possibilities, and have made slight motions when it comes to managing the realities of bots, but when you Google “Twitter Bots” or “Facebook Bots”, neither of them dominates the conversation around what is happening–which is very telling about how they view the world of bots.

Slack has a formal bots directory, and has defined the notion of a bot user, separating them from users–setting an example for bot developers to disclose who is a bot, and who is not. They talk about bot ethics, and rules for building bots, and do a lot of storytelling about their vision for bots. It provides a pretty strong start towards getting a handle on the explosion of bots on their platform–taking the bull by the horns, owning the conversation, and setting the tone.

I’d say that Slack has a clearer business model for bots–not that people are actually going to pay for your bot (they aren’t), but a model is present. You can smell some revenue strategies on Facebook, but it just feels like all roads lead to Facebook, and the advertising partners there. I’d say Twitter has no notion of a bot-like business model for developers. This doesn’t mean that Facebook and Twitter bots don’t generate revenue for folks targeting Facebook and Twitter, or play a role in influencing how money flows when it comes to eyeballs and clicks. Indirectly, Twitter and Facebook bots are making folks lots of money, it is just that the platforms have chosen not to be observable when it comes to their bot practices and ecosystems.

Platform observability makes sense not just for the platform and bot developers; as Slack demonstrates, it makes sense for end-users as well. Incentivizing bots that generate value, instead of mayhem. I’m guessing advertising-driven Facebook and Twitter have embraced the value of mayhem–with advertising being the framework for generating their revenue. Slack has more of a product, with customers they want to make happy. With Facebook and Twitter the end-users are the product, so the bot game plays to a different tune.


A Bot That Actually Does Useful Things For Me

I’m not a fan of the unfolding bot universe. I get it, you can do interesting things with them–the key word being interesting. Most of what I’ve seen done via Twitter, Facebook, and Slack bots really isn’t that interesting. Maybe it’s that I’m old and boring, or maybe it’s because people aren’t doing interesting things. When you hear me complain about bots, just remember it isn’t because I think the technology approach is dumb, it’s because I think the implementations are dumb.

After several dives into the world of bots, looking to understand how bots are using APIs, I’ve found some interesting Twitter bots, and an even smaller number of Slack bots I found to be useful–I have yet to find an interesting Facebook bot. Honestly, I think it is the constraints of each platform that incentivize the interesting things being done, as well as the not-so-interesting, and even dangerous things. So I find it interesting when the bot conversation moves to other platforms, bringing with it a new set of constraints, like I just saw with a new bot out of Hashicorp.

Hashicorp’s Bot does mundane Github janitorial work for me! This is automation (aka bot) activity I can get behind. I feel like much of the Slack automation I’ve seen is doing things that wouldn’t actually benefit me, and would be creating more noise than any solution it would bring–this is due to how I use Slack, or rather how I don’t use Slack. I’m a HEAVY Github user, and there are MANY tasks that are left undone. Things like tagging repos, README files, licensing, and the other things we either forget about, or just don’t have the time for. You fire up a bot to help me with these things, my ears are going to perk up a bit when it comes to the bot conversation.
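
Here is a rough sketch of the kind of janitorial check a Github bot like this performs, listing a user's public repos and flagging the ones missing a description or license. It uses the public Github REST API unauthenticated (so heavily rate-limited), the username is a placeholder, and it assumes the requests package is installed.

```python
# A sketch of the kind of Github janitorial work described above: list a user's
# public repos and flag the ones missing a description or license. Unauthenticated,
# so treat it as illustration rather than a production bot.
import requests

GITHUB_USER = "octocat"  # placeholder account


def repos_needing_cleanup(user):
    resp = requests.get(
        f"https://api.github.com/users/{user}/repos",
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    resp.raise_for_status()
    flagged = []
    for repo in resp.json():
        missing = []
        if not repo.get("description"):
            missing.append("description")
        if not repo.get("license"):
            missing.append("license")
        if missing:
            flagged.append({"repo": repo["full_name"], "missing": missing})
    return flagged


if __name__ == "__main__":
    for item in repos_needing_cleanup(GITHUB_USER):
        print(item["repo"], "is missing:", ", ".join(item["missing"]))
```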

In the end, I just need to remember that it is not bots that are boring and dumb–people are. ;-) That includes me. I find the concept of a Github bot infinitely more valuable than a Facebook, Twitter, or Slack bot. I’m curious to see where Hashicorp takes this, and now that the concept of a Github bot is on my radar, I’m guessing I will see other examples of it in the wild. I’m hoping this is an area where we’ll see more bot development and investment, but I also understand Facebook, Twitter, and Slack have relevance in other people’s worlds, and that I’m the oddball here who finds Github a more interesting platform.


Setting The Rules For API Automation

Twitter released some automation rules this spring, laying the ground rules when it comes to building bots using the Twitter API. Some of the rules overlap with their existing terms of service, but it provides an interesting evolution in how platform providers need to be providing some direction for API consumers in a bot-driven conversational landscape.

They begin by laying the ground rules for automation using the Twitter API:

Do!

  • Build solutions that automatically broadcast helpful information in Tweets
  • Run creative campaigns that auto-reply to users who engage with your content
  • Build solutions that automatically respond to users in Direct Messages
  • Try new things that help people (and comply with our rules)
  • Make sure your application provides a good user experience and performs well — and confirm that remains the case over time

Don’t!

  • Violate these or other policies. Be extra mindful of our rules about abuse and user privacy!
  • Abuse the Twitter API or attempt to circumvent rate limits
  • Spam or bother users, or otherwise send them unsolicited messages

Twitter is just evolving their operation by providing an automation update to the Twitter rules and the developer agreement and policy, outlining what is expected of automated activity when it comes to engaging with user accounts, tweeting, sending direct messages, and other actions you take when it comes to Tweets or Twitter accounts. It provides an interesting look at the shift in API platform terms of service as the definition of what is an application continues to evolve.

While there were many automated aspects to the classic interpretation of web or mobile applications, bots are definitely bringing an entirely new meaning to what automation can bring to a platform. I think any API-driven platform that is opening up their resources to automation is going to have to run down their list of available resources and think deeply about the positive and negative consequences of automation in the current landscape. Whether it is bots, voice, iPaaS, CI, CD, or any other type of API driven automation, the business and politics of API operations are shifting rapidly, and the threats, risks, and stakes are only going to get higher.
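
As a sketch of what respecting (rather than circumventing) rate limits looks like in practice, here is an automated client that watches the rate limit headers returned with each response and sleeps until the window resets. The endpoint shown is illustrative and the request is unauthenticated, so treat this as a pattern, not working Twitter integration.

```python
# A sketch of automation that respects rate limits rather than trying to
# circumvent them: watch the x-rate-limit-* headers returned with each response
# and back off until the window resets, instead of hammering the API.
import time
import requests


def polite_get(url, **kwargs):
    resp = requests.get(url, timeout=10, **kwargs)
    remaining = int(resp.headers.get("x-rate-limit-remaining", 1))
    reset_at = int(resp.headers.get("x-rate-limit-reset", 0))
    if remaining == 0 and reset_at:
        wait = max(0, reset_at - int(time.time()))
        print(f"Rate limit exhausted, sleeping {wait} seconds until the window resets")
        time.sleep(wait)
    return resp


if __name__ == "__main__":
    # Placeholder call; a real bot would hit an authenticated Twitter endpoint here.
    polite_get("https://api.twitter.com/1.1/search/tweets.json", params={"q": "apis"})
```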


An Opportunity To Emulate Slack Buttons

Slack released their Slack Buttons last year, to help as they state "reduce the number of small yet high-frequency tasks that quietly devour a user’s time." I know folks are obsessed with voice, bot, and other conversational interfaces, but I agree with Slack, that there is a huge opportunity to help users execute common API-driven functions with a single push of a button. It is something I blog about regularly, helping folks realize the opportunity in the development of API driven, embeddable buttons, that go beyond what Slack is doing and would run anywhere on the web, in the browser, or even on mobile and other Internet connected devices.

Zapier has taken a stab at this with Push by Zapier, and they have the inventory of APIs to support it. However, what I am thinking about should be even easier, embeddable anywhere, and leverage the web technology already in use (forms, inputs, GET, POST, etc.). If in a browser it should work like bookmarklets, and if it exists within the pages of a website or application, it should work as some simple copy/paste HTML with minimum JS when necessary to avoid blockage by ad blockers and other privacy protection tooling.

I understand that it is easy to follow the pack when it comes to the latest technology trends, and I'm sure voice and bots will gain some mindshare when it comes to conversational interfaces. However, sometimes we just need the easy button for those high-frequency tasks that quietly devour our time, as Slack puts it. As I see JSON-LD embedded into more web pages, further pleasing the Google search algorithm, there would also be more semantic opportunities for browser buttons to engage with the data and content within web pages in a more meaningful, and context aware way.

API driven buttons and similar embeddables and browser tooling are such an interesting opportunity that I would tackle it myself, but I'm so resistant to anything that might take me away from my work as the API Evangelist, which is why I'm putting it out here publicly. I will put more time into some single use button examples that might start the conversation for some, but I recommend taking a look at what Slack and Zapier have done to get started. As the pace of technology continues its blind march forward at the request of VCs, I'm confident that there is huge opportunity for simple solutions like buttons to make a significant impact--let me know if you'd like to know more.


Asynchronous Conversational Interfaces For Us Anti-Social Folks

I get why people are interested in voice-enabled solutions like Alexa and Siri. I'm personally not a fan of speaking to get what I want, but I get the attraction for others. Similarly, I get why people are interested in bot enabled solutions like Facebook and Slack are bringing to the table, but I'm personally not a fan of the human-led noise in both of these channels, let alone automating this mayhem with bots.

In short, I'm not 100% on board that voice and bots will be as revolutionary as promised. I do think they will have a significant impact and are worthy of paying attention to, but when it comes to API driven conversational interfaces, I'm putting my money on push driven approaches to making API magic happen. Approaches like Push by Zapier, and Webtask.io, where you can initiate a single API driven event, or a chain of them, from the click of a button in the browser, in a web page, on my mobile phone, or hell, using the Amazon Dash button approach.

These web tasks operate in an asynchronous way, making them more conversational-esque. Allowing those of us who are anti-social, have adequately air-gapped our social and messaging channels, and haven't fully subscribed to the surveillance economy, alternate solutions. These mediums could even facilitate a back and forth, passing machine readable values, until the desired result has been achieved. Some conversations could be predefined or saved, allowing me to trigger them using a button at any point (i.e. reorder that product from Amazon, retweet that article from last week). I'm not saying I don't want to have an API-enabled conversation, I'm just not sure I want a speaker or bot always present to get what I need to get done in my day.
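
To show what one of those saved, asynchronous "conversations" could look like, here is a sketch of a named chain of API calls that can be triggered from a button whenever I want. The endpoints and payloads are hypothetical, and it assumes the requests package is installed.

```python
# A sketch of a saved, asynchronous "conversation": a named chain of API calls
# that can be triggered from a button whenever I want, rather than through a
# voice assistant or chat bot. The endpoints are hypothetical.
import requests

SAVED_CONVERSATIONS = {
    "reorder-coffee": [
        {"method": "POST", "url": "https://example.com/api/orders",
         "json": {"sku": "coffee-beans-2lb", "quantity": 1}},
    ],
    "reshare-last-weeks-article": [
        {"method": "GET", "url": "https://example.com/api/posts/latest"},
        {"method": "POST", "url": "https://example.com/api/social/share",
         "json": {"channel": "twitter"}},
    ],
}


def run_conversation(name):
    """Play back each step of a saved conversation, in order."""
    for step in SAVED_CONVERSATIONS[name]:
        resp = requests.request(
            step["method"], step["url"], json=step.get("json"), timeout=10
        )
        print(step["method"], step["url"], "->", resp.status_code)


if __name__ == "__main__":
    run_conversation("reorder-coffee")
```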

I understand that I am not the norm. There are plenty of folks who have no problem with devices listening around their home or business, and are super excited when it comes to engaging with half-assed artificial intelligence which accomplishes basic tasks in our world(s). But I can't be that far out in left field. I'm thinking the landscape for conversational interfaces will be diverse, with some voice, some chat, and hopefully also some asynchronous approaches to having conversations that can be embedded anywhere across our virtual and physical worlds.


Thinking More About API Driven Conversational Interfaces

I am spending a lot of time thinking about conversational interfaces, and how APIs are driving the voice and bot layers of the space. While I am probably not as excited about Siri, Alexa and the waves of Slack bots being developed as everyone else, I am interested in the potential when it comes to some of the technology and business approaches behind them.

When it comes to these "conversational interfaces", I think voice can be interesting, but not always practical for actually interacting with everyday systems--I just can't be talking to devices to get what I need done each day, but maybe that is just me. I'm also not very excited about the busy, chatty bots in my Slack channels, as I'm having trouble even dealing with all the humans in there, but then again maybe this is just me. 

I am interested in the interaction between these conversational interfaces and the growing number of API resources I track on, and how the voice and bot applications that are done thoughtfully might be able to do some interesting things and enable some healthy interactions. I am also interested in how webhooks, iPaaS, and push approaches like we are seeing out of Zapier can influence the conversation around conversational interfaces.

Conceptually I can be optimistic about voice enablement, but I work in the living room across from my girlfriend, so I'm just not going to be talking a lot to Siri, Alexa or anyone else...sorry. Even if I move back to our home office, I'm really not going to be having a conversation with Siri or Alexa to get my work done, but then again maybe it's just me. I'm also really aware of the damaging effects of having too many messaging, chat, and push notification channels open, so the bot thing just doesn't really work for me, but then again maybe it's me.

I am more of a fan of asynchronous conversations than I am of the synchronous variety, which I guess could be more about saved conversations, crafted phrases or statements that run as events triggered by different signals, or even by me when I need them via my browser--like Push by Zapier does. I see these as conversations that enable a single API-enabled event, or a series of them, to occur. This feels more like orchestration, or scripted theater, which accomplishes more of what I'm looking to do than synchronous conversations would.

Anyways, just some exercising of my brain when it comes to conversational interfaces. I know that I'm not the model user that voice and bot enablement will be targeting with their services, but I can't be all that out in left field (maybe I am). Do we really want to have conversations with our devices, or the imaginary elves that live on the Internet in our Slacks and Facebook chats? Maybe for some things? What I'd really like to see is a number of different theaters where I can script and orchestrate one-time, and recurring, conversations with the systems and services I depend on daily, with the occasional synchronous engagement with myself or other humans, when it is required.


Flow Abstraction And Intent Layer On Top Of APIs To Feed The Bots

I was reading an interesting post on developing bots from Meya, a bot platform provider, which I think describes the abstraction layer between what we are calling bots, and what we know as APIs. I have been trying to come up with a simple way of quantifying the point where bots and APIs work together, and Meya's approach to flow and intent provides me with a nice scaffolding.

The flow step of their bot design rationale provides a nice way to think about how bots will work, breaking out each step of the bot interaction in plain English. They use a YAML format they call Bot Flow Markup Language, or BFML, to describe the flows, comparing BFML to HTML, with this definition:

HTML is spatial, and BFML is temporal. HTML determines where UI exists, and BFML determines when UI exists.

The second part of their bot design rationale involves Intents, providing this additional definition:

If BFML is like HTML, then intents are like URLs.

According to Meya, "intents can be keywords, regular expressions, and natural language models as you get more sophisticated". This seems to be where the more human aspect of what is getting done here is defined, mapping each intent to a specific flow, which can execute one or many steps to potentially satisfy the intent.

The third step is components, which is where the direct API connection becomes clear. If you look at their example, in the component they are simply making a call to the Chuck Norris joke API, returning the results as part of the flow. Each part of the flow calls its targeted component, and each component can make a GET, POST, PUT, PATCH, or DELETE to an API that provides the data, content, or algorithm behind the component.
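
This is not Meya's actual BFML syntax, but here is a generic sketch of the three layers described above: an intent maps to a flow, the flow runs steps in order, and a component step makes the API call. The endpoints and field names are invented, and it assumes the requests package is installed.

```python
# A generic sketch of the intent -> flow -> component layering described above,
# not Meya's BFML. The joke endpoint and its response fields are hypothetical.
import requests

INTENTS = {
    "tell me a joke": "joke_flow",     # intent -> flow (like a URL, per Meya's analogy)
}

FLOWS = {
    "joke_flow": [
        {"type": "component", "method": "GET",
         "url": "https://example.com/api/jokes/random"},   # hypothetical joke API
        {"type": "say", "template": "Here you go: {joke}"},
    ],
}


def run_flow(flow_name):
    context = {}
    for step in FLOWS[flow_name]:
        if step["type"] == "component":
            # The component is where the API call happens.
            resp = requests.request(step["method"], step["url"], timeout=10)
            context["joke"] = resp.json().get("joke", "...")
        elif step["type"] == "say":
            print(step["template"].format(**context))


def handle_utterance(text):
    flow = INTENTS.get(text.lower().strip())
    if flow:
        run_flow(flow)
    else:
        print("Sorry, I don't have a flow for that yet.")


if __name__ == "__main__":
    handle_utterance("tell me a joke")
```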

This provides me with a beginning scaffolding to think about how bot platforms are constructing the API abstraction layer behind bot activity. I will be going through other bot platforms to understand each individual approach. Bots to me are just another endpoint for the API economy, and like mobile phones, we can have the API layer be shadowy and dark, or we can have it be more transparent and standardized, with platforms sharing their approach like Meya does.

I am picturing a world where we share open definitions of bot flows, and the intents that trigger them, in YAML. There will be marketplaces of flows, sharing the logic behind what is working (or not) within the bot community. These flows shouldn't be a company's secret sauce, any more than the API definitions that they employ within each function are. The secret sauce should be the data, content, and algorithms behind each API that is called as part of any flow, designed to satisfy a specific intent.

When providers like Meya share their approach via their blog it gives me the opportunity to learn from it. It also gives me the opportunity to explore, and compare with the rest of my research, without having to always fire up their platform--which I do not always have the time to do (I wish I did). This helps me push forward my bot research in baby steps, derived from people who are doing interesting things in the space and are willing to share with the community--which is what API Evangelist is all about.


Beyond Mobile: API Ready For iPaaS, Voice, and Bots

I enjoy being able to switch gears between all the different areas of my API research. It helps me find the interesting areas of overlap and potential synchronicity in how APIs are being put to work. After thinking about the API abstraction layer present in Meya's bot platform, I was reading about Clearbit's iPaaS integration layer with Zapier. Zaps are just like the components employed by Meya, and Clearbit walks us through delivering intended workflows with the valuable APIs they provide, executed via Zapier's iPaaS service.

Whether it's skills for voice, intents for bots, or triggers for iPaaS, an API is delivering the data, content, or algorithmic response required for these interactions. I've been pushing for API providers to be iPaaS ready, working with providers like Zapier, for some time. I predict you'll find me showcasing examples of API providers sharing their voice and bot integration solutions, just like Clearbit has with their iPaaS solutions, in the future.

I would say that even before API providers think about the Internet of Things, they should be thinking more deeply about iPaaS, voice, and bots. Not that all these areas will be relevant, or valuable to your API operations, but they should be considered. If you have the resources, they might provide you with some interesting ways to make your API more accessible to non-developers--as Clearbit notes in the opening of their blog post.

When it comes to skills, intents, and iPaaS workflows, I am thinking we are going to have to be more willing to share our definitions (broken record), like we see Meya doing with their Bot Flow Markup Language (BFML) in YAML. I will have to do some more digging to see how Amazon is working to make Alexa Skills more shareable and reusable, as well as take another look at the Zapier API to understand what is possible--I took a look at it back in the spring, but will need a refresher.

While the world of voice and bots API integration seems to be moving pretty fast, I predict it will play out much like the iPaaS world has, and take years to evolve, and stabilize. I'm still skeptical about the actual adoption of voice and bots, and it all living up to the hype, but when it comes to iPaaS I'm super hopeful about the benefits to actual humans--maybe if we consider all of these channels together, we can leverage them all equally as common tools in our API integration toolbox.


The Bot Platform That Operates Like Alexa Will Win

I'm going through Amazon's approach to their Alexa voice services, and it is making me think about how bot platforms out there should be following their lead when it comes to crafting their own playbook. I see voice and bots in the same way that I see web and mobile--they are just one possible integration channel for APIs. They each have their own nuances of course, but as I'm going through Amazon's approach, there are quite a few lessons here on how to do it correctly--that apply to bots.

Amazon's approach to investment in developers on the Alexa platform and their approach to skills development should be replicated across the bot space. I know Slack has an investment fund, but I don't see the skills development portion present in their ecosystem. Maybe it's in there, but it's not as prominent as Amazon's approach. Someday, I envision galleries of specific voice and bot skills like we have application galleries today--the usefulness and modularity of these skills will be central to each provider's success (or failure).

I had profiled Slack's approach before I left for the summer, something I will need to update as it stands today. I will keep working on profiling Amazon's approach to Alexa, and put together both as potential playbook(s). I would like to eventually be able to lay them side by side and craft a common definition that could be applied in both the voice API, as well as the bot API sector. I need to spend more time looking at the bot landscape, but currently I'm feeling like any bot platform that can emulate Amazon's approach is going to win at this game--like Amazon is doing with voice.


API Providers Could Add A Page To Showcase Their Bots

I am coming across more API providers who have carved off specific "skills" derived from their API, and are offering them up as part of the latest push to acquire new users on Slack or Facebook. Services like Github, Heroku, and Runscope that API providers and developers are putting to work increasingly have bots they employ, extending their API driven solutions to Slack and Facebook.

Alongside having an application gallery, and having an iPaaS solution showcase, maybe it's time to start having a dedicated page to showcase the bot solutions that are built on your API. Of course, these would start with your own bot solutions, but like application galleries, you could have bots that were built within your community as well.

I'm not going to add a dedicated bot showcase page until I've seen at least a handful in the wild, but I like documenting these things as I think of them. It gives me some dates to better understand at which point certain things in the API universe began expanding (or not). Also, if you are doing a lot of bot development around your API, or maybe your community is, it might be the little nudge you need to be one of the first APIs out there with a dedicated bot showcase page.


Staying True To My API Core And Seeing Bots As Just One More Design Constraint

Over the last five years many of us have been pushing forward our API design skills to deliver valuable resources to mobile apps. The multi-channel opportunity for delivering data, content, and other resources to not just websites, and web applications, but also to iPhone, Android, Windows, and other mobile platforms is clear. Alongside delivering web apps to multiple browsers (IE, Chrome, and Firefox), we had to learn to deliver to an increasingly diverse mobile ecosystem (iOS, Android, Windows).

When you have an API core, crafting new endpoints that speak to new channels is not as daunting of a task. I'm thinking through how to deliver API Evangelist knowledge into a Slack channel, Facebook Messenger discussions, and via Alexa voice for use at our APIStrat Conference in Boston this fall. Whether I am creating bots, or delivering new skills to voice enablement, I'm just adding new design constraints to my already existing stack of things I consider when I'm crafting my API stack(s).
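
As a sketch of what treating each channel as just another design constraint looks like, here is one API core resource rendered for two different clients, a chat message and a spoken response. The response shapes are simplified illustrations, not the platforms' full payload formats.

```python
# A sketch of one API core serving multiple channels: the same resource is
# fetched once, then rendered to fit each client's constraints (a short chat
# message for Slack, a speakable sentence for a voice skill). The shapes below
# are simplified illustrations, not real platform payload formats.
def fetch_latest_story():
    # Stand-in for a call to the core API; normally this would be an HTTP request.
    return {
        "title": "Setting The Rules For API Automation",
        "url": "https://apievangelist.com/example-post",
        "summary": "Twitter published automation rules for bots built on its API.",
    }


def render_for_slack(story):
    # Chat constraint: short text plus a link.
    return {"text": f"New on API Evangelist: {story['title']} - {story['url']}"}


def render_for_voice(story):
    # Voice constraint: no links, just a short speakable sentence.
    return {"speech": f"The latest post is called {story['title']}. {story['summary']}"}


if __name__ == "__main__":
    story = fetch_latest_story()
    print(render_for_slack(story))
    print(render_for_voice(story))
```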

I can't help but be in tune with the frenzy around bots, and the interesting things AWS is doing within the Alexa ecosystem. I do not feel like it is a distraction from my core business, as I'm just applying some new design constraints to what I am already doing. I'm not chasing entirely new things, I'm just responding to, and participating in, the latest shifts in technology.

I don't feel like any of the 50 areas of the API space that I'm keeping an eye on are distractions, as long as I stay true to my API core, and they help me critically think through, and apply new design constraints to the core business value I am already bringing to the table.


It Will Be All About Doing Bots, Selling Picks & Shovels, Providing Platforms, Or The Anti-Bot Technology

As I monitor the wacky world of bots that is unfolding, I am beginning to see some pretty distinct pools of bot focused implementations, which tell me two things: 1) There will be a lot of activity in this space in the coming year & 2) It is going to be a shit show helluva ride. 

If you haven't picked up on it yet, everyone is talking about bots (even me). It's the new opportunity for startups, investors, and consumers! Whether it's Twitter, Slack, or any of your other preferred messaging channels, bots are invading. If you believe what you read on the Interwebz, the bot evolution is upon us--let's all get to work.

First, you can build a bot. This is by far the biggest opportunity around, by developing your own sentient being, that can deliver anyone sports stats and financial tips in their Slack channel. Every techie is busy sketching out their bot design, and firing up their welder in the workshop. Of course I am working on my own API Evangelist bot to help do my evil bidding, because I am easily influenced by trends like this.

If building bots isn't in your wheelhouse, then you can always sell the picks & shovels to the bot builders. These will be the API driven dictionaries, file conversion, image filtering, SMS sending, flight lookup, hotel booking, and other picks and shovels that the bot builders will be needing to do what they do. You have a database of election data? Maybe a bunch of real estate and census data? Get to work selling the picks and shovels!

If you aren't building bots, or equipping the people that are, you are probably the Twitters, Slacks, Microsofts, and Facebooks of the world who are looking to be the platforms on which all the bots will operate, and assault the humans. Every platform that has an audience in 2016 will have a bot enablement layer by 2018, or will go out of business by 2020. Not really, but damn, that sounded so good, I'm going to keep it.

The best part of diving deeper into the world of bots is that you don't have to look too far before you realize this is nothing new. There have been chatbots, searchbots, scrapebots, and formbots for many, many years. Which really completes the circle of life when you realize there is already a thriving market for anti-bot technology, as demonstrated in this email I received the other day:

I wanted to follow up with you and see if you might benefit from an easy and accurate way to protect your website from bad bots, API abuse and fraud.

With Distil Networks, you can put an immediate stop to competitive data mining, price scraping, transaction fraud, account hijacking, API abuse, downtime, spam and click fraud. 

Website security can be a bit overwhelming, especially with bot operators becoming more sophisticated.

We’ve put together eight best practices for fighting off bot intrusions to give you a solid foundation and complete understanding when evaluating bot mitigation vendors.

Bots have been delivering value, and wreaking havoc for some time now. With this latest support from platforms, new attention from VC's, and a fresh wave of bot builders, and their pick and shovel merchants, I think we will see bots reach new heights. While I would love for these heights to be a positive thing, I'm guessing with the amount of money being thrown at this, it will most likely be a cringe worthy shit show. 

Along with voice enablement, I will be tracking on the world of bots, partly because I can't help myself (very little self control), but mostly because I think the possibilities are significantly increased with the number of API driven resources that are available in 2016. The potential for more interesting, and useful bot implementations, if done right, is pretty huge when you consider the wealth of high value data, content, and algorithmic resources available to the average developer.

I am working to remain optimistic, but will not be holding my breath, or expecting the best in all of this.


No We Do Not Build Open Source Software, We Build Bots That Make It Look Like You Do #DesignFiction

If you want to stay relevant in the modern software industry, you need to be sending all the right signals your customers will be looking for. One of these signals is showing that your company supports the concept of open source software. To get government contracts these days, you often have to provide open source software possibilities -- even if your core offering is proprietary, or operates in the cloud.

It is always cool to do open source, but sadly it is also something that isn't in the DNA of every company. Without an existing team, most companies are faced with the costly challenge of assembling the right people to deliver an open source offering. This is where we come in, with our fully automated, open source software community bot, where you set your spend, and we take care of everything else.

Our bots will slowly build the software, write blog posts, submit bug tickets, and manage community forums. We do not really build open source software solutions, we provide a bot army that will put on the show of an open source community, within your domain. Most of your customers will never venture into the open source side of your operations, but for those that do, we can put on a good show. To the moderately engaged individual, your open source software community will send all the right signals. There will be commits, conversations, pseudo implementations, and even active Twitter conversations, followers, and much, much more.

Our Open Source Software Community Bots™ will take care of all the heavy lifting for your company. If you sign up for a 3 year contract today, we throw in a free OSS Haterz™ add-on package, helping you deal with the inevitable fact that haters are gonna hate. Remember, you have a 3-5 year window to make the impact that you will need, and achieve the numbers required to achieve the exit you are looking for. Make sure you are sending the right signals, and using the latest bot-enabled tech.


Sucked Into The World Of Bots And APIs

I am slowly getting sucked into the world of bots. I've been tagging stories related to Twitter bots for some time, but it was the growing buzz of Slack bots that has really grabbed my attention. It pushed me to light up a research area, so that I can begin to look at things closer, and work to understand the common building blocks, like I do for the other areas of the API space.

The world of bots intrigues me, from the perspective of how APIs can be used to execute bots, but also how they provide the valuable resources needed to deliver bot functionality. I feel like the list of categories of available Slack bots is somewhat telling of the business potential for bots.

While I find Twitter bots creative and interesting, there are also many I have found annoying as hell. I've had similar responses to some of the bots I've encountered on Slack. I tend to be pretty boring when it comes to goofy shit online, so my threshold for bot silliness is low. I don't care what others do, I just don't go there that often, so you'll see me highlighting more of the business productivity bots, or the more creative ones.

Another thing I think is significant is the growing number of platforms on which bots are becoming common practice, with many of the bots operating on multiple platforms:

  • Telegram bots
  • Snapchat bots
  • Kik bots
  • Trello bots

Another interesting part of this is that not all bots are executed via an API. Some bots simply use the chosen protocol of popular messaging applications, and operate via an account setup on the platform. However, even with these implementations, APIs still come into play in providing the bot with its required skills.

I think bots are significant to API providers, and they should be working to better understand how their APIs can be used to drive bot behavior via popular messaging platforms, as well as how bots can operate on their own platforms, using either APIs or a common messaging format. There is a lot of chatter around bots online to sort through, and it will take me a while to produce some coherent research, but you can keep an eye on it as I progress via my Github research project for bots.


Wordnik API Is The Base Building Block Of Twitter Bots

I'm fascinated by the rise of Twitter bots. Little automated bundles of social media joy, built to spew mostly nonsense, but every once in a while you find nuggets of wisdom in the Twitter API fueled bots. Personally I have never built a Twitter bot, only because I don't need another distraction, and I can see the addictive potential of bot design.

My partner in crime Audrey Watters (@audreywatters), who has hands on experience building her own bots, said something interesting the other day while we were partaking in our daily IPAs: the Wordnik API is the base building block of any Twitter bot.

Sure enough, when you search "twitter bots wordnik" on the Googlez, you get a wide variety of Python, Node.js, PHP, and other recipes for building Twitter bots, with almost all of them using the Wordnik API as the core linguistics engine for the bot.
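
To make that recipe concrete, here is a minimal Python sketch of the pattern most of those tutorials follow: pull a random word and a definition from the Wordnik API, then tweet it. The API key placeholder, the response field names, and the tweepy client setup are my assumptions, not a copy of any particular recipe.

```python
# A minimal sketch of the Wordnik-powered Twitter bot pattern described above.
# Assumes you have your own Wordnik API key and an authenticated tweepy client.
import requests
import tweepy

WORDNIK_API_KEY = "YOUR_WORDNIK_KEY"  # placeholder, not a real key
WORDNIK_BASE = "https://api.wordnik.com/v4"


def random_word_with_definition():
    # Grab a random word from the Wordnik v4 randomWord endpoint.
    word = requests.get(
        f"{WORDNIK_BASE}/words.json/randomWord",
        params={"api_key": WORDNIK_API_KEY},
    ).json()["word"]

    # Look up its first definition.
    definitions = requests.get(
        f"{WORDNIK_BASE}/word.json/{word}/definitions",
        params={"api_key": WORDNIK_API_KEY, "limit": 1},
    ).json()
    text = definitions[0]["text"] if definitions else "no definition found"
    return word, text


def tweet_word(twitter_client):
    # twitter_client is an already authenticated tweepy.API instance (setup omitted).
    word, definition = random_word_with_definition()
    twitter_client.update_status(f"{word}: {definition}"[:280])
```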

There are two overlaps with other stories I have written here. 1) I feel the Twitter API holds a lot of potential as a training ground for IoT connectivity, and 2) I think APIs, and specifically the Wordnik API, and the work to come out of Wordnik, like Swagger, have a significant role to play in the future of the Internet.

I know many of us technologists have grand visions of where the Internet is going, and would like things to happen faster, but I think the "smart" everything we are looking for is going to take some time, and is something you can see playing out slowly in things like Twitter bot design. I'm sure there are many negative incarnations, but many of the Twitter bots I've seen have been interesting little nuggets of API driven (Twitter & Wordnik) chaos, driving us towards an unknown, but definitely Internet connected, online world.


Early Thoughts on Robots.json

My friend @harmophone, Director of Platform for the Klout API, wrote up a great piece before #APIStrat, called A Short Proposal for Robots.json.

This is a topic that I've been meaning to make time to discuss, but am only now finding the time. I love exchanging ideas with @harmophone, because we tend to disagree on some important API related topics. However, he is extremely knowledgeable about the space, and like me he possesses some strong opinions and is very open to lively discussion via blogs, Twitter, and on stage at events.

On the topics of web scraping, API rate limits, and API access that exist in the realm of what I call the politics of APIs, @harmophone and I couldn't disagree more on what is acceptable use in the API economy. What is interesting, though, is that within this disagreement we find affinity on the topic of robots.json.

@harmophone has a very succinct description of what a robots.json file would be:

A Robots.json file, or something like it, would contain all desired rules for retention, caching, access control, license rights, chain of custody, business models and other use cases allowed and encouraged for developers.

I think a machine readable file that provides access to all aspects of an API's terms of use is a great idea. Beyond the obvious benefits of publishing a robots.json file for API providers and consumers, I think the practice would help standardize industry best practices for API terms and conditions. In my experience most API providers just emulate what they see in the space, and if top players lead on this subject, others will follow.
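
There is no specification for robots.json, so to make the idea a little more concrete, here is a rough Python sketch of what a provider might publish alongside their API, loosely following the categories @harmophone lists. Every field name here is hypothetical.

```python
# A hypothetical robots.json a provider might publish alongside their API.
# None of these field names are standardized; they simply mirror the categories
# in @harmophone's proposal (retention, caching, access control, licensing, etc.).
import json

robots = {
    "retention": {"max_days": 30, "delete_on_user_request": True},
    "caching": {"allowed": True, "max_age_seconds": 3600},
    "access_control": {"rate_limit_per_hour": 5000, "approval_required": False},
    "license": {"content": "CC-BY-4.0", "data": "ODbL-1.0"},
    "chain_of_custody": {"attribution_required": True},
    "business_models": ["free_tier", "paid_api", "revenue_share"],
}

with open("robots.json", "w") as f:
    json.dump(robots, f, indent=2)
```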

@harmophone and I both agree that there should be a robots.json, however we come at it from two opposing viewpoints. @harmophone is coming at it from an API vendor position, and I'm coming at it from the stance of an API consumer. @harmophone wants to establish a definition of where the provider stands, setting the "rules of the road" as Twitter would say--where I want a single location to find the same definition, so that as an API consumer I can understand the "rules of engagement", and decide if I want to even waste my time, based upon their stance.

With a standardized robots.json, as a developer I could very easily set the bar for which API providers I integrate into my applications. If a company's views on "retention, caching, access control, license rights, chain of custody and business models" are not in alignment with my own business's, I can easily determine this without spending any time learning about their API--something that can be very costly and time consuming.
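
That bar-setting could be as simple as fetching each provider's robots.json and checking it against your own policy before you ever open their documentation. Again, the /robots.json location and field names below are my assumptions, reusing the hypothetical structure sketched above.

```python
# Sketch of a consumer-side check: skip any provider whose (hypothetical)
# robots.json does not meet my own policy requirements.
import requests

MY_POLICY = {
    "caching_allowed": True,   # I need to be able to cache responses
    "min_rate_limit": 1000,    # at least 1000 calls per hour
}


def provider_is_acceptable(api_root):
    try:
        rules = requests.get(f"{api_root}/robots.json", timeout=5).json()
    except (requests.RequestException, ValueError):
        return False  # no machine readable terms published

    caching_ok = rules.get("caching", {}).get("allowed") == MY_POLICY["caching_allowed"]
    rate_ok = rules.get("access_control", {}).get("rate_limit_per_hour", 0) >= MY_POLICY["min_rate_limit"]
    return caching_ok and rate_ok


# Hypothetical providers, purely for illustration.
providers = ["https://api.example-one.com", "https://api.example-two.com"]
usable = [p for p in providers if provider_is_acceptable(p)]
```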

@harmophone uses Craigslist as the lead-in for what a possible robots.json file could be used for, and the tone that it could set. While the tone of Craigslist's robots.json file might be very stern, setting what I would consider to be a very uninformed, "get off my lawn" vibe, I don't think all robots.json files would reflect this tone.

Forward thinking companies could use a robots.json to easily differentiate themselves from outdated viewpoints on what "retention, caching, access control, license rights, chain of custody and business models" mean, allowing developers to quickly identify leading players, and, through "markets", set what the prevailing definition is for the current times.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one-person show, I miss quite a bit, and I depend on my network to help me know what is going on.