API Showcase News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API showcase conversation, and that I wanted to include in my research. I'm using all of these links to better understand how providers are showcasing their APIs, and the details of how they highlight the applications and developers within their communities.

Helping Define Stream(Line)Data.io As More Than Just Real Time Streaming

One aspect of my partnership with Streamdata.io is about helping define what it is that Streamdata.io does--internally, and externally. When I use any API technology I always immerse myself in what it does, understanding every detail regarding the value it delivers, and I work to tell stories about this. This process helps me refine not just how I talk about the products and services, but also helps influence the roadmap for what the products and services deliver. As I get intimate with what Streamdata.io delivers, I'm beginning to push forward how I talk about the company.

The first thoughts you have when you hear the name Streamdata.io, and learn how you can proxy any existing JSON API and begin delivering responses via Server-Sent Events (SSE) and JSON Patch, are all about streaming and real time. While streaming data from existing APIs is the dominant feature of the service, I'm increasingly finding that the conversations I'm having with clients, and would-be clients, are more about efficiencies, caching, and streamlining how companies are delivering data. Many API providers I talk to tell me they don't need real time streaming, but at the same time they have rate limits in place to keep their consumers from polling their APIs too much--increasing friction in API consumption, rather than streamlining it.

These experiences are forcing me to shift how I open up conversations with API providers, making real time and streaming secondary to streamlining how API providers are delivering data to their consumers. Real time streaming using Server-Sent Events (SSE) isn't always about delivering financial and other data in real time. It is about delivering data using APIs efficiently, making sure only what has been updated is delivered, when it is needed--the right time. This is why you'll see me increasingly adding (line) to the Stream(line)data.io name, helping focus on the fact that we are helping streamline how companies, organizations, institutions, and government agencies are putting data to work--not just streaming data in real time.
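
To make the "only what has been updated" idea concrete, here is a minimal sketch of a JSON Patch (RFC 6902) update in Python, using the jsonpatch package--the stock quote document and the changed field are hypothetical examples, not Streamdata.io specifics:

```python
# A small sketch of an incremental update with JSON Patch (RFC 6902),
# using the jsonpatch package (pip install jsonpatch). The document and
# the changed field are hypothetical.
import jsonpatch

# The full snapshot a consumer received on its first request.
snapshot = {"symbol": "AAPL", "price": 150.10, "volume": 1000}

# Instead of re-sending the whole document, only the change is delivered.
patch = [{"op": "replace", "path": "/price", "value": 150.25}]

updated = jsonpatch.apply_patch(snapshot, patch)
print(updated)  # {'symbol': 'AAPL', 'price': 150.25, 'volume': 1000}
```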

I really enjoy this aspect of getting to know what a specific type of API technology delivers, combined with the storytelling that I engage in. I was feeling intimidated by the prospect of talking about streaming APIs with providers who clearly didn't need them. I'm not that kind of technologist. I have to be genuine in what I do, or I just can't do it. So I was pleasantly surprised to find that conversations quickly became about making things more efficient, before ever getting to the real time streaming portion of things. It makes what I do much easier, and something I can continue on a day-to-day basis, across many different industries.


Reducing Polling Of Your Existing API Using Streamdata.io

I've partnered with Streamdata.io, resulting in me getting more acquainted with their API solutions, and telling the story of that process here on API Evangelist. I figured I would dive right in and start with the basics of what Streamdata.io does--turning your existing web API into a real-time stream. Streamdata.io acts as a reverse proxy that translates REST API polling into a stream of data. Instead of constantly polling your API for changes, your API clients poll Streamdata.io and get a JSON Patch update if anything has changed, reducing the impact of the requests your clients make on your API.
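
To give a sense of what this looks like from the client side, here is a rough Python sketch of consuming an SSE stream and applying JSON Patch updates as they arrive. The proxied URL pattern and the "data"/"patch" event names follow Streamdata.io's documented convention as I understand it, and the target API and token are placeholders--verify the specifics against their docs:

```python
# A rough sketch of a client consuming a Streamdata.io-style SSE stream.
# The target API, token, and event names ("data" for the initial snapshot,
# "patch" for JSON Patch diffs) are assumptions based on their documented
# pattern--check the current docs. SSE parsing here is deliberately minimal.
import json

import jsonpatch  # pip install jsonpatch
import requests   # pip install requests

url = ("https://streamdata.motwin.net/https://api.example.com/prices"
       "?X-Sd-Token=YOUR_APP_TOKEN")  # hypothetical target API and token

doc, event, data_lines = None, None, []

with requests.get(url, stream=True,
                  headers={"Accept": "text/event-stream"}) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            field, _, value = line.partition(":")
            if field == "event":
                event = value.strip()
            elif field == "data":
                data_lines.append(value.strip())
        elif data_lines:  # a blank line terminates one SSE event
            payload = json.loads("\n".join(data_lines))
            if event == "data":     # first event: the full JSON snapshot
                doc = payload
            elif event == "patch":  # later events: only what changed
                doc = jsonpatch.apply_patch(doc, payload)
            print(doc)
            event, data_lines = None, []
```

The first event replaces the local document wholesale; everything after that is just the diff, which is where the bandwidth and processing savings come from.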

When thinking about what Streamdata.io does it is easy to get caught up in the real time and streaming nature of what they do, but the most immediate value they bring to the table is making your relationship with your API clients more efficient. Streamdata.io reduces the costs associated with operating your API, stepping in between you and your demanding clients, and acting as a buffer that reduces the load on your servers--eliminating one of the biggest headaches for API providers, and reining in the behavior of your most active, and most demanding, clients.

I'm always surprised by the answers I get from API providers when I ask them why they rate limit their APIs. I'd say that 80% of the time it is based upon reducing the overhead and impact on backend systems, and dealing with the bad behavior of API consumers. Streamdata.io provides a pretty compelling solution to help alleviate this reality of operating APIs for most API providers. It isn't just about making things real-time, it is more about cost savings, and minimizing the impact of API consumption on our back-end solutions--making rate limiting irrelevant, unless you have some other specific business need behind your decision.
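
As a back-of-envelope illustration of the savings (the client count and polling interval below are made-up numbers, not benchmarks):

```python
# Back-of-envelope math: requests your backend absorbs when clients poll
# directly, versus when only the proxy polls and clients hold open SSE
# connections. The numbers are illustrative assumptions, not benchmarks.
clients = 1_000
poll_interval_s = 10
seconds_per_day = 86_400

direct = clients * (seconds_per_day // poll_interval_s)
proxied = seconds_per_day // poll_interval_s  # one poller: the proxy

print(f"direct polling: {direct:,} requests/day")   # 8,640,000
print(f"behind a proxy: {proxied:,} requests/day")  # 8,640
```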

There are numerous other benefits Streamdata.io brings to the table, but reducing the load on your APIs is probably the most relevant to ALL of my readers who operate APIs. We can always do better when it comes to making our APIs more efficient, and Streamdata.io is a way we can do this with minimal costs, in minutes, not days, weeks, or months--which is one of the primary reasons I am partnering with Streamdata.io. It is a service I find easy to push as part of my API storytelling here on the blog, and I am happy to have become part of the team.

Disclosure: Streamdata.io is the primary partner for the API Evangelist website.


Caching For Your API Is Easier Than You Think And Something You Should Invest In

I'm encountering more API providers who have performance and scalability concerns with their APIs, and who are making technical procurement decisions (gateways, proxies, etc.) based upon these challenges, but have not invested any time or energy into planning and optimizing caching for the existing web servers that deliver their APIs. Caching is another aspect of HTTP that I keep finding folks have little or no awareness of, and they do not consider further investment in it as a way to alleviate their scalability and performance concerns.

There was a meeting I attended a couple weeks back where the team behind an API implementation was concerned about a new project for bulk loading and syncing of data between multiple external systems and their own, because of the strain it would put on their database. Citing that they receive millions of website and API calls daily, they said they could not take the added load on their already strained systems during the day, limiting this type of activity to a narrow window at night. I began inquiring about the caching practices in place for web and API traffic, and they acknowledged that they knew of no such activity or practices in place. This isn't uncommon in my experience, and I regularly encounter IT groups who just don't have the time and HTTP awareness to implement any coherent strategy--this particular one just happened to admit it.

My friends over at the API Academy have a great post on caching for RESTful and hypermedia APIs, so I won't be addressing the details of HTTP, and how you can optimize your APIs in this way. API caching isn't an unproven technology--it is a well known aspect of operating on the web--but it does take some investment and awareness. Like API design in general, you have to get to know the resources you are serving up, understand how your consumers are putting these resources to work, and adjust, dial in, and tweak your caching strategy. It is something that gets incrementally harder the more time zones you operate in, but with some investment you can significantly increase the scalability of your APIs and the performance of properly cached paths, and do more with fewer resources. Scaling up your servers isn't always the most sensible first move; a coherent caching strategy will be a much wiser and more cost-effective approach in the long run.
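
To make this concrete, here is a minimal sketch of what caching can look like at the API layer, using Flask--the /products resource, its data, and the 60 second lifetime are illustrative assumptions you would tune to your own resources and consumers:

```python
# A minimal sketch of HTTP caching for an API endpoint, using Flask.
# The /products resource, its data, and the 60-second lifetime are
# illustrative assumptions--tune them to how your consumers use each resource.
import hashlib
import json

from flask import Flask, Response, request

app = Flask(__name__)

PRODUCTS = [{"id": 1, "name": "Widget"}]  # stand-in for a real data source


@app.route("/products")
def products():
    body = json.dumps(PRODUCTS)

    resp = Response(body, mimetype="application/json")
    # ETag lets clients revalidate cheaply: a matching If-None-Match
    # turns the response into an empty 304 instead of a full payload.
    resp.set_etag(hashlib.md5(body.encode()).hexdigest())
    # Cache-Control lets clients and intermediaries reuse the response
    # for 60 seconds without making the request at all.
    resp.headers["Cache-Control"] = "public, max-age=60"
    return resp.make_conditional(request)
```

With those two headers in place, well-behaved clients and intermediaries reuse responses for a minute, and revalidations cost you a 304 instead of a full payload.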

A lack of API caching strategy amongst my clients and readers has a damaging effect on API operations. However, I'd say the most damage done isn't by the lack of a strategy itself--it is the reverberating decisions made around the inability to properly scale, and deliver the performance API clients are needing. I see many technology procurement decisions being made where scalability and performance are a major part of the conversation and decision making process, yet conversations around API caching have never occurred. This is just lazy. This is just ignoring one of the key tenets of what makes the web work. This is investing in technical debt, over making sensible architectural decisions, and spending the time to get to know the resources you are serving up, and how your customers are using them. Learning about HTTP and caching does take some investment and planning, but it is nowhere near the investment and planning that will be required to unwind the technical debt you've acquired from the bad technology purchasing decisions you've made along the way.


API Providers Could Add A Page To Showcase Their Bots

I am coming across more API providers who have carved off specific "skills" derived from their API, offering them up as part of the latest push to acquire new users on Slack or Facebook. Services like Github, Heroku, and Runscope that API providers and developers are putting to work increasingly have bots they employ, extending their API driven solutions to Slack and Facebook.

Alongside having an application gallery, and having an iPaaS solution showcase, maybe it's time to start having a dedicated page to showcase the bot solutions that are built on your API. Of course, these would start with your own bot solutions, but like application galleries, you could have bots that were built within your community as well.

I'm not going to add a dedicated bot showcase page until I've seen at least a handful in the wild, but I like documenting these things as I think of them. It gives me some dates to better understand at which point certain things in the API universe began expanding (or not). Also, if you are doing a lot of bot development around your API, or maybe your community is, it might be the little nudge you need to be one of the first APIs out there with a dedicated bot showcase page.


If You Are Proud Of Your API Patents Publish Your Portfolio And Showcase Them

I'm going to keep beating the API patent drum, until I bring more awareness to the topic, and shine a light on what is going on. While I will still be my usual self and call out the worst behavior in the space, I am also going to try and be a little friendlier around my views, and try and help bring more options to the table. This is a serious problem that nobody is talking about, and one that has many dimensions and nuances--if you want my raw stance on API patents, you can read it here.

One area I wanted to cover is in response to my friends trying to convince me they aren't bad people for having patents. I know you aren't, and it isn't my goal to make you look bad in this--it is only to shine a light on the entire process, how broken it is, and call out the worst offenders. If you truly believe in patents, protecting the work you've done, and that your intentions are good, share your patent portfolio with the world, and showcase it like you do the other aspects of your work. You will craft a press release about everything else you do--do the same for your patents.

I do not think patents are bad. I think ill-conceived patent ideas that haven't been properly vetted by the under-resourced USPTO, that are used in back-door dealings as leverage, and that are litigated in a court of law are bad. I'll take your word that your patents are good, and that you aren't operating in any of these areas, if you are public, transparent, and openly proud of them, as you state in private conversations.

Part of the purpose of my research is to encourage good behavior in the sector, by highlighting the common building blocks of the space. I think I will add a patent portfolio building block to my research. While I have ZERO examples to highlight, I encourage API companies to do this, and would love to highlight, in a positive way, any company that is straight up enough to showcase their patents. If you are proud of your API patents, and do not have bad intentions in having them, please publish your portfolio, and showcase them as you would anything else you are doing--help bring API patents out of the shadows.


Time Tracking Platform Harvest Moves API Docs and App Showcase to Github

Time Tracking API platform Harvest has embraced Github as part of their API ecosystem. I'm always on the hunt for examples of API providers using Github, so I figured I'd showcase Harvest's creative use of the social coding platform.

Starting with their documentation, the Harvest team has moved the API documentation to a Github repository, allowing developers to "watch" the API, get updates when changes are made, ask questions, or even contribute to the API docs by submitting a pull request.

Harvest is also using the wiki portion of their Github repo for a developer application gallery they are calling Community Creations and Hacks, where they showcase innovative uses of the Harvest API--currently displaying 20 integrations by Harvest users.

I'm currently tracking on 11 separate uses of Github for API management, and always on the hunt for new ways to use Github to support API ecosystems. Nice move Harvest!


A 3rd Party API Showcase for Your API

I stumbled across the Twitter Counter API in my monitoring for the API Stack this morning. The Twitter Counter API allows you to retrieve key metrics on any Twitter account, like username, URL, and avatar. All of this data you can get via the Twitter API, but with the Twitter Counter API you also get additional information, like account growth statistics and ranking, that Twitter doesn't provide at all.

I find it fascinating that someone can build an API to augment an existing API, which is why I keep talking about it, I guess :) We are seeing a more standardized version of this with API aggregation providers like Singly and Adigami, where they not only aggregate APIs from a variety of sources, they also build entirely new APIs based on the added value that is created after they are brought together.

Thinking about it further, it would be cool if you could submit your API to be listed in your parent API provider's API area. Think of APIhub and Mashape, but every API area would have its own 3rd party API marketplace. API providers often allow 3rd party developers to submit code libraries and samples to be listed as resources, as well as applications for listing in an application showcase. So it makes sense to potentially allow your developers to submit APIs for validation and publishing into a designated area.

It seems to me that we shouldn't exist as islands--we should be able to invite in other API resources built on top of our APIs, or that complement our APIs. We should also have terms of use and pricing models that invite others to take our API resources and deploy them in other ecosystems, building the next wave of BaaS providers that will be delivering specialized stacks of resources for developers to efficiently build mobile and web apps.


Does Your API Showcase Its DOers?

Twilio, the poster boy for how to properly run an API ecosystem, recently updated their DOer Gallery to highlight developers in the Twilio ecosystem who build cool stuff on the popular voice and SMS API.

Twilio has the best record I've seen of any API when it comes to showcasing, and being loved by, their developer community, and I'm sure the DOer Gallery plays an important role in that.

The Twilio DOer Gallery has the following features:

  • Personal Details
  • Short Bio
  • Skills
  • Other Profiles
  • Projects

Developer galleries like Twilio's might not be for every API platform. But if you have a passionate base of developers, you might want to consider giving them their own profile and a gallery where they can not just discover and interact with each other--it can also let other companies find potential developers to execute projects via your API.

A Developer Gallery can be a great way to give your API developers some love and attention. Twilio even features developers from their DOer Gallery on their blog in a "DOer of the Month".

Would showcasing your “API DOers” benefit your API community?


Factual Launches App Gallery to Showcase Data Apps

Factual has launched a new application gallery to showcase the diverse range of applications built using data provided by Factual.

You can search for apps, browse by category, and filter by open source, paid, or free apps. Looks like there are about 18 apps in the directory currently, ranging from augmented reality to daily deals.

The Factual App Gallery isn't a particularly unique launch--we are seeing app showcases pop up within many API communities--but it shows that Factual is gaining steam, and I think it shows the appetite for building apps around datasets is growing.


Showcase Your API Developers and Their Applications

Do you have a cool application built on top of your API? (Hopefully you do!)

I'm sure there are some amazing developers who have worked hard on developing applications that make use of your API. They have seen the value your API delivers and built an application that extends that value to their users.

An application showcase could be an important building block of your API community. An application showcase can provide a great way to reward your developers with exposure. It will also make them feel like an important part of your community. This is a great way to encourage their participation in other areas of your API ecosystem.

An application showcase can also inspire new developers looking for ideas of how they can use your API. Developers might not understand how to put your API to use, and seeing how other community members have used the API may help. You never know--your community may even show you some ways of using your API that you would never have thought of.

You can see application showcases being used by successful APIs such as Zemanta, Paypal, Google, and even the World Bank.

Consider an application showcase for your API developer community.

bit.ly API contest - Building Block Showcase

Holding an API contest is a great way to spur innovation around your API and its community.

bit.ly is a popular URL shortening service that offers an API as part of its core software-as-a-service.

In January 2009 it held a successful API contest and is looking to do it again with a new bit.ly API contest.

The prizes offered:
  • 1st prize - MakerBot Thing-O-Matic 3D printer
  • 2nd prize - 1TB USB hard drive enclosed in a vintage Nintendo game cartridge (Zelda, Metroid, etc.)
  • 3rd prize - Set of BuckyBalls magnetic building spheres

bit.ly encourages developers to be creative and come up with unexpected uses, but it also plants a few ideas that the company would like to see developers work on.

There are a lot of developers who may know about your API, but are not actively involved. Your developers may need a little bit of motivation to get them working with it.

An API contest is a great way to light the fire under your development community, stimulate innovative uses of your API, and generate some buzz around your API community.

LinkedIn Labs - Building Block Showcase

LinkedIn has released LinkedIn Labs, an API labs site to showcase various internal projects built using the LinkedIn API.

LinkedIn Labs hosts a small set of projects and experimental features built by the employees of LinkedIn. They are published as demonstrations, intended to be low-maintenance experiments, and may be added or removed over time based on popularity and support.

Four projects the Labs showcases are:
  • NewIn - This application shows new members joining LinkedIn from around the world.
  • ChromeIn - Integrate LinkedIn directly into Google Chrome. Easy access to your LinkedIn updates, anytime.
  • Instant Search - A sample application to search LinkedIn, built on the new LinkedIn JavaScript APIs.
  • Signal - Signal is aimed at making it easy for all professionals to glean the most relevant insights from the never-ending stream of status updates and news.

An API Labs is a great way to showcase experimental and innovative projects that utilize your API.

Encouraging your internal staff to spend time on Labs projects, and showcasing them on your site, can improve internal understanding of the challenges developers face when integrating with your API.

An API Labs environment can be extended to your API developers and partners as well. This is a great way to encourage innovation and building community around your API.

If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one-person show, and I miss quite a bit, and depend on my network to help me know what is going on.