
API Definitions News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API definition conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is defining not just its APIs, but its schema, and the other moving parts of API operations.

I Wish API Providers Published Their Developer Portals On Github So I Could Submit Pull Requests

I spend a lot of time looking through the developer portals of API providers. I see a lot of things, both good and bad, while navigating these portals, and while some of the bad stuff I see is way too big for me to do anything about, there are many little things I could help with. Sometimes it is just spelling mistakes, sometimes broken links, and other times I want to rewrite API descriptions, or add to the resources that are available for an API.

During these travels, I got to thinking...what if API providers published their developer portals to Github, just like I do with my API Evangelist sites? Then I could fork their portal, and submit pull requests with rewrites of the description of what their API does, adding little tweaks here and there, helping polish the portal--potentially making things easier for developers. I’m sure this thought scares the hell out of some companies, who can’t imagine allowing outside input like this, but if you sit down and think about it for a while, it just might make sense.

When I was in Washington D.C., I created a default portal template that runs on Github. I might upgrade this, and make it available for API providers to use as a forkable template to seed any API developer portal. Doing this makes it easy to provide a base set of building blocks that every API provider should start with, allowing them to delete the elements they don’t want. I am also considering making a default version that can be used as a starter developer portal for Internet of Things (IoT) platforms.
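To make that idea more concrete, here is a rough, purely hypothetical sketch of what a forkable Github Pages portal template might contain--the file and folder names are mine for illustration, not the actual template:

    _config.yml        # Jekyll settings for Github Pages (site title, base URL)
    index.html         # landing page explaining what the API does
    getting-started/   # signup, authentication, first API call
    docs/              # API documentation pages
    code/              # SDKs and code samples
    blog/              # updates and change log, with RSS
    legal/             # terms of service, privacy, licensing
    apis.json          # machine readable index of the APIs behind the portal

Fork it, delete the building blocks you don't need, fill in the rest, and you have the skeleton of a developer portal.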

I thoroughly enjoy the pull requests that I get for spelling and grammar mistakes I make on API Evangelist, from some of my favorite grammar trolls. Running all of my websites on Github Pages has changed the way I tell stories, and interact with my audience. I’ve always encouraged API providers to use Github for an increasing number of API management tasks, but actually running your entire portal as a Github repo would take this to a new level. This approach could be used for larger or smaller developer platforms, and I even use Github Pages as a front-end for private API developer portals--ask me how.

It is just a suggestion for you API providers, and soon to be API providers. Github provides a very low cost, scalable way to launch your portal, and you never know, people like me who spend a lot of time in your developer portal might actually help make it a better place. I wish more API providers would publish their portals as Github repositories, and that graphic designers in the space would help craft template portals that providers can put to work. My portal definitions will always reflect the best practices I see across the API landscape, but they won’t always be the slickest looking when it comes to graphic design—just not my core competency.


Why Would You Want To List All Your University's Web Services (APIs) Out In The Open, Via A Central Portal? What A Security Risk!

I went up to California State University Channel Islands the other day to talk APIs with their tech team, and I was happy to find at least one strong API skeptic on the team. API skeptics also give me material for stories, so I thoroughly enjoy coming across them, and telling these stories is how I keep polishing my argument for the next API skeptic I encounter in campus IT, at the higher educational institutions that I visit.

During the discussion I was asked several interesting questions, the first one being: why would you want to list all of your web services (APIs) for your public university, out in the open, via a central portal? What a security risk!

Sorry, But Security Through Obscurity Is Not A Strategy
I’m sorry, hiding things and hoping nobody finds them is not a valid IT strategy. You are a public university, and you should have sophisticated identity and access management, as well as a tight approach to security. If you can’t properly secure an API resource available at a public URL, you shouldn’t be running campus IT, sorry. You should be able to provide a public description of what resources are available, without giving away the farm.

The Feeling You Have—It Comes From Legacy Power And Control
As an IT professional, if you don’t acknowledge that power and control exist within classic IT processes, you are in denial. I’ve taken over numerous company IT operations, and there are ALWAYS political struggles between IT and the rest of operations. If you list all digital resources in a single portal, and make them available in a self-service way, it disrupts existing power and control structures. When you make things hard to find, you play a central role in people finding these resources, and you are in power—adding to decades of legacy stories about IT being a negative force in operations. Self-service access to campus resources is the future, no matter how much you resist.

Centralized Location To Aggregate All Digital Resources
A centralized API portal provides a single place for anyone to discover or share their digital resources. Like the main website, administrators, faculty, students, parents, and the public will know there is a single place to find machine readable versions of campus resources. API portals are more than just a listing of APIs; they include documentation, code samples, widgets, buttons, spreadsheet connectors, visualizations, blogs, Twitter accounts, and other resources that can turn a simple portal into an active ecosystem where everyone collaborates around campus resources.

Self-Service Access For 3rd Party Vendors Delivering Vital Campus Services
It takes resources to engage with 3rd party vendors on campus, and a single, self-service portal provides a standardized way for vendors to access the institutional resources they need to deliver the services they are bringing to campus operations. There is no reason that IT or departmental contacts should ever be bottlenecks in delivering the information vendors will need. API portals should provide them with the access they need, along with proper identity and access management, and monitoring of exactly how vendors are accessing, and putting campus resources to use (or not).

Interoperability Between Higher Educational Institutions
Most higher educational institutions are part of a larger network of institutions, and at the very least have relationships with other schools in which information and resources are shared. As with vendors, much of this can be provided via a self-service API portal, where institutions can find the data, content, and other resources they need, able to access only the resources they are supposed to during the fulfillment of established relationships.

Smoother Interactions With Local, State, And Federal Government
Higher educational institutions have regular integration points with local, state, and federal government agencies. Data is reported on a regular basis, back and forth, and a central API portal is excellent for aggregating the common resources a government agency will be asking for. API portals are not just for student hackers, or 3rd party developers, they are increasingly how government agencies are sharing, and getting what they need to regulate, monitor industry, and govern more effectively.

It Is About People Getting Access To What They Need
Central API portals are about making sure people get access to what they need, and at a public university, this should be priority number one. You have data, content, and other digital resources that students, faculty, administrators, vendors, government, parents and alumni need, and because you aren’t confident in securing it from a handful of hackers, you are going to hide this away, and forgo all the benefits? I’m sorry, the perceived negatives just don’t outweigh the positives. There are proven ways to secure APIs, and for much of university operations, it won’t hurt to provide a title, description, and location of a resource, and let identity and access management handle who should be able to get at resources—or not.

Bringing IT Out of Shadows Across Campus Operations
Security is a concern, and should be front-and-center in all conversations, but it is too often used to hide away insecurities, incompetency, and shortcomings, rather than actually address the root of security concerns. If a resource is accessible via Internet protocols, over campus networks, it should be mapped out, with the definition made publicly available in a sensible way. This conversation should involve careful consideration of what resources should be public, and what should remain private, with a heavy emphasis on transparency at a public university.

Between 2000 and 2007 we started seeing a shift of IT services into the cloud, and much of this has happened as un-sanctioned shadow IT, driven by administrators, faculty, and students looking to get their work done, because IT hasn’t been able to keep up. The cloud evolution of IT has been all API driven, and making IT resources available to the public at Amazon sounded insane at first, but it has changed not just how Amazon operates, it has evolved how the entire world deploys software architecture—using APIs.

There are many great examples of leading companies making personally identifiable information, banking, healthcare, and other vital data and content available on the open Internet, in a secure way. It is not a security risk to share information, and provide access to your valuable resources, in a central, publicly available portal--if you do it right. Who knows, if you do open up, there may be some unintended consequences, leading to IT being seen as a positive influence on the education process, as opposed to the roadblock image it often has on campuses across the country.


New API Management Providers: Clean, Modern API Portals With ReadMe

It makes me happy to see new arrivals in the world of API management service providers, especially after all the consolidation we saw last year with many of the 1st and 2nd wave of providers like Mashery, Vordel, Layer7, and Apiphany. One of the new API management providers that has emerged is ReadMe, which is looking to provide an attractive, simple, and intuitive way to launch developer portals for your APIs.

ReadMe reminds me of some of the landing page tools designed for web marketers over the last decade, to provide informative gateways for site visitors, but ReadMe is all about providing meaningful doorways to our API resources. First, I like ReadMe's definition of what a developer hub is:

  • Documentation - Topical guides, tutorials and troubleshooting.
  • API Reference - Low-level, deep-dive reference material.
  • Community - Provide support and answer questions.

Their definition of a developer hub goes beyond just the technical documentation that many developers see, and acknowledges that your portal should be about providing the resources necessary to educate API consumers, and potentially build community around their needs. ReadMe provides the essential building blocks I'd expect of any modern API portal builder:

  • Theme Builder - Easily create a beautiful dev community that matches your brand.
  • Editor - Markdown-based drag-and-drop editor makes documentation almost fun.
  • API Explorer - Let users play with your API right inside the documentation.
  • Application Keys - Your users can view their application keys embedded right in the docs.
  • Support - Let users ask questions and request features in the support forums.

Then ReadMe goes a little further to include some concepts that I see in some of the leading API providers, features that go beyond the basics, allowing you to better manage your API portal:

  • Collaboration - Crowdsource your docs! Users can keep docs current by suggesting changes.
  • GitHub Sync - Keep auto-generated reference docs synced with your actual code changes.
  • Versioning - Maintaining old or testing beta versions of your docs is a breeze.

In 2014, emulating the social elements that Github has introduced into the world of coding is essential for your API program. You cannot manage your entire API community by yourself, and including your developers in the process is essential. This adds relevant layers to the term "open" that everyone likes to use, providing the roots you will need to actually build trust with your developers, in something that goes both ways, and will also grow your own trust of developers within your own API community.

I'm keeping an eye on what ReadMe is up to, alongside the other API management providers I've been tracking. I haven't given a lot of attention to the API management space in the last year, as I've been focusing on the faster growing areas like API discovery, design, and integration, but now that I see new players stepping up, I will make sure to give the area equal attention in my research and monitoring.


Coca-Cola to Launch New Developer Portal and APIs Site


Coca-Cola will launch a new developer site, powered by Apigee, to simplify the process of accessing and using the company's internal and public-facing APIs. Coca-Cola is just one of the many enterprises leveraging APIs for marketing, operations, mobile app development and more. In June, ProgrammableWeb reported that Oreo is using APIs to build and promote a groundbreaking digital marketing and social campaign in China.

Most of the APIs used by Coca-Cola are internal and not public facing. However, successful APIs are having an impact on the company's entire operations. Kevin Flowers, CTO of Coca-Cola Enterprises, spoke at the Business of APIs Conference in December 2013. In a presentation at the conference, he said: "Our [Coca-Cola's] entire operations are now impacted by successful APIs. We're an operations company. We run distribution, manufacturing, sales — so those operations, everything from procurement to our customer engagement programs, or supply chain, everything now has APIs enabling it. And I think that's a really important point because we're not a dot-com."

Internal production APIs used by Coca-Cola include Foursquare, Amazon S3, Twilio, SmartPay, and Salesforce Chatter. There is also a Coca-Cola Enterprises API powered by Mashery that allows the company's customers, suppliers, consumers and partners to build innovative CCE Web and mobile applications.

Coca-Cola is also using APIs to create innovative IoT-to-business applications. Nordic APIs reported in April that Coca-Cola had programmed vending machines to perform self-diagnostics. When the program determines that repairs are needed, the vending machine utilizes the Twilio API to send a text to repair contractors requesting that the machine be serviced. According to Computerworld UK, Coca-Cola started using APIs earlier this year to build and promote a Coke rewards consumer-loyalty-type marketing strategy.

When it comes to public-facing APIs, Flowers said at the Business of APIs Conference: "From a public-facing API strategy, we really are not expecting our corporation [Coca-Cola] necessarily to be one that there's a large developer community just hacking and growing and building applications on our data. Where we see the public facing is we're able to leverage the APIs to an extended developer community. So for us it would be more going to mobile developers or cloud-based platforms to say, 'Look, here's the data. Let's work together and innovate' versus a developer community."

Find out more about the existing Coca-Cola Enterprises API at the official Coke CCE website.

URL: http://www.programmableweb.com/news/coca-cola-to-launch-new-developer-portal-and-apis-site/brief/2014/09/18

Continuing With The API Restaurant Analogy

I began exploring the use of a restaurant menu to help people understand API copyright, and how your API definition is not your secret sauce, and that there is so much more to your API than just the surface area.

I'm continuing my exploration of using restaurants as an analogy to help onboard people with not just API copyright, but introducing APIs to the masses--helping them understand the layers to the API space, in a way that is familiar.

To prepare for future stories and conversations I'm expanding on my API definition = Menu analogy with:

  • API Providers = Restaurant Owners
  • Developer Portal = Restaurant
  • APIs = Food & Drink Items
  • API Definition + Docs = Menu
  • API Consumers (Apps) = Restaurant Customers (People)
  • Freemium = Samples Out Front
  • Partner Access = Catering
  • Terms of Service = We Reserve The Right To Refuse - No Substitutions
  • API Service Providers = Restaurant Equipment Providers

Obviously it is a work in progress. I will need better stories and visuals to get the concept across. Then once I start working through the different types of restaurants (ie. American, Chinese, Mexican, Fusion, etc.), I'm sure I'll flesh out more, while also potentially hitting some dead ends.

Explaining what an API is, let alone the versatility and flexibility of an API, is really hard. I need as many tools in my API toolbox as possible to help get people up to speed, understanding what an API is, and how it can apply to their business. We'll see where I go with the restaurant analogy, who knows, it might be a dead end. It proved to be extremely valuable in helping people understand API copyright, and I'm hopeful it will do the same for APIs in general.

Let me know your thoughts...


Chief Data Officer Needs To Make The Department Of Commerce Developer Portal The Center Of The API Economy

Today, the U.S. Secretary of Commerce Penny Pritzker (@PennyPritzker) announced that the Department of Commerce will hire its first-ever Chief Data Officer. I wanted to make sure that when this new, and extremely important, individual assumes their role, they have my latest thoughts on how to make the Department of Commerce developer portal the best it possibly can be, because this portal will be the driving force behind the rapidly expanding API driven economy.

Secretary Pritzker does a pretty good job of summing up the scope of resources that are available at Commerce:

Secretary Pritzker described how the Department of Commerce’s data collection – which literally reaches from the depths of the ocean to the surface of the sun – not only informs trillions of dollars of private and public investments each year and plants the seeds of economic growth, but also saves lives.

I think she also does a fine job of describing the urgency behind making sure Commerce resources are available:

Because of Commerce Department data, Secretary Pritzker explained, communities vulnerable to tornadoes have seen warning times triple and tornado warning accuracy double over the past 25 years, giving residents greater time to search for shelter in the event of an emergency.

To understand the importance of the content, data and other resources that are coming out of the Department of Commerce, you just have to look at the list of agencies underneath Commerce that already have API initiatives:

Then take a look at the other half, who have not launched APIs:

The data and other resources available through these agencies, underneath the Department of Commerce, reflect the heart of not just the U.S. economy, but the global economy, which is rapidly being driven by APIs that are powering stock markets, finance, payment providers, cloud computing, and many other cornerstones of our increasingly online economy.

Look through those 13 agencies, and the resources they manage are all vital to the economy, from telecommunications, patents, weather, oceans, and census, to other areas that have a direct influence on how markets work, or don't work.

I’m all behind the Department of Commerce hiring a Chief Data Officer (CDO), but my first question is, what will this person do?

This leader, Secretary Pritzker explained, will oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic.

Yes! I can get behind this. In my opinion, in order for the new CDO to do this, they will have to quickly bring all of the agencies' /developer programs up to a modern level of operation. There is a lot of work to be done, so let's get to work exploring what needs to happen.

A Central Department of Commerce Developer Portal To Rule Them All
Right now the Department of Commerce developer portal at commerce.gov/developer is just a landing page. An afterthought to help you find some APIs--not a portal. The new CDO needs to establish this real estate as the one true portal, which provides the resources other agencies will need for success, while also providing a modern, leading location for developers of web, mobile, and Internet of Things applications, and data journalists or analysts, to come find the data they need. If you need a reference point, go look at Amazon Web Services, SalesForce, eBay or Google’s developer areas—you should see this type of activity at commerce.gov/developer.

Each Agency Must Have Its Own Kick Ass Developer Portal
Following patterns set forth by their parent Department of Commerce portal, each agency underneath needs to possess its own, best of breed developer portal, providing the data, APIs, code, and other resources that public and private sector consumers will need. I just finished looking through all the available developer portals for Commerce agencies, and there is no consistency between them in user experience (UX), API design, or resources available. The new CDO will have to immediately get to work on taking existing patterns from the private sector, as well as what has been developed by 18F, and establish common patterns that other agencies can follow when designing, developing and managing their own developer portals.

High Quality, Machine Readable Open Data By Default
The new CDO needs to quickly build on the existing data inventory efforts that have been going on at Commerce, making sure any existing projects are producing machine readable data by default, and making sure all data inventory is available within each agency's portal, as well as at data.gov. This will not be a one time effort; the new CDO needs to make sure all program and project managers also get the data steward training they will need, to ensure that all future work at the Department of Commerce, associated agencies, and private sector partners produces high quality, machine readable data by default.
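To ground what machine readable by default means in practice, here is a rough sketch of a single dataset entry in the data.json format agencies already publish--the field names follow the spirit of the Project Open Data schema, but treat them as approximate, not the official specification:

    {
      "title": "Example Commerce Dataset",
      "description": "A hypothetical dataset published by a Commerce agency.",
      "keyword": ["commerce", "example"],
      "modified": "2014-07-01",
      "accessLevel": "public",
      "distribution": [
        {
          "accessURL": "https://data.example.gov/example-dataset.csv",
          "format": "text/csv"
        }
      ]
    }

Every dataset across Commerce and its agencies should exist in a form like this by default, not as an afterthought.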

Open Source Tooling To Support The Public And Private Sector
Within each of the Commerce, and associated agency developer portals, there needs to be a wealth of open source code samples, libraries and SDKs for working with data and APIs. This open source philosophy also needs to be applied to any web or mobile applications, analysis or visualizations that are part of Commerce funded projects and programs, whether they are from the public or private sector. All software developed around Commerce data that receives public funding should be open source by default, allowing the rest of the developer ecosystem, and ultimately the wider economy, to benefit from and build on top of existing work.

Machine Readable API Definitions For All Resources
This is an area that is a little bit leading edge, even for the private sector, but it is rapidly emerging to play a central role in how APIs are designed, deployed, managed, discovered, tested, monitored, and ultimately integrated into other systems and applications. Machine readable API definitions are being used as a sort of central truth, defining how and what an API does, in a machine readable, but common format that any developer, and potentially any other system, can understand. Commerce needs to ensure that all existing, as well as future APIs developed around Commerce data, possess a machine readable API definition, which will allow all data resources to be plug and play in the API economy.
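To show what I mean by a machine readable definition, here is a bare bones sketch in the Swagger 2.0 format, describing a hypothetical Commerce data API--the host, paths, and fields are made up for illustration:

    {
      "swagger": "2.0",
      "info": {
        "title": "Example Commerce Data API",
        "version": "1.0.0"
      },
      "host": "api.example.gov",
      "basePath": "/v1",
      "paths": {
        "/datasets": {
          "get": {
            "summary": "List the datasets available from this agency",
            "produces": ["application/json"],
            "responses": {
              "200": {
                "description": "An array of dataset records"
              }
            }
          }
        }
      }
    }

A definition like this can drive interactive documentation, code generation, testing, and discovery, which is exactly why every Commerce API should have one.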

Establish An Assortment Of Blueprints For Other Agencies To Follow
The new Commerce CDO will have to be extremely efficient at establishing successful patterns that other agencies, projects and programs can follow. This starts with developer portal blueprints they can follow when designing, deploying and managing their own developer programs, but it should not stop there; Commerce will need a wealth of blueprints for open source software, APIs, system connectors, and much, much more. Establishing common blueprints, and sharing these widely across government, will be critical for consistency and interoperability--reducing the chances that agencies or private sector partners will be re-inventing the wheel, while also reducing development costs.

Establish Trusted Partner Access For Public And Private Sector
Open data and APIs do not always mean publicly available by default. Private sector API leaders have developed trusted partner layers to their open data and API developer ecosystems, allowing select, trusted partners greater access to resources. An existing model for this in the federal government is the IRS modernized e-file ecosystem, and the trusted relationships it has with private sector tax preparation partners like H&R Block or Jackson Hewitt. Trusted partners will be critical in Department of Commerce operations, acting as private sector connectors to the API economy, enabling higher levels of access from the private sector, but in a secure and controlled way that protects the public interest.

Army of Domain Expert Evangelists Providing A Human Face
As the name says, Commerce spans all business sectors, and to properly "oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic”, the CDO will need another human layer to help increase awareness of Commerce data and APIs, while also supporting existing partners and integrators. An army of evangelists will be needed, possessing some extremely important domain expertise across all the business sectors that Department of Commerce data and resources will touch. Evangelism is the essential human variable that makes the whole open data and API algorithm work, so the new CDO needs to get to work writing a job description, and hiring for this army—you will need an 18F, but one that is dedicated to the Department of Commerce.

Department of Commerce As The Portal At The Center Of The API Economy
The establishment of a CDO at the Department of Commerce is very serious business, and is a role that will be central to how the global economy evolves in the coming years. The content, data, and digital resources that should, and will, be made available at commerce.gov/developer and associated agencies will be central to the health of the API driven economy.

Think of what major seaports have done for the economy over the last 1000 years, and what role Wall Street has played in the economy over the last century—this is the scope of the commerce.gov/developer portal, which is ultimately the responsibility of the new Department of Commerce Chief Data Officer.

When the new CDO gets started at the Department of Commerce, I hope they reach out to 18F, who will have much of what they need to get going. Then sit down, read this post, as well as my other one on an API strategy for the U.S. government, and once you get going, if you need any help, just let me know—as my readers know, I’m full of a lot of ideas on APIs.


Low Hanging Fruit For API Discovery In The Federal Government

I looked through 77 of the developer areas for federal agencies, resulting in reviewing approximately 190 APIs. While the presentation of 95% of the federal government developer portals is crap, it makes me happy that about 120 of the 190 APIs (over 60%) are actually consumable web APIs that didn't make me hold my nose and run out of the API area.

Of the 190, only 13 actually made me happy for one reason or another:

Don't get me wrong, there are other nice implementations in there. I like the simplicity and consistency in the APIs coming out of GSA and SBA, but overall federal APIs reflect what I see a lot in the private sector: some developer making a decent API, but their follow-through and launch severely lacking what it takes to make the API successful. People wonder why nobody uses their APIs? hmmmmm....

A little minimalist simplicity in a developer portal, a simple explanation of what an API does, interactive documentation w/ Swagger, code libraries, and terms of service (TOS), would go a looooooooooooong way in making sure these government resources were found, and put to use.

Ok, so where the hell do I start? Let's look through these 123 APIs and see where the real low hanging fruit is for demonstrating the potential of APIs.json, when it comes to API discovery in the federal government.

Let's start again with the White House (http://www.whitehouse.gov/developers):

Only one API made it out of the USDA:

Department of Commerce (http://www.commerce.gov/developer):

  • Census Bureau API - http://www.census.gov/developers/ - Yes, a real developer area with supporting building blocks (Updates, News, App Gallery, Forum, Mailing List). Really could use interactive documentation though. There are URLs, but not active calls. Would be way easier if you could play with the data before committing. (B)
  • Severe Weather Data Inventory - http://www.ncdc.noaa.gov/swdiws/ - Fairly basic interface, wouldn’t take much to turn into modern web API. Right now its just a text file, with a spec style documentation explaining what to do. Looks high value. (B)
  • National Climatic Data Center Climate Data Online Web Services - http://www.ncdc.noaa.gov/cdo-web/webservices/v2 - Oh yeah, now we are talking. That is an API. No interactive docs, but nice clean ones; it would be some work, but could be done. (A)
  • Environmental Research Division's Data Access Program - http://coastwatch.pfeg.noaa.gov/erddap/rest.html - Looks like a decent web API. Wouldn’t be too much to generate a machine readable definition and make into a better API area. (B)
  • Space Physics Interactive Data Resource Web Services - http://spidr.ngdc.noaa.gov/spidr/docs/SPIDR.REST.WSGuide.en.pdf - Well, it's a PDF, but looks like a decent web API. It would be some work, but it could turn into a decent API with Swagger specs. (B)
  • Center for Operational Oceanographic Products and Services - http://tidesandcurrents.noaa.gov/api/ - Fairly straightforward API, Simple. Wouldn’t be hard to generate interactive docs for it. Spec needed. (B)

Arlington Cemetery:

Department of Education:

  • Department of Education - http://www.ed.gov/developers - Lots of high value datasets. Says API, but is JSON file. Wouldn’t be hard to generate APIs for it all and make machine readable definitions. (B)

Energy:

  • Energy Information Administration - http://www.eia.gov/developer/ - Nice web API, simple clean presentation. Needs interactive docs. (B)
  • National Renewable Energy Laboratory - http://developer.nrel.gov/ - Close to a modern Developer area with web APIs. Uses standardized access (umbrella). Some of them have Swagger specs, the rest would be easy to create. (A)
  • Office of Scientific and Technical Information - http://www.osti.gov/XMLServices - Interfaces are pretty well designed, and Swagger specs would be straightforward. But docs are all PDF currently. (B)

Department of Health and Human Services (http://www.hhs.gov/developer):

Food and Drug Administration (http://open.fda.gov):

Department of Homeland Security (http://www.dhs.gov/developer):

Two loose cannons:

 Department of Interior (http://www.doi.gov/developer):

Department of Justice (http://www.justice.gov/developer):

Labor:

  • Department of Labor - http://developer.dol.gov/ - I love their developer area. They have a great API, easy to generate API definitions. (A)
  • Bureau of Labor Statistics - http://www.bls.gov/developers/ - Web APIs in there. Complex, and lots of work, but can be done. API Definitions Needed. (B)

Department of State (http://www.state.gov/developer):

Department of Transportation (http://www.dot.gov/developer):

Department of the Treasury (http://www.treasury.gov/developer):

Veterans Affairs (http://www.va.gov/developer):

Consumer Financial Protection Bureau:

Federal Communications Commission (http://www.fcc.gov/developers):

Lone bank:

  • Federal Reserve Bank of St. Louis - http://api.stlouisfed.org/ - Good API and area, would be easy to generate API definitions. (B)

General Services Administration (http://www.gsa.gov/developers/):

National Aeronautics and Space Administration (http://open.nasa.gov/developer):

Couple more loose cannons:

Recovery Accountability and Transparency Board (http://www.recovery.gov/arra/FAQ/Developer/Pages/default.aspx):

Small Business Administration (http://www.sba.gov/about-sba/sba_performance/sba_data_store/web_service_api):

Last but not least.

That is a lot of potentially valuable API resources to consume. From my perspective, I think that what has come out of GSA, SBA, and the White House Petition API represents probably the simplest, most consistent, and highest value targets for me. Next, maybe the wealth of APIs out of Interior and FDA. After that I'll cherry pick from the list, and see which are easiest.

I'm looking to create a Swagger definition for each of these APIs, and publish them as a Github repository, allowing people to play with the APIs. If I have to, I'll create a proxy for each one, because CORS is not common across the federal government. I'm hoping to not spend too much time on proxies, because once I get in there I always want to improve the interface, and evolve a facade for each API, and I don't have that much time on my hands.
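To give you an idea of what I mean by a proxy, here is a minimal sketch using Flask and requests--the upstream URL is hypothetical, and this is just the shape of the thing, not the proxy I will actually run:

    # Minimal pass-through proxy that adds the CORS header an agency API leaves out.
    from flask import Flask, Response, request
    import requests

    app = Flask(__name__)
    UPSTREAM = "https://api.example.gov"  # hypothetical federal API base URL

    @app.route("/proxy/<path:path>")
    def proxy(path):
        # Forward the request to the upstream API, passing along any query parameters
        upstream = requests.get("{}/{}".format(UPSTREAM, path), params=request.args)
        # Return the upstream body, adding the CORS header so browsers can call it directly
        return Response(
            upstream.content,
            status=upstream.status_code,
            content_type=upstream.headers.get("Content-Type", "application/json"),
            headers={"Access-Control-Allow-Origin": "*"},
        )

    if __name__ == "__main__":
        app.run()

A facade would go further, cleaning up the interface itself, which is exactly the rabbit hole I am trying to avoid.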


Looking At 77 Federal Government API Developer Portals And 190 APIs

I spent most of the day yesterday looking through 77 of the developer portals listed on the 18F Github portal. While I wanted to evaluate the quality and approach of each of the agencies, my goal for this review cycle was to look for any APIs that already had machine readable API definitions, or would be low hanging fruit for the creation of Swagger definitions, as part of my wider API discovery work.

I had just finished updating all my API Evangelist Network APIs to use version 0.14 of APIs.json, and while I wait for the search engine APIs.io to update to support the new version, I wanted to see if I could start the hard work of applying API discovery to federal government APIs.

Ideally all federal agencies would publish APIs.json on their own, placing it within the root of their domain, like they do with data.json, and provide an index of all of their APIs. Being all too familiar with how this stuff works, I know that if I want this to happen, I will have to generate APIs.json for many federal agencies first. However, for these APIs.json files to have their intended impact, I need many of the APIs to have machine readable API definitions that I can point to--which equals more work for me! yay? ;-(
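To show what I am talking about, here is a rough sketch of the kind of APIs.json index an agency could drop at the root of its domain--I am writing this from memory, so treat the field names as approximate and check the APIs.json spec before copying it:

    {
      "name": "Example Federal Agency",
      "description": "Index of the APIs available from an example federal agency.",
      "url": "https://www.example.gov/apis.json",
      "specificationVersion": "0.14",
      "apis": [
        {
          "name": "Example Data API",
          "description": "Read only access to the agency's example datasets.",
          "humanURL": "https://www.example.gov/developer",
          "baseURL": "https://api.example.gov/v1",
          "properties": [
            {
              "type": "Swagger",
              "url": "https://www.example.gov/swagger.json"
            }
          ]
        }
      ]
    }

With a file like this sitting next to data.json, a search engine like APIs.io can index everything an agency offers.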

My thinking is that I will look through all of the 77 developer areas, and the resulting APIs, looking for the low hanging fruit. Basically I would grade each API on its viability to be included in my federal government API discovery work. I spent a minimal amount of time looking at each API, and in some cases looking for the API, before giving up. I would inspect the supporting developer area, and the actual interface, for complexity, helping me understand how hard it would be to hand craft a Swagger spec and APIs.json for each agency and their APIs.

(warning contains my raw un-edited notes from doing this research, not suitable for children)

As I went through, I wrote a couple of notes:

  • National Climatic Data Center is nice looking, and is high profile--they should have a kick ass API!
  • Inversely, NOAA Climate Data Online is very simple, clean and well done.
  • Department of Education is acceptable as dev area, only because of Socrata
  • National Renewable Energy Laboratory just gets it. They just get it.
  • Some of these are billed as developer areas, but really just a single API. It is a start I guess. 
  • I love me some Department of Labor. Their developer area and API is freak'n cool!
  • MyUSA citizen API has oAuth!!! WTF. How did I not notice this before? Another story, and work here.
  • MyUSA has really good, simple, high value, API resources. 
  • NASA ExoAPI is not just API cool, but space cool!!
  • FOIA needs a fucking API. No excuses. FOIA needs an API!
  • For some APIs it might be better to go back to the data source and recreate the API from scratch, they are so bad.

I wanted to share some of my notes before the long list of developer areas, and their APIs. There are some specific notes for each API, but much of it is very general, helping grade each API, so I can go back through the list of B grade or higher APIs, and figure out which are candidates for me to create a Swagger API definition and APIs.json for, and ultimately add to APIs.io.

For this work I went down the 77 federal agency links, which were billed as developer areas, but many were single APIs. So when a developer area resulted in multiple APIs, I grouped them together, and many of the agencies who have a single API I will group together, and include my commentary as necessary. I'm leaving the URLs visible to help as a reference, show the craziness of some of them, and because it would have been sooooo much work to link them all.

Let's start with the White House (http://www.whitehouse.gov/developers):

Next up is the USDA (http://www.usda.gov/wps/portal/usda/usdahome?navid=USDA_DEVELOPER), which is a hodgepodge of services, with no consistency whatsoever between services in interface, supporting content, or anything.

Overall I give USDA a D on all their APIs. A couple might be high value sources worth going after, but definitely not low hanging fruit for me. It would be easier to tackle as an independent project for generating brand new APIs.

Next up is the Department of Commerce (http://www.commerce.gov/developer), which definitely has some higher value resources, as well as some healthy API initiatives.

Next is the Department of Defense (http://www.defense.gov/developer/). There are 8 things billed as APIs, with a variety of datasets, API-like things, and web services available. Not really sure what's up? (D)

We have one by itself here:

Then the Department of Education, which is just riding its data.gov work as an API area:

  • Department of Education - http://www.ed.gov/developers - Lots of high value datasets. Says API, but is JSON file. Wouldn’t be hard to generate APIs for it all and make machine readable definitions. (B)

Next some energy related efforts:

  • Department of Energy - http://www.energy.gov/developers - Lots of datasets. Pretty presentation. Could use some simple APIs. Wouldn’t be much to pick high value data sets and create a cache of them. (C)
  • Energy Information Administration - http://www.eia.gov/developer/ - Nice web API, simple clean presentation. Needs interactive docs. (B)
  • National Renewable Energy Laboratory - http://developer.nrel.gov/ - Close to a modern Developer area with web APIs. Uses standardized access (umbrella). Some of them have Swagger specs, the rest would be easy to create. (A)
  • Office of Scientific and Technical Information - http://www.osti.gov/XMLServices - Interfaces are pretty well designed, and Swagger specs would be straightforward. But docs are all PDF currently. (B)

Moving on to the Department of Health and Human Services (http://www.hhs.gov/developer), where all of their APIs are somewhat consistent, and provide simple resources:

The Food and Drug Administration (http://open.fda.gov) is one of the agencies that is definitely getting on board with APIs. They have some pretty nice implementations, but there are some not so nice ones that need a lot of work:

Next up is the Department of Homeland Security (http://www.dhs.gov/developer), where they have three APIs (it's a start):

Then we have two agencies that have pretty simple API operations, so I'll group together:

Then we have several API developer efforts under the Department of Interior (http://www.doi.gov/developer):

Now we have some APIS coming out of law enforcement side of government, starting with Department of Justice (http://www.justice.gov/developer):

Now we get to one of my favorite efforts in the federal government:

  • Department of Labor - http://developer.dol.gov/ - I love their developer area. They have a great API, easy to generate API definitions. (A)
  • Bureau of Labor Statistics - http://www.bls.gov/developers/ - Web APIs in there. Complex, and lots of work, but can be done. API Definitions Needed. (B)

Next we have the API efforts from Department of State (http://www.state.gov/developer):

Moving on to the Department of Transportation (http://www.dot.gov/developer):

Now let's head over to the Department of the Treasury (http://www.treasury.gov/developer):

The Department of Veterans Affairs (http://www.va.gov/developer) has some hope, because of the work I did in the fall.

Moving on to another one of my favorite agencies, well, quasi-government agencies:

One agency that appears to be on the radar, but I really can't tell what is going on API-wise:

  • Environmental Protection Agency - http://www.epa.gov/developer/ - There is a really nice layout to the area, with seemingly a lot of APIs, and they look like web APIs, but it looks like one API being represented as a bunch of methods? Would be too much work, but still hard to figure out WTF. (C)

The Federal Communications Commission (http://www.fcc.gov/developers) has a lot of APIs going on, in various states of operation:

All by itself on the list, we have one lonely bank:

  • Federal Reserve Bank of St. Louis - http://api.stlouisfed.org/ - Good API and area, would be easy to generate API definitions. (B)

The General Services Administration (http://www.gsa.gov/developers/) definitely is ahead of the game when it comes to API design and deployment:

Once I reached the National Aeronautics and Space Administration (http://open.nasa.gov/developer), I found some really, really cool APIs:

Grouping a couple loose agencies together:

The Recovery Accountability and Transparency Board (http://www.recovery.gov/arra/FAQ/Developer/Pages/default.aspx) has some APIs to look at: 

The Small Business Administration (http://www.sba.gov/about-sba/sba_performance/sba_data_store/web_service_api) has some nice APIs that are consistent and well presented:

Lastly, we have a couple of loose agencies to look at (or not):

Ok, that was it. I know there are more APIs to look at, but this is derived from the master list of federal government developer portals. This gives me what I need. Although I'm a little disappointed I have fewer than 5 Swagger definitions, after looking at about 190 APIs.

After looking at all of this, I'm overwhelmed. How do you solve API discovery for a mess like this? Holy shit!

I really need my fucking head examined for taking this shit on. I really am my own worst enemy, but I love it. I am obsessed with contributing to a solution for API discovery that will work in the private sector, as well as a public sector mess like this. So where the hell do I start? More to come on that soon...


The edX API

This post should tell you how behind I am in my storytelling—this story is from an event I attended in Arlington, TX, on April 30th and May 1st. While in Arlington, I spoke to a group of professionals who were crafting an online data & analytics course. A couple of the participants were from edX, the online course platform partnership between MIT, Harvard, UC Berkeley and other universities.

Over the course of two days, I had a chance to ask the question, where was the edX API? Seemed like an obvious question, to which Emily Watson, the program manager at edX, responded, “It's on our roadmap!” An answer I get from many online companies, but Emily pulled up their roadmap on the wiki, and indeed it was on their roadmap.

I told Emily I’d review the edX platform, and provide some feedback that they could incorporate into their strategy. This one is easy, and is my basic feedback for any company with an online website and/or application—you make everything currently available on your site or application, available via an API. From a quick glance at edX, these would be:

  • Courses - The 175+ courses available on edX, should also be available in a machine readable format, via the edX API.
  • Faculty & Staff - The 400+ faculty and staff involved in producing edX courses should be available in a machine readable format, via the edX API.
  • Schools & Partners - All edX partners should be available in a machine readable format, via the edX API.
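To make those three resources concrete, here is a purely hypothetical sketch of what consuming a read-only edX API could look like--none of these endpoints exist yet, which is the whole point of this post:

    # Hypothetical sketch -- the base URL, endpoints, and fields are all made up.
    import requests

    BASE = "https://api.edx.org/v1"  # hypothetical base URL

    courses = requests.get(BASE + "/courses").json()            # the 175+ courses
    faculty = requests.get(BASE + "/faculty").json()            # the 400+ faculty and staff
    partners = requests.get(BASE + "/schools-partners").json()  # schools & partners

    for course in courses:
        print(course["title"], course["school"])  # hypothetical fields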

You start with the low hanging fruit, by establishing three simply designed web APIs for those already publicly available resources, then move on to providing the essential business building blocks for any API:

  • Dedicated API Portal - Simple, dedicated portal for edX API integrations, that isn’t just for developers. It should be easy enough for anyone to learn about the edX API, without too much technical detail front and center.
  • Simple Landing Page - The home page of the edX API portal should be dead simple, explaining what the API does, providing easy one click access to get to whatever any API, or potential API consumer will need.
  • Self-Service Registration - The edX API needs to be accessible 24/7, and developers should be able to register without approval, and at least get minimum access to the system to begin playing with resources, even if it takes approval to get higher levels of access. Modern API management solutions like 3Scale work very well in managing this layer of access, and service composition.
  • Interactive Documentation - The standard for all APIs in 2014 is to provide interactive documentation, making learning about an API, a hands on experience. There are some common approaches to defining an API using machine readable formats like Swagger, which will automatically generate interactive documentation for consumers.
  • Code Samples & Libraries - Developers need all the help they can in getting up and running with an API, and right after interactive API documentation, code libraries and samples are critical to onboarding API consumers.
  • Blog W/ RSS - An active blog, with RSS for syndication provides the necessary stories and resulting SEO that helps communicate the value an API offers, and keeps new and existing consumers in tune with API operations.
  • Twitter Presence - An active Twitter presence is common place for leading API platforms, and edX will need an active Twitter account to support its other communications, and support operations.
  • Support Ticket System - An easy to use, trackable support system for the edX API platform will be essential to establishing a feedback loop with API consumers. Github issue management works very well as an out of the box API support ticket system for all public APIs.
  • Discussion Forum - Discussion forums are common place in APIs, and provide potential indirect, community support, that can help new and existing API consumers find what they are looking for. Don’t limit yourself to a local discussion forum, there are SaaS solutions, as well as existing developer driven communities like Stack Overflow.

Next make sure and cover the basic political building blocks:

  • Terms of Service - Provide simple, liberal terms of service that incentivize integrations and development.
  • Content Licensing - Make sure all licensing for any content is explained front and center.
  • Code Licensing - Clearly license all client side code samples, libraries and even potentially server side code for the API.
  • API Licensing - Make sure the API definition is openly licensed, so that it can be re-used, re-mixed in many ways--consider putting into API Commons.
  • Branding Requirements - Provide clear direction when it comes to branding requirements around API integrations.
  • Rate Limits - Help developers understand what rate limits are in place to keep system stable, and what ways there are to get more access.
  • Pricing - Are there any costs for API access? If there are, make sure and clearly explain what is free, and at what points do resources cost.
  • Partner Tiers - Establish multiple layers of API access, allowing trusted partners to achieve higher rate limits, and write access to resources.

That should do it! ;-) As you can see, the APIs themselves are just the starting point. It is too late for edX to be API-first, which would have been much easier, but you just have to get your basic APIs, and the supporting business and political building blocks, up and running, and get down to the hard work of evangelizing, managing, and iterating on the edX API. The learning doesn't start until you have the API up, being consumed, and have engaged users who are providing feedback for the roadmap.

Once a basic edX API is up and running, and edX has some experience under their belt in managing the API community, then some thought can be put into creating a read / write API, allowing access to student information via oAuth, as well as letting trusted partners publish content and student data into the system. A read / write system could be done in a single release, but since edX is a little behind the curve on getting an API up, I recommend cutting their teeth on a read only system for version 1.0.

In reality, this advice for designing, deploying, managing, and evangelizing an API applies to any company with an online presence. It is a pretty proven formula extracted from watching multiple API leaders like Twitter, Google and Amazon. In this particular scenario, edX needs to get on the ball--it is difficult for me to imagine being an online education platform that operates between multiple universities in 2014, and not having an active API. ;-)


Deploying APIs Using Heroku And 3Scale Add-On

I am playing around with deploying APIs using multiple cloud platforms, and using popular container solutions. Next up is quickly deploying one of my utility APIs to Heroku, complete with access control, traffic reports, and supporting analytics with 3Scale infrastructure.

Application - Simple Screen Capture API
I’ve been working through an operational harness for deploying all my APIs, and the API I use to push forward my approach to API deployment is a screen capture API. This is what I will be deploying on Heroku, creating a simple application that will take screen captures of web pages that I pass to it.
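To give you a sense of the contract, here is a hypothetical sketch of calling the screen capture API--the endpoint name and parameters are made up for illustration, and my actual implementation is the PHP / Slim / PhantomJS version described further down:

    # Hypothetical client call -- endpoint, parameters, and key handling are illustrative.
    import requests

    response = requests.get(
        "https://screen-capture.example.com/capture",
        params={
            "url": "http://apievangelist.com",  # the page to capture
            "width": 1024,
            "height": 768,
            "key": "YOUR-API-KEY",  # credential issued through 3Scale
        },
    )

    with open("capture.png", "wb") as image_file:
        image_file.write(response.content)  # the API returns the rendered screenshot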

3Scale Heroku Add-On
To make API deployment on Heroku easier, 3Scale has created a Heroku add-on that you can easily deploy for any application you have running on the PaaS platform. With just a single click I added the 3Scale Heroku add-on to my screen capture API--now I can apply 3Scale API infrastructure to my API.

API Access Control
Using 3Scale I can control who has access to my screen capture API by setting rate limits and quotas, and if I want, I can even charge for access. I can compose any type of service level for my API that I want. I will create tiers for internal access, external partners, and a public tier where I charge a fee for each API call.

API Traffic Report and Analytics
With all of my screen capture API traffic managed with 3Scale, I can now see traffic analytics and reports, broken down by account, application, or specific service. Any developer who signs up for my screen capture API will get access to the reports for their account usage as well.

Ready-To-Go Developer Portal
One of the benefits of using 3Scale for deploying your API on Heroku is that you get a developer portal to go with your APIs. This is where developers will go to learn about your API, register for API keys, and view reports on their usage.

Free To Get Started
One of the main reasons I use 3Scale to manage my APIs is that even though I’m on the paid tier now, 3Scale allows me to fire up new APIs, manage access and receive reports at no charge. I don’t pay until my APIs start generating traffic, and if my business model is sound, I should be covered. You get up to 250 user accounts and a traffic volume of 150,000 API calls per day before you have to move up to the next tier.

Multi-lingual APIs In Cloud
3Scale provides instructions for deploying APIs using Ruby, Node.js, and Java, allowing you to use the language you prefer for API deployment.

Drive API Code From Github
I will be driving the development of the code for my screen capture API using Github. Heroku allows me to connect my application to a specific Github repository, allowing me to manage all my API deployment centrally using Github.

I already have a PHP version of my screen capture API that uses the Slim REST framework, which wraps around PhantomJS, which does the actual screen capture. Now I can take my standard, utility grade screen capture API and easily deploy it on Heroku, for my own internal use, for a partner, or for any wholesale customer who would like to deploy a screen capture API in their own Heroku account.

Next I’m going to look at various ways to deploy the same screen capture API using Amazon Web Services (AWS). There are several options for deploying APIs on AWS, and hopefully I can find the easiest, lowest cost way to get an API deployed, by playing with several approaches.

Disclosure: 3Scale is an API Evangelist partner.


Developer Portals On Github: Example From Socrata

Civic data platform Socrata has launched a new developer area to support developers who are building apps on top of city, county, state and federal data, as well as the publishers of this data. A release of a new developer area is always worthy of note in my tracking of the API space—anytime I see a tweet or blog post about the release of a new developer area, I will visit to see what is going on.

The first thing I find particularly interesting about Socrata’s approach is that they published it as a Github repository, using Github Pages. We are seeing more government developer areas move onto Github, because of the easy site hosting with Github Pages, the benefits of managing your projects with Git, as well as the social benefits of Github and its underlying API.

Application Developers
Socrata provides a wealth of information for application developers, including a getting started guide, robust API documentation, client libraries, and D3.js assistance for those looking to build visualizations on top of government data.

Data Publishers
On the other side of the coin, Socrata also provides data publishers with what they need to get started, clean and organize their data, and the ability to publish or sync with the Socrata platform, providing R, Ruby and Java libraries that really jumpstart integration.

The second thing I really like about the new Socrata portal is that it is more educational than it is technical, and tailored to onboard the next generation of government data publishers and consumers. Socrata doesn’t just provide details about API endpoints, they provide information on what is JSON and HTTP verbs. From years of experience, Socrata has a firm grasp of the amount of education the space needs to get up to speed, and it shows in their developer portal.

To provide some critique of the implementation, the formatting of the text wasn’t the easiest to consume. For some reason the font-size, density and layout made it hard to scan and spend time with. I’d recommend taking a look at the FullContact API in a separate browser tab, then flipping back and forth, to see what I mean. Next, I think Socrata could go further with contrasting things between developers and non-developers—the potential is there to really onboard people who don’t write code; an example of this in action is over at Dwolla.

As part of my API monitoring, I think I will have to add some sort of tag for APIs that deploy their developer area via Github, so I can keep track of interesting implementations. When your developer area is published on Github, I just think it invites collaboration, and participation from the community.


Do US Government Web APIs Require System Interconnectivity Agreements?

I've been so busy with work lately, I haven't been able to maintain my usual rhythm of blog posts on API Evangelist. The good news is I'm doing some interesting work that I'm able to pull stories from. This post is from a forum post I made on the US Government API forum I frequent, which has some very interesting conversations about APIs in the federal government.

In a recent post from Brian over at DC3 (Defense Cyber Crime Center), an interesting question was asked: Do US Government Web APIs Require System Interconnectivity Agreements? I will let you visit the conversation and see more detail around his question, as well as some of the other responses, but here were some of my thoughts:

  • Web / HTTP APIs do not fit earlier definitions of “system interconnectivity agreements”, which represents the technical and fundamental shift between network connections, SOAP APIs, and this new world of web APIs.
  • Web APIs were successful in part because of the loosely coupled nature of HTTP, leveraging a client / server, request / response model. There is interconnectivity, but not the tight coupling and governance of previous network protocols and APIs.
  • APIs are the contract! Each API endpoint provides access to a resource, then with accompanying management building blocks, you dial in that contract.
 
At the API provider level, you can enforce / encourage interoperability using common web API approaches:

  • API Definitions - Machine readable API definitions like Swagger, API Blueprint, and RAML can provide templates that enforce / encourage common blueprints that providers can follow when publishing APIs to centralized or decentralized API management platforms, establishing base contracts for interoperability, as well as underlying data models.
  • Terms of Use - Common terms of use can be established for management, allowing providers to craft their own TOS that is tailored for specific resources, but derived from common patterns. You see this emerging in the private sector with the Swedish API License.
  • Licensing - Much like TOS, common licensing blueprints can be provided, allowing data and resource licensing to be tailored for resources, but pulled from existing licensing pools. Many API providers offer a base stack of licensing arrangements that meet their needs as well as those of consumers.
  • Service Level Agreements (SLA) - Same as TOS, service level agreements can be forged allowing API providers to meet expectations of consumers. Some SLAs are loose and some are tight, depending on goals.
  • Security - Common tooling and practices around the security of APIs need to be established, using common approaches like API keys, OAuth, SSO, and other standards already in use. As with API interface patterns, data models, etc., each API provider should not be allowed to bring their home-brewed security to the table.
  • Service Composition - A practice referred to as “service composition” is a common part of all API management platforms currently available. Service composition allows API providers to build service tiers and compose products from API resources, establishing different levels of access and usage that can be designed for internal, partner, or public access to resources. For example, service tier A allows read only access to APIs with specific rate limits, while service tier B allows read / write access with no limits. Service composition reflects and extends the contract that an API is (a minimal sketch of this idea follows this list).
  • Analytics - Real-time metrics give API providers living views into API registration and usage, allowing an organic view into how resources are truly being used, and allowing for real-time adjustments, enforcement, and changes to service composition. This allows providers to respond to adverse conditions by tightening control, and to stimulate innovation by loosening control as needed.
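
To make the service composition piece more concrete, here is a minimal sketch, in Python, of how a provider might map API keys to service tiers that control methods and rate limits; the tier names, limits, and key store are all hypothetical.

```python
# A hypothetical sketch of service composition: keys map to tiers, and each
# tier defines what the contract allows (HTTP methods, rate limits).
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Tier:
    name: str
    allowed_methods: Set[str]
    hourly_limit: Optional[int]  # None means unlimited

TIERS = {
    "public": Tier("public", {"GET"}, hourly_limit=1000),
    "partner": Tier("partner", {"GET", "POST", "PUT"}, hourly_limit=None),
}

# In a real platform these would live in the API management layer.
API_KEYS = {"demo-public-key": "public", "demo-partner-key": "partner"}
USAGE = {}  # key -> calls made this hour

def authorize(api_key: str, method: str) -> bool:
    """Return True if this key's tier allows the request right now."""
    tier_name = API_KEYS.get(api_key)
    if tier_name is None:
        return False
    tier = TIERS[tier_name]
    if method not in tier.allowed_methods:
        return False
    calls = USAGE.get(api_key, 0)
    if tier.hourly_limit is not None and calls >= tier.hourly_limit:
        return False
    USAGE[api_key] = calls + 1
    return True

print(authorize("demo-public-key", "GET"))   # True, read access is allowed
print(authorize("demo-public-key", "POST"))  # False, public tier is read only
```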

All of these provider level building blocks work in concert to standardize the design, deployment, and management of common API resources. They provide a common backbone that makes APIs a living contract, one that is flexible enough to work with many different resources and agencies without being too rigid, providing the ability to innovate while still establishing desired levels of governance, which will vary from agency to agency and resource to resource.

At the API consumer level, you can enforce / encourage interoperability using common web API approaches:

  • Portal - Each API will have one or many portals where you access the APIs and supporting documentation. The availability of a portal, whether it is public or private, can reflect access and interoperability goals for API providers. Portal availability is the gateway to interoperability enforcement / encouragement.
  • Registration - Even if you have access to an API resource, and it is publicly available, most APIs require registration to obtain the required application keys or the necessary OAuth credentials. Registration can be self-service, invite only, or require approval, providing all levels of enforcement / encouragement of agreements.
  • Security - Each API provider has designated appropriate API security levels derived from a common pool of tools and standards. Consumers are guided by API provider security standards, and operate within API key restrictions, OAuth identity and access restrictions, and designated service composition frameworks that are linked to security access levels. Security is not just technology, it transcends the business and politics of API interoperability.
  • Terms of Service - Every API consumer is bound by the API TOS at the point of registration, and will be legally required to agree / adhere to future changes. 
  • Best Practices - Best practices provide a plain English explanation of the legal TOS for all API consumers, ensuring that all API consumers truly execute on the TOS that are set forth, because, you know, nobody reads the TOS.
  • Service Level Agreements (SLA) - Service level agreements can be extended to all API consumers as part of their service composition, registration, and TOS. SLAs provide necessary real-time expectations of system performance.
  • Analytics - Real-time metrics are provided to API consumers, letting them know the reality of their API consumption, and where they stand within service level agreements, service composition, and other aspects of API interoperability and agreements.

All the building blocks listed above provide the two-sides of the web API coin. Modern API initiatives from Amazon, Google, Twitter and thousands of other companies are proving that loosely coupled, modular approaches to API design, deployment and management provide a flexible, agile approach to interoperability that isn't as rigid as classic API approaches or networking protocols.

All of these building blocks work in concert to orchestrate interoperability that protects the interests of API providers and consumers, and even 3rd party intermediaries. You can even see this approach to interoperability move from the purely technical to meaningful reciprocity across providers, as we've seen with a newer generation of automation providers like If This Then That and Zapier, building on legacy ETL concepts, but bringing them into a new global, Internet era.

We have to establish case studies that will shift decision makers away from more rigid approaches. Without them we won't be able to achieve the flexibility that web APIs bring, and we will be left with heavy-handed Tech + Legal governance.

So to directly answer your questions:

1) Do US Gov Web APIs *require* system interconnectivity agreements?  Which policies and conditions? 

NO!

2) Can system interconnectivity agreements be mitigated to a common agreement instead of agency-specific? (i.e.  single information sharing agreement, and/or Acceptable Use Policies, common Terms of Service per API provider )

YES!

3) Which policies/guidance can be leveraged to prevent the specific use of "system interconnectivity agreements" in the rest of the US Gov Web API space?
The patterns exist in the private sector and are slowly emerging in government. We just need to work to identify, standardize, and establish the case studies we need to steer away from the specific use of "system interconnectivity agreements".

What Is Next For The US Government API Strategy, Getting Technical

Data.gov has continued to evolve, adding data sets, agencies, and features. With recent, high profile stumbles like Healthcare.gov, it can be easy to fall prey to historical stereotypes that government can't deliver tech very well. While this may apply in some cases, I think we can confidently get behind the movement that is occurring at Data.gov, with 176 agencies working to add 54,723 data sets in the last 12 months.

I feel pretty strongly that before we look towards the future of what the roadmap looks like for Data.gov, we need to spend a great deal of time refining and strengthening what we currently have available at Data.gov, and across the numerous government agency developer areas. Even with these feelings, I can't help but think about what is needed for the next couple years of Data.gov, in a more technical sense.

Maybe I'm biased, but I think the next steps for Data.gov are all about APIs. How do we continue evolving Data.gov, and prepare to not just participate, but lead in the growing API economy?

Management Tools for Agencies
We need to invest in management tools for agencies, commercial providers as well as API Umbrella. Agencies need to be able to focus on the best quality data sets and API designs, and not have to worry about the more mundane necessities of API management like access, security, documentation and portal management. Agencies should have consistent analytics, helping them understand how their resources are being accessed and put to use. If OSTP, OMB, GSA and the public expect consistent results when it comes to open data and APIs from agencies, we need to make sure they have the right management tools.

Endpoint Design Tools for Data Sets
Agencies should be able to go from dataset to API without much additional work. Tools should be made available for easily mounting published datasets, then allowing non-developers to design API endpoints for easily querying, filtering, accessing, and transforming datasets. While downloading data will still be the preferred path for many developers, making high value datasets available via APIs will increase the number of people who access them, people who may not have the time to deal with the overhead of downloads.
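
As a sketch of what such a tool might generate, assuming nothing more than a published CSV file, something like the following would already let an agency expose a dataset as a simple, filterable API; the facilities.csv dataset and its fields are made up.

```python
# A hypothetical sketch: mount a published CSV dataset as a simple query API.
import csv

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the dataset once at startup; "facilities.csv" is just a placeholder.
with open("facilities.csv", newline="") as handle:
    ROWS = list(csv.DictReader(handle))

@app.route("/facilities")
def facilities():
    # Any query parameter becomes a simple equality filter, e.g. ?state=OR
    results = ROWS
    for field, value in request.args.items():
        results = [row for row in results if row.get(field) == value]
    return jsonify(results)

if __name__ == "__main__":
    app.run()
```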

Common Portal Building Blocks
When you look through each of the 50+ federal agency /developer areas, you see 50+ different approaches to delivering the portals that developers will depend on to learn about, and integrate with, each agency's APIs. A common set of building blocks is needed to help agencies standardize how they deliver their developer portals. Each individual approach might make sense within each agency, but as a consumer, when you try to work across agencies it can be a confusing maze of API interactions.

Standard Developer Registration
As a developer, I currently need to establish a separate relationship with each federal agency. This quickly becomes a barrier to entry, one that will run off even the most seasoned developers. We want to incentivize developers to use as many federal APIs as possible, and providing them with a single point of registration, and a common credential that works across agencies, will stimulate integration and adoption.

Standard Application Keys
To accompany the standard developer registration, a standard approach to user and application keys is needed across federal agencies. As a user, I should be able to create a single application definition and receive API keys that will work across agencies. The amount of work required to develop my application and manage multiple API keys will prevent developers from adopting multiple federal agency APIs. Single registration and application keys will reduce the barriers to entry for the average developer when looking to build on top of federal API resources.

Developer Tooling and Analytics
When integrating with private sector APIs, developers enjoy a wide range of tools and analytics that assist them in discovering, integrating, managing, and monitoring their application's integration with various APIs. This is something that is very rare when integrating with federal government APIs. Standard tooling and analytics for developers need to become part of the standard operating procedures for federal agency /developer initiatives, helping developers be successful in all aspects of their usage of government open data and APIs.

Swagger, API Blueprint, RAML
All APIs produced in government should be described using one of the common API definition formats that have emerged, like Swagger, API Blueprint, and RAML. These all provide a machine readable description of an API and its interface that can be used in discovery, interactive documentation, code libraries, SDKs, and many other ways. Many private sector companies are doing this, and the federal government should follow their lead.
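
For example, a machine readable Swagger description of a hypothetical agency budget API could be as small as this sketch; the paths and parameters are illustrative, not any actual agency's specification.

```yaml
# A minimal, hypothetical Swagger 2.0 description of an agency budget API.
swagger: "2.0"
info:
  title: Agency Budget API
  version: "1.0.0"
basePath: /api/v1
paths:
  /budget:
    get:
      summary: List budget line items for the agency
      produces:
        - application/json
      parameters:
        - name: year
          in: query
          type: integer
          description: Fiscal year to filter on
      responses:
        "200":
          description: A list of budget line items
```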

Discoverability, Portable Interfaces and Machine Readable by Default
As with open data, APIs need to be discoverable, portable, and machine readable by default. Describing APIs in Swagger, API Blueprint, and RAML will do this, allowing APIs to be presented, distributed, and reused in new ways. This will allow each agency to publish their own APIs, while aggregators can take machine readable definitions from each and publish them in a single location. This approach will allow for meaningful interactions, such as with budget APIs, allowing a single budget API site to exist, providing access to every federal agency's budget without having to go to each /developer area, and there are many more examples like this that will increase API usage and extend the value of government APIs. If you are familiar with data.json, think api.json.
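
I won't pretend to know exactly what an api.json should contain, but as a rough sketch, loosely following the spirit of data.json, it could be as simple as an index like this; every field name and URL here is just a guess.

```json
{
  "name": "Example Agency",
  "description": "Master index of APIs published by this agency",
  "apis": [
    {
      "name": "Agency Budget API",
      "baseURL": "https://api.example-agency.gov/v1/budget",
      "swagger": "https://example-agency.gov/api/budget/swagger.json",
      "documentation": "https://example-agency.gov/developer/budget"
    }
  ]
}
```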

API Hubs
A new breed of API directories have emerged. API portals like Mashape and API Hub don’t just provide a directory of APIs, they simplify API management for providers and integration for API consumers. Federal agencies need to make their APIs friendly to these API hubs, maintaining active profiles on all platforms and keeping each API listed within the directories and actively engaging consumers via the platforms. Federal agencies shouldn’t depend on developers coming to their /developer areas to engage with their APIs, agencies need to reach out where developers are already actively using APIs.

Consistent API Interface Definition Models
Within the federal government, each API is born within its own agency's silo. Very little sharing of interface designs and data models occurs across agencies, resulting in APIs that may do the same thing, but can potentially do it in radically different ways. Common APIs such as budget or facility directories should use a common API interface design and underlying data model. Agencies need to share interface designs, and work together to make sure the best patterns across the federal government are used.

Webhooks
In the federal government, APIs are often a one-way street, allowing developers to come and pull information. To increase the value of data and other API driven resources, and help reduce the load on agency servers, APIs need to push data out to consumers, reducing the polling on APIs and making API integration much more real-time. Technologies such as webhooks, which allow an API consumer to provide a web URL to which agencies can push newly published data, changes, and other real-time events, are being widely used to make APIs much more of a two-way street.
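
The provider side of a webhook is about this simple; here is a hypothetical Python sketch where an agency pushes a change event to every URL that consumers have registered. The subscriber list and payload shape are made up, but the pattern is just an HTTP POST when something changes.

```python
# Hypothetical sketch of the provider side of a webhook: when a dataset is
# updated, push the event to every URL that consumers have registered.
import requests

# In practice these would come from a developer registration system.
SUBSCRIBER_URLS = [
    "https://example-consumer.com/hooks/data-updated",
]

def notify_subscribers(dataset_id: str, change_type: str) -> None:
    payload = {"dataset": dataset_id, "change": change_type}
    for url in SUBSCRIBER_URLS:
        try:
            requests.post(url, json=payload, timeout=5)
        except requests.RequestException:
            # A real system would retry or disable failing endpoints.
            pass

notify_subscribers("agency-budget-2014", "updated")
```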

Hypermedia
As the world of web APIs evolves, there are new approaches emerging for delivering the next generation of APIs, and hypermedia is one of these trends. Hypermedia brings a wealth of value, but most importantly it provides a common framework for APIs to communicate, and provides essential business logic and direction for developers, helping them better use APIs in line with API provider goals. Hypermedia has the potential to not just make government assets and resources available, but to ensure they are used in the best interest of each agency. Hypermedia is still getting traction in the private sector, but we are also seeing a handful of government groups take notice. Hypermedia holds a lot of potential for federal agencies, and the groundwork and education around this trend in APIs needs to begin.
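
As a small illustration of what hypermedia adds, here is a sketch of a budget record response that carries its own links, loosely in the HAL style; the endpoint, fields, and link relations are all made up.

```python
# A hypothetical sketch of a hypermedia (HAL-style) response: the record tells
# the client what it can do next, instead of leaving that logic to documentation.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/budget/<int:item_id>")
def budget_item(item_id):
    return jsonify({
        "id": item_id,
        "program": "Example Program",
        "amount": 1250000,
        "_links": {
            "self": {"href": f"/budget/{item_id}"},
            "agency": {"href": "/agency/example"},
            "next": {"href": f"/budget/{item_id + 1}"}
        }
    })

if __name__ == "__main__":
    app.run()
```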

Evangelists Present
The first thing you notice when you engage with a government API is that nobody is home. There is nobody to help you understand how it works or overcome obstacles when integrating, and there is no face behind the blog posts, the tweets, or the forum replies. Federal government APIs have no personality. Evangelists are desperately needed to bring this critical human element to federal government APIs. All successful private sector APIs have an evangelist, or an army of evangelists, spreading the word, supporting developers, and making things work. We need open data and API evangelists at every federal agency, letting us know someone is home.

Read / Write
I know this is a scary one for government, but as I said above in the webhooks section, APIs need to be a two-way street. There are proven ways to make APIs writeable without jeopardizing the integrity of API data. Allow for trusted access, and let developers prove themselves. There is a lot of expertise, “citizen cycles”, and value available in a developer ecosystem. When a private sector company uses federal government data and improves on it, the government and the rest of the ecosystem should be able to benefit as well. The federal government needs to allow for both read and write on APIs. This will be the critical next step that makes government APIs a success.

These are just 14 of my recommendations for the next steps in the API strategy for the federal government. As I said earlier, none of this should be done without first strengthening what we have already done in the federal government around open data and APIs. However, even though we need to play catch up on what's already there, we can't stop looking towards the future and understanding what needs to come next.

None of these recommendations are bleeding edge, or technology just for technology's sake. This is about standardizing how APIs are designed, deployed, and managed across the federal government, emulating what is already proven to work in the private sector. If the federal government wants to add to the OpenData500, and establish the meaningful stories needed to deliver on the promise of open data and APIs, this is what's needed.

With the right leadership, education, and evangelism, open data and APIs can become part of the DNA of our federal government. We have to put aside our purely techno-solutionist view of #OpenGov and realize this is seriously hard work, with many pitfalls, challenges, and politics, and that in reality it won't happen overnight. However, if we dedicate the resources needed, we can not just change how government works, making it machine readable by default, we can forever alter how the private and public sectors partner.


Generating The Utility APIs I Need For Each Developer Portal

I'm working on an API for the Free Application for Federal Student Aid (FAFSA) form. I'm working my way through a document from the Department of Education called the 2013-2014 Application Processing System Specifications for Software Developers.

I'm isolating various datasets and data models, trying to get a feel for all the API resources I will need for the FAFSA API. The specification provides several tables of data and data models for me to use, including the actual fields for the FAFSA form.

One dataset that is provided in the PDF is a list of states. It makes sense that a list of states is included for any developer who is building a federal student aid application. And I have a states API available elsewhere, which I've used in several other applications, so building this API for the project will be easy.

Next I'm thinking...how do I replicate this utility API easily for an external API project like this? One of the architectural considerations for the FAFSA API is to make an API that anyone could fork and easily set up for their own use. That includes the states API resource.

This got me thinking about APIs in general. When it comes to APIs, they are often attached to brands, like the Twilio API or Twitter API. We do not yet think of individual API definitions as ubiquitous, as utilities, or as commodities. However, there are a lot of common API resources like states, IP addresses, counties, cities, and many more that are essential to many web and mobile applications.

Sure, these datasets are widely available, but you still need to design and deploy the actual API. How do we make utility APIs like this deployable anywhere, anytime? I should be able to select from a library of common API definitions and easily fork, then deploy to my AWS EC2, AWS CloudFormation, or OpenShift platform. I shouldn't have to rely on 3rd party services for all of these resources.
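
What I'm imagining is something as small as this sketch, a self-contained states API that carries its own data and can be forked and deployed to any platform; the two-entry dataset is just a placeholder for the full list.

```python
# A hypothetical, self-contained states API that could be forked and deployed
# anywhere -- the data ships with the code, so there is no external dependency.
from flask import Flask, jsonify

app = Flask(__name__)

# Trimmed for the example; a real deployment would bundle all 50 states.
STATES = [
    {"abbreviation": "CA", "name": "California"},
    {"abbreviation": "OR", "name": "Oregon"},
]

@app.route("/states")
def list_states():
    return jsonify(STATES)

@app.route("/states/<abbreviation>")
def get_state(abbreviation):
    for state in STATES:
        if state["abbreviation"] == abbreviation.upper():
            return jsonify(state)
    return jsonify({"error": "state not found"}), 404

if __name__ == "__main__":
    app.run()
```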

Of course, not all APIs should be this portable, but there are a number of very common APIs that could be. I should be able to launch a new API portal for a public audience, or even a private development group, and easily publish the unique, as well as the utility, APIs they will need to be successful, in a single, easy to navigate API area.


IRS Modernized e-File (MeF): A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy (DRAFT)

Download as PDF

The Internal Revenue Service is the revenue arm of the United States federal government, responsible for collecting taxes, and for the interpretation and enforcement of the Internal Revenue Code.

The first income tax was assessed in 1862 to raise funds for the American Civil War, and over the years the agency has grown and evolved into a massive federal entity that collects over $2.4 trillion each year from approximately 234 million tax returns.

While the IRS has faced many challenges in its 150 years of operations, the last 40 years have demanded some of the agency's biggest transformations at the hands of technology, more than at any time since its creation.

In the 1970s, the IRS began wrestling with the challenge of modernizing itself using the latest computer technology. This eventually led to a 1986 pilot program for a new Electronic Filing System (EFS), which aimed in part to gauge the acceptance of such a concept by tax preparers and taxpayers.

By the 1980s, tax collection had become very complex, time-consuming, costly, and riddled with errors, due to what had become a dual process of managing paper forms while also converting these into a digital form so that they could be processed by machines. The IRS desperately needed to establish a solid approach that would enable the electronic submission of tax forms.

It was a rocky start for the EFS, and Eileen McCrady, systems development branch and later marketing branch chief, remembers, “Tax preparers were not buying any of it--most people figured it was a plot to capture additional information for audits." But by 1990, IRS e-file operated nationwide, and 4.2 million returns were filed electronically. This proved that EFS offered a legitimate approach to evolving beyond a tax collection process dominated by paper forms and manual filings.

Even Federal Agencies Can't Do It Alone

Even with the success of early e-file technology, the program would not have gotten the momentum it needed without the support of two major tax preparation partners--H&R Block and Jackson-Hewitt. These partnerships helped change the tone of EFS efforts, making it more acceptable and appealing to tax professionals. It was clear that e-File needed to focus on empowering a trusted network of partners to submit tax forms electronically, sharing the load of tax preparation and filing with 3rd party providers. And this included not just the filing technology, but a network of evangelists spreading the word that e-File was a trustworthy and viable way to work with the IRS.

Bringing e-File Into The Internet Age

By 2000, Congress had passed IRS RRA 98, which contained a provision setting a goal of an 80% e-file rate for all federal tax and information returns. This, in effect, forced the IRS to upgrade the e-File system for the Internet age, otherwise it would not be able to meet the mandate. A working group was formed, comprised of tax professionals and software vendors who would work with the IRS to design, develop, and implement the Modernized e-File (MeF) system, which employed the latest Internet technologies, including a new approach to web services that used XML and allowed 3rd party providers to submit tax forms in a real-time, transactional approach (this differed from the batch submissions required in a previous version of the EFS).

Moving Beyond Paper One Form At A Time

Evolving beyond 100 years of paper processes doesn't happen overnight. Even with the deployment of the latest Internet technologies, you have to incrementally bridge the legacy paper processes to a new online, digital world. After the deployment of MeF, the IRS worked year by year to add the myriad of IRS forms to the e-File web service, allowing software companies, tax preparers, and corporations to digitally submit forms into IRS systems over the Internet. Form by form, the IRS was being transformed from a physical document organization into a distributed network of partners that could submit digital forms through a secure, online web service.

Technological Building Blocks

The IRS MeF solution represents a new approach to using modern technology by the federal government in the 21st century Internet age. In the last 15 years, a new breed of Internet enabled software standards have emerged that enable the government to partner with the private sector, as well as other government agencies, in ways that were unimaginable just a decade ago.

Web Services

Websites and applications are meant for humans. Web services, also known as APIs, are meant for other computers and applications. Web services have allowed the IRS to open up the submission of forms and data into central IRS systems, while also transmitting data back to trusted partners regarding errors and the status of form submissions. Web services allow the IRS to stick with what it does best, receiving, filing, and auditing tax filings, while trusted partners can use web services to deliver e-Filing services to customers via custom developed software applications.

Web services are designed to utilize existing Internet infrastructure used for everyday web operations as a channel for delivering trusted services to consumers around the country, via the web.

An XML Driven Communication Flow

XML is a way to describe each element of an IRS form and its supporting data. XML makes paper forms machine readable, so that the IRS and 3rd party systems can communicate using a common language. It allows the IRS to share a common set of logic around each form, then use what are known as schemas to validate the XML submitted by trusted partners against a set of established business rules that enforce the IRS code. XML gives the IRS the ability to communicate with 3rd party systems using digital forms, applying business rules to reject or accept the submitted forms, which can then be stored in an official IRS repository in a way that can be viewed and audited by IRS employees (using stylesheets which make the XML easily readable by humans).
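
To illustrate the validation step, here is a rough Python sketch of checking a submitted XML return against a published schema before accepting it; the file names are placeholders, not actual MeF artifacts.

```python
# Hypothetical sketch of schema validation: reject a submitted XML form unless
# it validates against the published schema for that form.
from lxml import etree

def validate_submission(xml_path: str, schema_path: str) -> bool:
    """Return True if the submitted XML conforms to the schema."""
    schema = etree.XMLSchema(etree.parse(schema_path))
    document = etree.parse(xml_path)
    if schema.validate(document):
        return True
    # In a real pipeline, these errors would go back to the transmitter.
    for error in schema.error_log:
        print(error.message)
    return False

# Example usage with placeholder file names:
# validate_submission("return-1040.xml", "form-1040-schema.xsd")
```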

Identity and Access Management (IAM)

When you expose web services publicly over the Internet, secure authentication is essential. The IRS MeF system is a model for securing the electronic transmission of data between the government and 3rd party systems. The IRS has employed a design of the Internet Filing Application (IFA) and Application to Application (A2A) which are features of the Web Services-Interoperability (WS-I) security standards. Security of the MeF system is overseen by the IRS MITS Cyber Security organization which ensures all IRS systems receive, process, and store tax return data in a secure manner. MeF security involves an OMB mandated Certification and Accreditation (C&A) Process, requiring a formal review and testing of security safeguards to determine whether the system is adequately secured.

Business Building Blocks

To properly extend e-File web services to partners isn't just a matter of technology. There are numerous building blocks required that are more business than technical, ensuring a healthy ecosystem of web service partners. With a sensible strategy, web services need to be translated from tech to business, allowing partners to properly translate IRS MeF into e-filing products that will deliver required services to consumers.

Four Separate e-Filing Options

MeF provided the IRS with a way to share the burden of filing taxes with a wide variety of trusted partners, software developers and corporations who have their own software systems. However MeF is just one tool in a suite of e-File tools. These include Free File software that any individual can use to submit their own taxes, as well as free fillable digital forms that individuals can use if they do not wish to employ a software solution.

Even with these simple options, the greatest opportunity for individuals and companies is to use commercial tax software that walks one through what can be a complex process, or to depend on a paid tax preparer who employs their own commercial version of tax software. The programmatic web service version of e-file is just one option, but it is the heart of an entire toolkit of software that anyone can put to use.

Delivering Beyond Technology

The latest evolution of the e-file platform has technology at heart, but it delivers much more than just the transmission of digital forms from 3rd party providers, in ways that also make good business sense:

  • Faster Filing Acknowledgements - Transmissions are processed upon receipt and acknowledgements are returned in near real-time, unlike the once or twice daily system processing cycles in earlier versions
  • Integrated Payment Option - Tax-payers can e-file a balance due return and, at the same time, authorize an electronic funds withdrawal from their bank accounts, with payments being subject to limitations of the Federal Tax Deposit rules
  • Brand Trust - Allowing MeF to evolve beyond just the IRS brand, letting new trusted commercial brands step up and deliver value to consumers, like TurboTax and TaxAct

Without improved filing results for providers and customers, easier payment options and an overall set of expectations and trust, MeF would not reach the levels of e-Filing rates mandated by Congress. Technology might be the underpinning of e-File, but improved service delivery is the thing that will seal the deal with both providers and consumers.

Multiple Options for Provider Involvement

Much like the multiple options available for tax filers, the IRS has established tiers of involvement for partners in the e-File ecosystem. Depending on the model and capabilities, e-File providers can step up and participate in multiple ways:

  • Electronic Return Originators (EROs) - EROs prepare returns for clients, or collect returns from taxpayers who have prepared their own, then begin the electronic transmission of returns to the IRS
  • Intermediate Service Providers - These providers process tax return data that originates from an ERO or an individual taxpayer, and forward it to a transmitter.
  • Transmitters - Transmitters are authorized to send tax return data directly to the IRS, from custom software that connect directly with the IRS computers
  • Online Providers - Online providers are a type of transmitter that sends returns filed from home by taxpayers using tax preparation software to file common forms
  • Software Developers - write the e-file software programs that follow IRS specifications for e-file.
  • Reporting Agents - An accounting service, franchiser, bank or other person that is authorized to e-file Form 940/941 for a taxpayer.

The IRS has identified the multiple ways it needed help from an existing, evolving base of companies and organizations. The IRS has been able to design its partner framework to best serve its mission, while also delivering the best value to consumers, in a way that also recognizes the incentives needed to solicit participation from the private sector and ensure efforts are commercially viable.

Software Approval Process

The IRS requires all tax preparation software used for preparing electronic returns to pass the requirements of Modernized e-File Assurance Testing (ATS). As part of the process, software vendors notify the IRS via the e-help Desk that they plan to commence testing, then provide a list of all forms that they plan to include in their tax preparation software; the IRS does not require that vendors support all forms. MeF integrators are allowed to develop their tax preparation software based on the needs of their clients, while using pre-defined test scenarios to create test returns that are formatted in the specified XML format. Software integrators then transmit the XML formatted test tax returns to the IRS, where an e-help Desk assister checks data entry fields on the submitted return. When the IRS determines the software correctly performs all required functions, the software is approved for electronic filing. Only then are software vendors allowed to publicly market their tax preparation software as approved for electronic filing -- whether for use by corporations, tax professionals, or individual users.

State Participation

Another significant part of the MeF partnership equation is providing seamless interaction with the electronic filing of both federal and state income tax returns at the same time. MeF provides the ability for partners to submit both federal and state tax returns in the same "taxpayer envelope", allowing the IRS to function as an "electronic post office" for participating state revenue services -- certainly better meeting the demands of the taxpaying citizen. The IRS model provides an important aspect of a public / private sector partnership with the inclusion of state participation. Without state level participation, any federal platform will be limited in adoption and severely fragmented in integration.

Resources

To nurture an ecosystem of partners, it takes a wealth of resources. Providing technical how-to guides, templates, and other resources for MeF providers is essential to the success of the platform. Without proper support, MeF developers and companies are unable to keep up with the complexities and changes of the system. The IRS has provided the resources needed for each step of the e-Filing process, from onboarding, to understanding the addition of the latest forms, to changes in the tax code.

Market Research Data

Transparency of the MeF platform goes beyond individual platform operations, and the IRS acknowledges this important aspect of building an ecosystem of web service partners. The IRS provides valuable e-File market research data to partners by making available e-file demographic data and related research and surveys. This important data provides valuable insight for MeF partners to use in their own decision making process, but also provides the necessary information partners need to educate their own consumers as well as the general public about the value the e-File process delivers. Market research is not just something the IRS needs for its own purposes; this research needs to be disseminated and shared downstream providing the right amount of transparency that will ensure healthy ecosystem operations.

Political Building Blocks

Beyond the technology and business of the MeF web services platform, there are plenty of political activities that will make sure everything operates as intended. The politics of web service operations can be as simple as communicating properly with partners, providing transparency, or all the way up to security, proper governance of web service, and enforcement of federal laws.

Status

The submission of over 230 million tax filings annually requires a significant amount of architecture and connectivity. The IRS provides real-time status of the MeF platform for the public and partners, as they work to support their own clients. Real-time status updates of system availability keeps partners and providers in tune with the availability of the overall system, allowing them to adjust availability with the reality of supporting such a large operation. Status of availability is an essential aspect of MeF operations and overall partner ecosystem harmony.

Updates

An extension of MeF platform status is the ability to keep MeF integrators up-to-date on everything to do with ongoing operations. This includes providing alerts when partners need to tune in to specific changes in tax law, resource additions, or other relevant operational news. The IRS also provides updates via an e-newsletter, providing a more asynchronous way for the IRS MeF platform to keep partners informed about ongoing operations.

Updates over the optimal partner channels are an essential addition to real-time status and other resources that are available to platform partners.

Roadmap

In addition to resources, status and regular updates of platform status of the overall MeF system, the IRS provides insight into where the platform is going next, keeping providers apprised with what is next for the e-File program. Establishing and maintaining the trust of MeF partners in the private sector is constant work, and requires a certain amount of transparency -- allowing partners to anticipate what is next and make adjustments on their end of operations. Without insight into what is happening in the near and long term future, trust with partners will erode and overall belief in the MeF system will be disrupted, unraveling over 30 years of hard work.

Governance

The Modernized e-File (MeF) programs go through several stages of review and testing before they are used to process live returns. When new requirements and functionality are added to the system, testing is performed by IRS's software developers and by IRS's independent testing organization. These important activities ensure that the electronic return data can be received and accurately processed by MeF systems. Every time an IRS tax form is changed and affects the XML schema, the entire development and testing processes are repeated to ensure quality and proper governance.

Security

Secure transmission by 3rd parties to the MeF platform is handled by the Internet Filing Application (IFA) and Application to Application (A2A), which are part of the IRS Modernized System Infrastructure, providing access to trusted partners through the Registered User Portal (RUP). Transmitters using IFA are required to use their designated e-Services user name and password in order to log into the RUP. Each transmitter also establishes an Electronic Transmitter Identification Number (ETIN) prior to transmitting returns. Once the transmitter successfully logs into the RUP, a Secure Socket Layer (SSL) Handshake Protocol allows the RUP and transmitter to authenticate each other and negotiate an encryption algorithm, including cryptographic keys, before any return data is transmitted. The transmitter and the RUP negotiate a secret encryption key for encrypted communication between the transmitter and the MeF system. As part of this exchange, MeF will only accommodate one type of user credential for authentication and validation of A2A transmitters: a username and X.509 digital security certificate. Users must have a valid X.509 digital security certificate obtained from an IRS authorized Certificate Authority (CA), such as VeriSign or IdenTrust, then have their certificates stored in the IRS directory using an Automated Enrollment process.

The entire platform is accredited by the Executive Level Business Owner, who is responsible for the operation of the MeF system, with guidance provided by the National Institute of Standards and Technology (NIST). The IRS MITS Cyber Security organization and the business system owner are jointly responsible and actively involved in completing the IRS C&A Process for MeF, ensuring complete security of all transmissions with MeF over the public Internet.

A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy

The IRS MeF platform provides a technological blueprint that other federal agencies can look to when exposing valuable data and resources to other agencies as well as the private sector. Web services, XML, and proper authentication can open up access and interactions between trusted partners and the public in ways that were never possible prior to the Internet age.

While this web services approach is unique within the federal government, it is a common way to conduct business operations in the private sector -- something widely known as Service Oriented Architecture (SOA), an approach that is central to a healthy enterprise architecture. A services oriented approach allows organizations to decouple resources and data and open up very wide or granular levels of access to trusted partners. The SOA approach makes it possible to submit forms, data, and other digital assets to government, using XML as a way to communicate and validate information in a way that supports proper business rules, wider governance, and the federal law.

SOA provides three essential ingredients for public and private sector partnership:

  • Technology - Secure usage of modern approaches to using compute, storage and Internet networking technology in a distributed manner
  • Business - Adherence to government lines of business, while also acknowledging the business needs and interest of 3rd party private sector partners
  • Politics - A flexible understanding and execution of activities involved in establishing a distributed ecosystem of partners, and maintaining an overall healthy balance of operation

The IRS MeF platform employs this balance at a scale that is currently unmatched in the federal government. MeF provides a working blueprint that can be applied across the federal government, in areas ranging from the veterans claims process to the financial regulatory process.

The United States federal government faces numerous budgetary challenges and must find new ways to share the load with other federal and state agencies, as well as the private sector. An SOA approach like MeF allows the federal government to better interact with existing and future contractors in a way that provides better governance, while also allowing for partnerships with the private sector that go beyond simple contracting. The IRS MeF platform encourages federal investment in a self-service platform that enables trusted and proven private sector partners to access IRS resources in predefined ways -- all of which support the IRS mission, while providing enough incentive that 3rd party companies will invest their own money and time into building software solutions that can be fairly sold to US citizens.

When an agency builds an SOA platform, it is planting the seeds for a new type of public / private partnership whereby government and companies can work together to deliver software solutions that meet a federal agency's mission and the market needs of companies. This also delivers value and critical services to US citizens, all the while reducing the size of government operations, increasing efficiencies, and saving the government and taxpayers money.

The IRS MeF platform represents 27 years of the laying of a digital foundation, building the trust of companies and individual citizens, and properly adjusting the agency's strategy to work with private sector partners. It has done so by employing the best of breed enterprise practices from the private sector. MeF is a blueprint that cannot be ignored and deserves more study, modeling, and evangelism across the federal government. This could greatly help other agencies understand how they too can employ an SOA strategy, one that will help them better serve their constituents.

You Can View, Edit, Contribute Feedback To This Research On Github


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, so I depend on my network to help me know what is going on.