Tuesday, May 7, 2013

Adventures in OAuth for Securing REST API Services

So you want to build a REST API that the rest of the world can use.  Maybe you have an internal application that can benefit from integrating with your external customers' and vendors' systems.  Even if the services you plan to build have no outside users, you may have internal groups that need to consume these resources.  Clearly, securing these resources is a top priority, and using an open standard has many benefits over creating something proprietary.  However, once you take a cursory glance at OAuth, you realize there's a lot more to it than some token exchanges and signatures.  In fact, there is more than one version floating around, each with different flows depending on the type of entity trying to consume the service.  Feeling overwhelmed, you wonder if just hacking together something that works for your specific situation will be adequate.  However, deep down you know it will come back to bite you if you go down that path.  And after all, you're a software engineer; this should be in your wheelhouse, right?  So you hunker down and start reading...

Taking a Step Back

After reading a lot of articles, code, RFCs, and specific implementations (Yahoo, Google, Twitter, to name a few), I decided to try to decompose the problem into a few pieces:

  • Access Authorization - This is concerned with controlling access to the resources provided by the API. There are generally two levels of authorization:

    • Client - This level identifies the party authorized to access your services. The only context is the client; no user-based data is available with this level of access. The purpose of this control is to be able to attach policies to the client, such as call limits, usage metering, and restrictions on the available services.

    • User/Owner - This level enables creating a context to provide access to user-based data. The purpose of this control is to ensure clients can only access user data that the owner of the data has explicitly granted them access to.  The process of granting/revoking access is not concerned with verifying the user's identity.

  • User Authentication - This is the process of verifying that the user is actually the owner of the data they want to access. Various strategies can be employed to verify the data owner's identity.  Because authentication is typically invoked as part of the authorization process, some coupling is required to properly transition between the services and change state.

  • Request Verification - This area is concerned with ensuring that requests are authentic. The goal is to thwart attacks that would grant access to a party other than the one that should have it.  This process can be difficult to implement since both the client and server libraries must agree on the strategy, and it puts more burden on the server to maintain some state about prior requests.
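To make the request-verification idea concrete, here's a minimal sketch of a replay guard that rejects requests whose timestamp falls outside an allowed window or whose (client, nonce) pair has already been seen. The in-memory hash and the 300-second window are illustrative assumptions; a real service would use a shared, expiring cache.

```ruby
# Replay guard sketch: a request is "fresh" only if its timestamp is recent
# and its (client, nonce) pair has never been seen before.
class ReplayGuard
  WINDOW = 300 # seconds of allowed clock skew (an illustrative choice)

  def initialize
    @seen = {} # { [client_id, nonce] => timestamp }; in-memory for the sketch
  end

  def fresh?(client_id, nonce, timestamp, now = Time.now.to_i)
    return false if (now - timestamp).abs > WINDOW # stale or future-dated
    key = [client_id, nonce]
    return false if @seen.key?(key)                # nonce replayed
    @seen[key] = timestamp
    true
  end
end
```

This is the state burden mentioned above: the server must remember nonces at least as long as the timestamp window, which is why the window is kept short.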

A Closer Look

The next step is to look at existing standards to see how each problem is addressed.  Since OAuth is the predominant standard in use, it's a good starting point.  You can argue over the specifics of how version 1 or version 2 does things better or worse, but my primary concern is to look at how each area is addressed and try to establish a common denominator among solutions.


Here's a very typical flow for authorizing access to protected owner data.  The primary goal is not to reveal anything about the owner except what's available through the scope defined in the final access key.  The combination of consumer and owner defines the relationship and level of access.  All versions of OAuth define an authorization flow to gain access to a user's protected data.  While there are clear differences between flows, the main take-away is that this all boils down to three main steps: Initialize -> Authorize -> Finalize.

OAuth 1.0 Authentication Flow
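The Initialize -> Authorize -> Finalize shape can be sketched as a toy state machine. The token names, storage, and verifier here are illustrative only, not the wire format of any particular OAuth version.

```ruby
require "securerandom"

# Toy state machine for the three-step authorization shape:
# initialize_grant (client gets a temporary token) -> authorize (owner
# approves, verifier issued) -> finalize (token exchanged for access).
class AuthorizationFlow
  def initialize
    @grants = {} # request_token => { client:, state:, owner:, verifier: }
  end

  # Step 1: the client asks for a temporary request token.
  def initialize_grant(client_id)
    token = SecureRandom.hex(8)
    @grants[token] = { client: client_id, state: :initialized }
    token
  end

  # Step 2: the resource owner approves the request; a verifier is issued.
  def authorize(request_token, owner_id)
    grant = @grants.fetch(request_token)
    grant[:state] = :authorized
    grant[:owner] = owner_id
    grant[:verifier] = SecureRandom.hex(4)
  end

  # Step 3: the client trades the authorized token for an access token.
  def finalize(request_token, verifier)
    grant = @grants.fetch(request_token)
    raise "not authorized" unless grant[:state] == :authorized
    raise "bad verifier" unless grant[:verifier] == verifier
    grant[:state] = :finalized
    { access_token: SecureRandom.hex(8), owner: grant[:owner] }
  end
end
```

Note that step 2 is the only place the owner appears, which is what keeps the consumer from learning anything about the owner beyond the granted scope.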


There's nothing new about validating a user's identity.  Just about every application does it, and various strategies exist to perform this task.  What you end up with depends on the sensitivity of the data you're protecting.  The details of how you protect accounts, manage passwords, and control access rights should be neatly tucked away in this service.  It does, however, need to be aware of an authorization request and properly affirm that authentication was successful so that process can continue.
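As one example of what gets tucked away inside this service: storing only a salted, stretched hash of each password and comparing in constant time. The iteration count, key length, and helper names below are illustrative choices, not a recommendation of specific parameters.

```ruby
require "openssl"
require "securerandom"

ITERATIONS = 100_000 # illustrative work factor
KEY_LEN = 32

# Derive a salted PBKDF2-HMAC-SHA256 hash; only salt + digest get stored.
def hash_password(password, salt = SecureRandom.random_bytes(16))
  digest = OpenSSL::PKCS5.pbkdf2_hmac(password, salt, ITERATIONS, KEY_LEN,
                                      OpenSSL::Digest.new("SHA256"))
  [salt, digest]
end

# Constant-time comparison so timing doesn't leak how much of a guess matched.
def secure_equal?(a, b)
  return false unless a.bytesize == b.bytesize
  a.bytes.zip(b.bytes).map { |x, y| x ^ y }.sum.zero?
end

def verify_password(password, salt, stored_digest)
  candidate = OpenSSL::PKCS5.pbkdf2_hmac(password, salt, ITERATIONS, KEY_LEN,
                                         OpenSSL::Digest.new("SHA256"))
  secure_equal?(candidate, stored_digest)
end
```

The rest of the system only ever calls something like `verify_password`; it never sees how credentials are stored.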


It is important to verify the authenticity of each request made to your API.  Assuming that all inbound requests are original and unmanipulated is naive.  OAuth 1.0 dedicates a large portion of its specification to constructing verifiable requests, and the burden is on the service provider to implement the strategies it details.  Even if you choose another authorization solution (like OAuth 2), adding a signature and a nonce/timestamp to each request makes a lot of sense.  Granted, there are definite complexity and performance concerns with adding this layer of validation, but all the effort spent securing the authorization and validating the user's credentials seems pointless if you don't check that the requests being processed are even valid.
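The core of the OAuth 1.0 approach is the signature base string: percent-encode the parameters, sort them, and join them with the method and base URI, then sign the result with HMAC-SHA1. A simplified sketch in the style of RFC 5849 (it omits details like duplicate parameter names and body parameters):

```ruby
require "openssl"
require "base64"

# Percent-encode everything except the RFC 5849 unreserved characters.
def oauth_encode(value)
  value.to_s.gsub(/[^A-Za-z0-9\-._~]/) do |c|
    c.bytes.map { |b| "%%%02X" % b }.join
  end
end

# Build METHOD&enc(base_uri)&enc(sorted "k=v" pairs joined with "&").
def signature_base_string(method, base_uri, params)
  normalized = params.map { |k, v| [oauth_encode(k), oauth_encode(v)] }
                     .sort
                     .map { |k, v| "#{k}=#{v}" }
                     .join("&")
  [method.upcase, oauth_encode(base_uri), oauth_encode(normalized)].join("&")
end

# Sign with HMAC-SHA1; the key is the encoded secrets joined with "&".
def hmac_sha1_signature(base_string, consumer_secret, token_secret = "")
  key = "#{oauth_encode(consumer_secret)}&#{oauth_encode(token_secret)}"
  Base64.strict_encode64(OpenSSL::HMAC.digest(OpenSSL::Digest.new("SHA1"),
                                              key, base_string))
end
```

Both sides must produce the byte-identical base string for the signatures to match, which is exactly why client and server libraries have to agree on the strategy.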

Putting It All Together

All three of these areas contribute to an overall solution for securing REST API services.  Depending on the version of OAuth you intend to use, the standard will address some, but not all, of these areas.  The challenge is to architect a solution that properly distinguishes each mechanism and does not overly couple the components to one another.  Additionally, since user interaction is implied in several of these areas, a presentation layer is necessary to enable user input.  In a pure services solution, these components will need to be properly separated to preserve the distinction between control logic and presentation.

After segmenting the solution into these different buckets, I'm still faced with building both server-side and client-side components to implement the different strategies.  As I've reviewed different solutions, most mix the presentation with the services logic.  This has made it difficult to find solutions that can be readily plugged into my environment without compromising the architecture I'm trying to achieve.  My goal is to have this solution sit as an initial layer in front of both the client and server logic so it's mostly transparent to the rest of the application, while providing a very simple interface that can be used to verify the system is in the correct state for the given context (public vs. private access, access scope, etc.).  Ensuring that services are distinct from presentation is a top priority.  As a starting point, I've identified the following major components that need to be built or integrated off-the-shelf:


Client-side: browser-based, using a BackboneJS framework to consume the REST API services, render the presentation, and manage user interaction.

  • OAuth Authorization Flow Controller - ensures all necessary access tokens are maintained, publishes some events, intercepts requests and adds necessary verification elements.

  • OAuth Adapters - implementation details for specific versions of OAuth and variations found in specific provider's solutions.

  • URI and Verification Utilities - helper libraries that implement the low-level processing required in an OAuth-based solution.

  • Authentication Controller - implements a login/authorization page for use with both external and internal consumers


Server-side: a Ruby-based stack with two distinct servers, one to deliver assets to the browser-based client and the other to implement the REST API services.

  • OAuth Provider - expose services to implement the authorization flow

  • Identity/Authentication - expose services to provide user identity verification and information

  • OAuth Verification - rack middleware to verify request authenticity and setup the user context for all downstream handlers

  • Authorization Library - common interface for all services to interact with to manage state and provide information
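The verification middleware is the piece that makes the security layer "mostly transparent" to the rest of the stack. A sketch of the idea as a Rack-style component; the header names and the pluggable `signature_checker` are assumptions for illustration, not any existing library's API.

```ruby
# Rack-style middleware sketch: verify the request before any service logic
# runs, and stash the verified client in the env for downstream handlers.
class OAuthVerification
  def initialize(app, signature_checker)
    @app = app
    @checker = signature_checker # callable: (client_id, signature, env) -> bool
  end

  def call(env)
    client = env["HTTP_X_CLIENT_ID"]     # hypothetical header names
    signature = env["HTTP_X_SIGNATURE"]
    unless @checker.call(client, signature, env)
      return [401, { "Content-Type" => "text/plain" }, ["invalid signature"]]
    end
    env["oauth.client_id"] = client # downstream handlers read the context here
    @app.call(env)
  end
end
```

Because the middleware only touches the env, every service behind it gets the verified client context for free, which is the common interface the Authorization Library is meant to formalize.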

All of these points are discussions in themselves, and different solutions exist that can address each of them.  At this juncture, this is just a rough sketch of the direction that seems to make sense now.  The ultimate solution depends on the tools and libraries that currently exist and on finding an appropriate balance in how coupled the different parts of the solution need to be to retain maintainability and flexibility.

Until then, the adventure continues...