Authentication caching: How it reduces enterprise network congestion

Michael Cobb explores the pros and cons of authentication caching and whether the practice can truly calm network strain.

We're exploring a concept called authentication caching in which an appliance sits on the network edge and caches authentication requests. If you have a high volume of external users, the caching reduces internal traffic by minimizing hits to authentication, application and database servers. It's an interesting concept, but how mainstream is it, and is there anything we should be wary of?


Caching is a key technology for alleviating the computational and economic burdens on today's overstrained network infrastructures. Nearly all Web applications benefit from having static content cached, but Web 2.0 applications consist of dynamic, personalized content, so any caching strategy must be able to deliver both static and dynamically generated content to improve response times for feature-rich pages. High-traffic websites can benefit from authentication caching as well: it provides additional relief for back-end servers by handling authorization requests on behalf of protected application and database servers. The service typically runs on a machine close to the network perimeter and can also include a load-balancing component, providing a point-of-presence node that directs traffic flow, reduces congestion and balances the load across other services and systems.

When a network makes use of authentication caching, a user's request to the application server is only forwarded once the user has been authenticated and verified as having the necessary permissions to access the requested content. That authentication decision is then cached, reducing the need to repeatedly look up information that changes infrequently. Authentication is a relatively costly operation, so reducing the number of authentication lookups improves overall Web application performance and cuts page delivery times. In terms of popularity, authentication caching is quite common on high-volume websites, as it lets enterprises serve dynamic content faster, to more users and with fewer computing resources.
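The core idea can be sketched in a few lines. This is a hypothetical illustration, not any particular product's API: the costly call to the user registry runs only on a cache miss, and the cached decision is reused until it expires.

```python
import time

# Hypothetical sketch of authentication-decision caching: the expensive
# registry lookup runs only when no valid cached decision exists.
class AuthCache:
    def __init__(self, registry_lookup, ttl_seconds=300):
        self._lookup = registry_lookup   # costly call to the user registry
        self._ttl = ttl_seconds
        self._cache = {}                 # token -> (decision, expiry time)
        self.lookups = 0                 # registry hits, for illustration

    def is_authorized(self, token):
        now = time.monotonic()
        entry = self._cache.get(token)
        if entry is not None and entry[1] > now:
            return entry[0]              # cached decision, no registry hit
        self.lookups += 1
        decision = self._lookup(token)   # authenticate against the registry
        self._cache[token] = (decision, now + self._ttl)
        return decision
```

Repeated requests carrying the same token are then served from the cache, so the registry sees one lookup per user per TTL window rather than one per request.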

Authentication caching systems nearly always support auditing and create logs of invalid login and access attempts, as well as details of user activity and actions. It is critical to ensure the server is resourced adequately to handle the expected number of concurrent users. Enterprises should also monitor the session timeout value used -- the period of time after which inactive users must re-authenticate to regain access. Longer authentication cache timeout values can increase security risk, because a cached decision remains valid until it expires even if the user's access has since been revoked. Smaller timeout values, however, can hurt performance, since the server must consult the user registry more frequently.
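That trade-off is easy to demonstrate. In this hypothetical sketch (the names and the simulated clock are illustrative), the same user making three requests triggers one or two registry lookups depending on whether the TTL has elapsed:

```python
# Hypothetical illustration of the timeout trade-off: a shorter TTL means
# the user registry is consulted more often for the same user.
def make_cached_check(registry_lookup, ttl_seconds):
    cache = {}  # user -> (decision, expiry)

    def check(user, now):
        entry = cache.get(user)
        if entry is not None and entry[1] > now:
            return entry[0]          # still within the TTL: no registry hit
        decision = registry_lookup(user)
        cache[user] = (decision, now + ttl_seconds)
        return decision

    return check

registry_hits = {"count": 0}

def lookup(user):
    registry_hits["count"] += 1      # count each trip to the registry
    return True

check = make_cached_check(lookup, ttl_seconds=10)
check("alice", now=0)   # cache miss: registry hit
check("alice", now=5)   # within TTL: served from cache
check("alice", now=15)  # entry expired: registry hit again
```

With `ttl_seconds=10`, the three requests cost two registry lookups; a longer TTL would cost one but would also keep a stale decision alive longer.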

Depending on the product deployed, it may be necessary to modify how user-customized content is generated or displayed on a page -- for example, loading it via a second HTTP request performed via Ajax. This can increase the initial cost of deploying authentication caching because of the additional development and testing required. Often, pages are broken down into cacheable and non-cacheable fragments that are assembled into HTML pages when requested by end users. Drupal, for example, provides caching for authenticated users by setting an extra cookie at login that contains a hash of the user's roles; it can then serve cached pages based on those roles without having to check permissions on every request.
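The role-based approach described above can be approximated as follows. This is a simplified sketch inspired by the technique, not Drupal's actual code: the user's roles are hashed into a cache key, so all users sharing the same set of roles share the same cached pages.

```python
import hashlib

# Simplified sketch of role-based page caching: pages are cached per
# (path, role set) rather than per user.
def role_cache_key(roles):
    # Sort the roles so the key is independent of their ordering.
    return hashlib.sha256("|".join(sorted(roles)).encode()).hexdigest()

page_cache = {}  # (path, role key) -> rendered HTML

def serve_page(path, roles, render):
    key = (path, role_cache_key(roles))
    if key not in page_cache:
        page_cache[key] = render(path, roles)  # render only on a cache miss
    return page_cache[key]
```

Two users with the same roles then hit the same cache entry, and the page is rendered once for the whole role set instead of once per user.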
