Twitter is one of the most ubiquitous social networks out there. As such, it's not uncommon to see a list of related tweets accompanying an article or blog post to provide additional context on the subject matter at hand.
I recently implemented a Twitter widget on a Drupal site that has a significant amount of traffic, and had to weigh my options carefully with regard to several concerns:
- Twitter might be unavailable at any point
- The service limits the number of requests that can be made per hour, and the limit is tied to the requester's external IP address
- Fetching and processing incoming feed data in a production environment could be taxing
Here are the options I considered, along with the pros and cons I weighed while determining my ultimate solution.
Twitter's Embedded Widget

This option provides the most real-time display of Twitter content possible, since data is fetched live from Twitter at the moment of the page request. All processing happens client side, so there is virtually no impact on server performance. The widget requires very little technical experience and very little time to implement. With some basic development skill, it's possible to swap the list or account used based on taxonomy terms or node fields in order to provide additional context to the tweets displayed. Implementation of this widget is also CMS-agnostic.
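For illustration, the client-side approach boils down to a small markup snippet following Twitter's documented embed format (the account name below is a placeholder):

```html
<!-- Placeholder account; swap in the account or list to display. -->
<a class="twitter-timeline" data-tweet-limit="5"
   href="https://twitter.com/ExampleAccount">Tweets by @ExampleAccount</a>
<!-- Twitter's script converts the anchor into a rendered timeline in the
     visitor's browser, so the web server never contacts Twitter at all. -->
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
```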
Twitter Pull Module
There is a useful Drupal module called Twitter Pull that does exactly that -- it pulls Twitter feeds for you and displays the desired tweets in a block.
This fetches content server-side and caches the last good state, which prevents blank blocks -- there will always be content. Account and/or list overrides based on theme settings, taxonomy terms, node fields, etc. are still possible. When the block is cached, the page actually renders a little faster in the browser, since the content is already present in the markup. It takes little time to enable the module and place the block, and this option requires the least technical know-how of any listed here.
This requires Drupal to fetch the content periodically (the cache TTL is configurable). In tests performed in a development environment, this added roughly 0.3 seconds of load time whenever the tweets block was not cached, since Drupal has to make an HTTP request to Twitter before the page can render. Content also updates more slowly than with the near-real-time client-side widget. And since all Twitter requests come from the web server, the hourly rate limit applies to all users in aggregate rather than to each client individually -- only a problem if there are enough user+list combinations across the site to generate many uncached requests. This is especially worrisome on shared hosting environments, where API requests from every site on the server count against the same external IP address, not just those from this one site.
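The fetch-and-fall-back behavior described above can be sketched in a few lines. This is a minimal Node.js illustration of the pattern, not the module's actual code; `fetchTweets` is a stand-in for the real HTTP request to Twitter.

```javascript
// Serve cached tweets while fresh, refetch after the TTL expires, and
// fall back to the last good state if the fetch fails -- so the block
// is never blank just because Twitter is down or rate-limited.
class TweetCache {
  constructor(fetchTweets, ttlSeconds) {
    this.fetchTweets = fetchTweets;
    this.ttlMs = ttlSeconds * 1000;
    this.lastGood = null; // last successfully fetched tweets
    this.fetchedAt = 0;   // timestamp of the last successful fetch
  }

  async get(now = Date.now()) {
    const fresh = this.lastGood !== null && now - this.fetchedAt < this.ttlMs;
    if (fresh) return this.lastGood;            // cache hit: no HTTP request
    try {
      this.lastGood = await this.fetchTweets(); // cache miss: hit Twitter
      this.fetchedAt = now;
    } catch (err) {
      // Twitter unavailable or rate-limited: keep serving the stale copy.
    }
    return this.lastGood;
  }
}
```

Note that the unlucky visitor who triggers the refetch pays the extra HTTP round-trip to Twitter, which is exactly the ~0.3-second penalty observed above.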
Custom CDN Integration
Assuming the CDN is stable, this is the most reliable option in terms of always delivering content. Even if the hourly API limit is reached, the CDN will hold on to the last known good state and serve it until the next hour, when new content can be fetched again. CDNs are also distributed, so while some users may be affected by the hourly limit, users in a different region (where the CDN's requests to Twitter come from a different IP address) won't be.
This requires far more customization than any other option, to say nothing of the additional technical skill involved. There may be bandwidth costs associated with serving the proxied Twitter content from the CDN. And content from Twitter will lag: the CDN cache TTL, short as it may be, prevents real-time fetches from Twitter whenever the CDN already has a cached response.