For the last two months, I’ve been delaying updates to the front page and tag pages after publishing new posts. In the same period, RSS subscribers have gained a 12-hour exclusivity period on new posts. The post pages themselves have been published immediately so they can be accessed or shared by any subscriber.
This post is about a small-scale publishing strategy I’ve adopted on this site and the reasons behind it. This post isn’t for you if you don’t maintain a website or blog and don’t find content distribution and paid content interesting.
Why do this?
With this new publishing scheme, the post pages are published and remain accessible to anyone with a direct link. However, the link appears in the RSS feed half a day before it’s published on the front page. This creates an exclusive window in which the new post can only be discovered through the RSS feed, or through someone (or something) that subscribes to it.
Artificial scarcity is a common online marketing scheme for creating value out of nothing. Using artificial scarcity as an extra incentive to onboard new subscribers is something I’ve considered for a long time. RSS subscribers are some of the most valuable readers a site can have. Subscribers have opted into a steady stream of updates from the site, have already decided they like its content, and are very likely to become return visitors. Gaining new subscribers is thus very important to the success of any blog.
The shareability of the posts is itself an interesting feature. Readers who are aware of the scheme, I theorize, will be more likely to share a link to a post, as they know they would be linking to something not everyone has access to. They are resharing something exclusive. As website operators know, backlinks and attention on social media are key to any website’s success.
So, if you are an Atom feed subscriber, you may be reading this article hours before everyone else!
Secondary to gaining new subscriptions, I also find myself rooting for the RSS technology itself. Having an open and direct subscription standard is an important tool for the independent web (“indieweb”) and anyone who wants to publish on the web. RSS is a powerful and important tool, and I believe it’s worth preserving. The new publishing scheme I’m using is also an attempt to revitalize and renew interest in the technology.
How is it done?
This site is a static site generated using Nikola, meaning it consists of static documents generated as files on one machine and synced as-is to a server for public consumption. There are plenty of publishing tools like this, and the methods I describe here can easily be adapted to work with any static publishing system.
The deployment process (actual script below) performs the following steps:

1. Publish everything from output on the local machine to blog/public/ on the remote server, with a few exceptions: the new posts and the RSS feeds are included, but the webpages that link to the new post are excluded. The older copies of these delayed files remain untouched on the server for now.
2. Copy the delayed files (the webpages linking to the new post) to a separate, inaccessible staging directory on the server.
3. Schedule a task on the server that moves the delayed files from the staging directory to the public server directory twelve hours from now.
Below are the deployment commands I use with this site (NIKOLA_DEPLOY commands if you are using Nikola). They are mostly self-explanatory to anyone familiar with the rsync and ssh tools.
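The following is a sketch of those commands with the hostname, directory names, and exclude patterns replaced by placeholder assumptions; adapt them to your own site’s layout:

```sh
# Sketch only: the host, paths, and exclude patterns are assumptions.

# 1) Publish everything except the pages that link to the new post.
#    The post pages and the feeds go out immediately; the old front
#    page and tag pages stay untouched on the server for now.
rsync -av --exclude '/index.html' --exclude '/categories/' \
  output/ user@example.com:blog/public/

# 2) Copy the delayed pages (front page and tag pages) to a staging
#    directory that isn't publicly served.
rsync -avR output/./index.html output/./categories/ \
  user@example.com:blog/staging/

# 3) Queue a task on the server that moves the delayed pages into the
#    public directory twelve hours from now.
ssh user@example.com \
  "echo 'cp -a blog/staging/. blog/public/ && rm -r blog/staging/*' | at now + 12 hours"
```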
The scheduling is handled by the at queue (atq). This queued task is preserved across reboots and interruptions, assuming the system is somewhat sane. The files may be published too early if multiple deployments occur within the twelve-hour window.
However, I don’t consider this a problem as I only make a few posts per month anyway. The first task will execute and publish all updates on time. Any later tasks will fail, as the files will already have been published. This could be fixed by staging each scheduled deployment in its own timestamped directory instead of relying on a single staging directory.
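For example, a variation on the earlier sketch could create a new staging directory per deployment (again with a placeholder host and paths):

```sh
# Sketch only: give each deployment its own timestamped staging
# directory so that overlapping deployments can't publish each
# other's delayed pages early.
STAGING="blog/staging/$(date +%Y%m%dT%H%M%S)"
ssh user@example.com "mkdir -p $STAGING"
rsync -avR output/./index.html output/./categories/ \
  "user@example.com:$STAGING/"
ssh user@example.com \
  "echo 'cp -a $STAGING/. blog/public/ && rm -r $STAGING' | at now + 12 hours"
```

Each queued task then only ever touches its own staging directory, so overlapping deployments no longer interfere with each other.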
No privilege escalation is required beyond what is already needed for publishing files.
I’ll quickly mention that dynamic publishing tools like the ever-popular WordPress can achieve the same kind of system with some very small modifications. Using WordPress as an example, the loop in home.php or index.php can be tweaked to exclude posts published within the last 12 hours.
Monetization potential?
This publishing method also holds some potential for larger special-interest websites or popular blogs that seek to monetize their readerships.
Website paywalls, favored by unspecialized general news websites, are a publishing scheme where articles and other content are only available to users who are logged in and have paid a monthly fee. This strategy doesn’t work on the web, as every website depends on discoverability through search engines and social media.
A search engine can’t easily see through a paywall, and search engines are likely to penalize the ranking of sites requiring payment. Links to paywalled sites are also dead ends for any non-subscriber. So, what sane person would share a link to one?
I believe a limited exclusivity period followed by general availability and discoverability is a much better approach than a blanket paywall. Maintaining shareability and general availability is super important. Blocking everyone out of a website goes against some fundamental principles of the open Internet.
I can imagine a theoretical website giving free subscribers — via email, RSS, push notifications, or whatever technology fits their audience — a six-hour exclusive. That subscription could then be upgraded to a paid Gold subscription with 18 hours exclusivity, or even a more expensive Platinum subscription with 48-hour exclusivity.
True fans of a website adopting this publishing scheme would be more likely to pay, as they would get the content they like earlier and could share it with friends. As demonstrated in this post, it isn’t an especially complicated paywall to set up or maintain either, especially because leaks in the paywall are considered a benefit rather than a problem.
Patient readers, or those who can’t afford or use whatever payment method is required (not everyone has a credit card), could simply wait a bit longer and still get access. The information would still be accessible, and the website would still be the main source where everyone reads the content. Content scraping, the practice of stealing content from one website and publishing it on another with ads benefiting the thief, would also be less appealing, as the website that publishes the complete content first is favored in search rankings.
There are some interesting possibilities here that I believe should be more thoroughly explored.