How to Incorporate Humans.txt into Your WordPress Site

Have you ever asked yourself "Who made it?" or "How did they make it?" while looking at some pretty (or awful) website somewhere on the web? Of course, there are plenty of tools to help you guess, but what if developers had a straightforward way to sign the work they’ve done, even on client sites? That is where humans.txt comes in as a simple, legal and unobtrusive solution.

What is Humans.txt?

Humans.txt is a relatively new initiative, started by a Spanish team of enthusiasts, to give developers a "client-friendly" way to indicate their authorship of a site. The main idea is very simple: put all the information about the "humans behind the site" and the tools they used into a plain text file, name it humans.txt and place it in the root of your site (next to robots.txt). That makes it available at http://yourdomain.com/humans.txt, and by linking to the file somewhere on the site you make it discoverable.

What is the reason for doing this? When developing sites for clients, especially in the corporate field, it’s not always possible to leave a direct footprint on them. But humans.txt provides a simple and unobtrusive way to present information about the site’s authors to its visitors. Even Google does it, so why shouldn’t we?

Some Important Questions Before We Start

In this tutorial you will learn how to add a dynamically generated humans.txt to a WordPress installation. That means you don’t have to actually create a file, then edit and upload it each time you need to make changes. But why is this a better solution? When you’re dealing with just one or a few sites, a static file is good enough.

However, when you have a significant range of clients’ sites to support, keeping all that information concise and correct can be an unnecessary pain for such a simple task. So we are going to do it the WordPress way: generate the content of humans.txt with our code, make it filterable for probable future needs and serve it in response to the appropriate request.

Sounds good, but maybe there is a plug-in for that? Of course, the WordPress repository already has a number of plug-ins that solve this problem. But encapsulating every particular task into a separate plug-in on several sites will eventually leave you with an enormous number of plug-ins to watch, keep up to date, test for compatibility with your own code and so on. So we will do it ourselves.

Another point worth mentioning is where our code should go. We will not wrap it in a separate plug-in, for the reason mentioned above. So, should we put it into the functions.php of the active theme? That’s the simplest option available, but there is a better one.

If you have not already adopted the concept of a custom functionality plug-in to store all your tweaks and customisations for a particular WordPress installation, you should probably familiarise yourself with it, because it’s a significant improvement to a WordPress developer’s workflow. And it’s the most suitable place to include our code.

The Code

Let us first produce a simple plan to understand what we are going to do:

generate the content for humans.txt and make it filterable

make WordPress recognise a humans.txt request to prevent a 404 error

upon request display the generated content instead of loading a template file

include a link to humans.txt file in the head section of the site

bonus: include humans.txt badge with a link to a file in the footer of the site

We also have a few requirements to check:

WordPress has to be installed in the site root (because humans.txt should be in the root)

pretty permalinks have to be active in order to use dynamically rewritten URL

Inside do_humans we take care of sending the correct headers before any output. The do_humanstxt action allows us to modify this part in the future. Then we create some default content for display and register the humans_txt filter hook to allow changes to this content when needed. Finally, after taking care of line endings, we output the result.
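As a rough sketch of the steps just described (the default content here is purely illustrative; only the function, action and filter names come from the article):

```php
<?php
function do_humans() {

	// send the plain-text header before any output
	header( 'Content-Type: text/plain; charset=' . get_option( 'blog_charset' ) );

	// let other code act on the humans.txt request (e.g. change headers)
	do_action( 'do_humanstxt' );

	// default content, filterable via the 'humans_txt' hook
	$content = array(
		'/* TEAM */',
		'Name: ' . get_bloginfo( 'name' ),
		'Site: ' . home_url( '/' ),
		'',
		'/* SITE */',
		'Software: WordPress',
	);
	$content = apply_filters( 'humans_txt', $content );

	// join the lines with consistent line endings and output
	echo implode( "\n", (array) $content );
}
```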

Humanstxt.org provides various graphics we can use, so we just download one of them into our functionality plug-in folder (but make sure you use the correct path to this graphic file on your installation). Then we display this file as a linked image:
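A minimal version of such a badge function might look like this (the humanstxt.gif filename is an assumption; adjust it to whatever graphic you downloaded):

```php
<?php
function frl_humanstxt_badge() {

	// assumed filename: a humanstxt.org badge saved next to this plugin file
	$img = plugins_url( 'humanstxt.gif', __FILE__ );

	// print the badge linked to the humans.txt URL
	printf(
		'<a href="%s"><img src="%s" alt="Humans.txt" /></a>',
		esc_url( home_url( '/humans.txt' ) ),
		esc_url( $img )
	);
}
```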

Introducing New URL to WordPress

The frl_humanstxt_init function registers the humans.txt URL with WordPress. Let’s walk through it step by step.

First of all, we check for a root installation of WordPress by parsing the home URL of the site. Then we also check the permalink_structure option, which stores the structure of posts’ permanent links. It can be treated as an indicator that "pretty permalinks" are active on the installation, which is an obligatory requirement for our technique.

When pretty permalinks are in use, every requested link on a site - http://yourdomain.com/some_query - is dynamically rewritten by the server into the more programmatically obvious form http://yourdomain.com/?query_var=value. The rule that allows WordPress to match the "pretty" and "ugly" forms of the same page is a rewrite rule.

Every WordPress installation generates a set of such rules in the form regular_expression_to_match => actual_url_to_fetch and stores them in the rewrite_rules option. The WordPress Rewrite API allows developers to add their own rules to the default set to make WordPress recognise custom URL structures. With the add_rewrite_rule function we do exactly that: with our rule, the request http://yourdomain.com/humans.txt will be rewritten as http://yourdomain.com/?humans=1.

After adding our rewrite rule for the first time, we also have to force WordPress to regenerate the whole set of rules and write them to the database for future use. That can be achieved with the flush_rewrite_rules function which, for performance’s sake, we execute only after checking that the rule in question is absent from the rewrite_rules option.
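The opening of frl_humanstxt_init could therefore be sketched as follows (a rough reconstruction of the steps above, not the article’s verbatim listing):

```php
<?php
function frl_humanstxt_init() {
	global $wp;

	// require a root install: the path component of the home URL must be empty
	$path = parse_url( home_url(), PHP_URL_PATH );
	if ( ! empty( $path ) )
		return;

	// require pretty permalinks
	if ( ! get_option( 'permalink_structure' ) )
		return;

	// rewrite http://yourdomain.com/humans.txt to index.php?humans=1
	add_rewrite_rule( 'humans\.txt$', 'index.php?humans=1', 'top' );

	// flush the stored rules only if our rule is not there yet
	$rules = get_option( 'rewrite_rules' );
	if ( ! isset( $rules['humans\.txt$'] ) )
		flush_rewrite_rules();

	// ... query variable registration and hook setup follow
}
```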

//add 'humans' query variable to WP
$wp->add_query_var('humans');

What next? After transforming the requested URL into a programmatically understandable form, WordPress parses the requested variables to build a database query and stores the results in the global object $wp (and later $wp_query). If an unknown variable is found, it will not be stored and will not be able to affect the query.

To avoid this in our case, we register the humans query variable with WordPress using the add_query_var method of the environmental class WP. After that, WordPress will understand this variable; when requested, it will be stored in the environmental objects and be accessible to other functions. (For further details on the WordPress parsing mechanism, please refer to the excellent article by Ozh.)

It’s also a good time to check for the existence of the functions that print the humans.txt link(s) and hook them into wp_head and wp_footer respectively. That way we can be sure that our links will appear only when the humans.txt URL works properly.
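Inside frl_humanstxt_init this guarded hook-up might look like the following sketch (frl_humanstxt_link is an assumed name for the head-link printer; the `<link rel="author">` markup follows the humanstxt.org recommendation):

```php
<?php
// prints a discovery link in the document head
function frl_humanstxt_link() {
	printf(
		'<link rel="author" type="text/plain" href="%s" />' . "\n",
		esc_url( home_url( '/humans.txt' ) )
	);
}

// hook the printers only if they exist, so a missing helper never breaks the site
if ( function_exists( 'frl_humanstxt_link' ) )
	add_action( 'wp_head', 'frl_humanstxt_link' );

if ( function_exists( 'frl_humanstxt_badge' ) )
	add_action( 'wp_footer', 'frl_humanstxt_badge' );
```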

add_action('init', 'frl_humanstxt_init');

Finally, we hook our frl_humanstxt_init function into the init action to be sure that all our modifications are carried out early enough.

Template Loading

After setting up the rewriting mechanism, our code creates an is_humans conditional tag to help us detect a humans.txt request. It uses the get_query_var function, which retrieves the value of a particular query variable stored in the global object $wp_query. Happily, we have taken care of that beforehand.
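Such a conditional tag can be as small as this (a sketch, assuming the humans query variable registered earlier):

```php
<?php
// true when the current request was rewritten from /humans.txt
function is_humans() {
	return (bool) get_query_var( 'humans' );
}
```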

Now that WordPress can understand the humans.txt request, we should force it to load our dynamic content instead of the regular template (to simulate the behaviour of a regular .txt file). So we interfere with WordPress’s template loading process with our function frl_humanstxt_load.

It uses the previously mentioned is_humans conditional tag to detect the humans.txt request, then calls do_humans to output the content. Finally, we hook it into the template_redirect action.
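A sketch of that loader, following the steps just described:

```php
<?php
function frl_humanstxt_load() {

	// on a humans.txt request, print our content and stop
	// before any theme template is loaded
	if ( is_humans() ) {
		do_humans();
		exit;
	}
}
add_action( 'template_redirect', 'frl_humanstxt_load' );
```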

And that is all - we have completed our plan. On visiting the http://yourdomain.com/humans.txt URL, people should now see something like this:
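The exact output depends on the content you generate and filter; with the illustrative defaults used earlier, it would be along these lines:

```
/* TEAM */
Name: My WordPress Site
Site: http://yourdomain.com/

/* SITE */
Software: WordPress
```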

Conclusion

This tutorial has demonstrated how you can use the power of WordPress APIs, actions and filters to introduce a completely new URL to the system and make dynamically generated content appear there, simulating the behaviour of a simple .txt file. And all this we do to support a fresh and friendly initiative stating that "We are People, not Machines".

Anna Ladoshkina is a developer with a primarily statistical and analytical background, so she is interested first of all in approaches and techniques that allow her to apply this knowledge in her web projects.