I don't usually tweet about #SEO, but this is important. Is a static robots.txt file good for WordPress performance? Let's find out!
To display the default robots.txt, a fresh WordPress install will: 🧵
Run 15 SQL queries: load all options, can_compress_scripts, WPLANG, query the last 10 posts, the recent posts widget options, the terms, taxonomies, etc. for the found posts, all post metadata for the found posts, the recent comments widget, the recent entries widget, and a few others.
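Want to check the query count on your own install? A rough sketch, dropped in as a must-use plugin (SAVEQUERIES is a real core constant, but it's optional here; it just adds the full query list):

```php
<?php
// mu-plugin sketch: log how many SQL queries each request runs.
// $wpdb->num_queries is always available; define( 'SAVEQUERIES', true )
// in wp-config.php if you also want the full list in $wpdb->queries.
add_action( 'shutdown', function () {
    global $wpdb;
    error_log( sprintf(
        '%d SQL queries for %s',
        $wpdb->num_queries,
        $_SERVER['REQUEST_URI'] ?? '(unknown)'
    ) );
}, PHP_INT_MAX );
```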
Call file_get_contents() 61 times to register some core Gutenberg blocks. That's on top of a json_decode() for each file, and about 100 file_exists() calls on those files.
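For context, each bundled block ships a block.json that gets found, read and decoded on every request. A simplified sketch of the per-block work, not the actual core code (the real logic lives in register_block_type_from_metadata()):

```php
<?php
// Simplified sketch; the paragraph block is just one example.
$block_json = ABSPATH . WPINC . '/blocks/paragraph/block.json';

if ( file_exists( $block_json ) ) {            // ~100 of these per request
    $metadata = json_decode(                   // one decode per block
        file_get_contents( $block_json ),      // 61 of these per request
        true
    );
    // ...the decoded metadata then becomes a registered WP_Block_Type.
}
```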
It will run gettext translation lookups: 1,917 for regular strings and 875 for strings with context. I'm so lucky I'm using the default locale. Oh, and exactly 0 of those strings end up in robots.txt.
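Those lookups are just the usual translation calls; every __() and _x() goes through the gettext layer even on the default locale. Illustrative example only, the strings here are made up:

```php
<?php
// Regular string lookup -- one of the ~1,917 calls.
$label = __( 'Recent Posts' );

// String-with-context lookup -- one of the ~875 calls.
$heading = _x( 'Comments', 'column name' );
```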
Check whether the front page has been set as a static page, and whether the request is_front_page() or is_home(). Also is_single(), is_feed(), is_admin(), is_category(), is_search(), and the list goes on.
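Roughly what those checks look like, as a sketch; none of the request conditionals can ever be true for /robots.txt:

```php
<?php
// Is a static page set as the front page?
$static_front = ( 'page' === get_option( 'show_on_front' ) );

if ( is_front_page() || is_home() ) {
    // front page / blog index handling
} elseif ( is_single() || is_feed() || is_admin() || is_category() || is_search() ) {
    // ...all resolved anyway, even though only is_robots() applies here.
}
```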
Check whether the user is logged in (14 times), whether we need to display an admin bar, and the heartbeat settings. It will also attempt to read the user session and create 3 nonces. Reminder: this is an anonymous request.
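A sketch of the kind of calls involved; the nonce action name here is made up:

```php
<?php
// Runs even for an anonymous robots.txt request.
if ( is_user_logged_in() && is_admin_bar_showing() ) {
    // never reached for an anonymous visitor
}

// Nonces are tied to the current user/session, so creating one forces
// WordPress to resolve the (non-existent) session first.
$nonce = wp_create_nonce( 'some_action' ); // action name is illustrative
```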
Escape some HTML 78 times. Reminder: robots.txt is served as text/plain; there's no HTML in it. It will check whether the admin needs to be forced to SSL. It will also initialize smilies and the Twenty Twenty-One "dark mode".
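What that work looks like, roughly (values are illustrative):

```php
<?php
// One of the 78 escaping calls -- pointless for a text/plain response.
$title = esc_html( 'Recent Comments' );

// One of the setup checks -- also irrelevant to robots.txt.
if ( force_ssl_admin() ) {
    // admin traffic gets pushed to HTTPS
}

// And the smiley lookup table is built by smilies_init() on every request.
```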
Run 83 unique actions (one of them is do_robotstxt) and apply 530 unique filters (one of them is robots_txt).
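Those two hooks are the only ones that matter for the output. If you do stay dynamic, this is how you'd customize it; the extra rule is just an example:

```php
<?php
// Fires right before the rules are generated.
add_action( 'do_robotstxt', function () {
    // header is already text/plain at this point
} );

// Filters the generated rules; $public reflects the
// "discourage search engines" setting.
add_filter( 'robots_txt', function ( $output, $public ) {
    $output .= "Disallow: /example-private/\n"; // example rule
    return $output;
}, 10, 2 );
```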
All combined, that's over 42,000 function calls, 5.46 MB of peak memory, and about 100 ms of wall time. So yes, by all means, please use a static robots.txt file.
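For reference, a static file matching the default output looks something like this (the domain is a placeholder, and the Sitemap line depends on your WordPress version and plugins). Drop it in the web root and the web server answers before WordPress even loads:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml
```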