I wish there were a mechanism for preventing the addition of certain internal links, similar to the spam list for external links. If a user tries to save text containing references to certain pagenames, they get a message strongly recommending against it. There should also be a line explaining why the link is unwanted and what to do instead, e.g.
[[Recieve]] - misspelling, use [[Receive]] instead
- Automatically prevent duplicating categories
- Without any bots.
- Automatically prevent wikification and creation of misspelled pagenames
- (As in the example.) This forces the user to correct the spelling on input; another advantage over the
create=sysop method is that misspelled pages remain findable through the search box, since the "spam-listed" pages are working redirects.
- Prevent attempts to repost known non-notable topics
- A user sees a red link and creates the page; some even ignore the deletion log message, while others would have to look in the discussion.
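The save-time check behind all three points could be sketched roughly as follows. This is a minimal illustration, not MediaWiki's actual API; the `BAD_PAGES` mapping and the function name are hypothetical:

```python
import re

# Hypothetical blacklist: bad pagename -> explanation shown to the user.
BAD_PAGES = {
    "Recieve": "misspelling, use [[Receive]] instead",
}

# Capture the target of a [[wikilink]], ignoring pipes and section anchors.
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def check_links(wikitext):
    """Return warning lines for links to 'spam-listed' internal pages."""
    warnings = []
    for match in WIKILINK.finditer(wikitext):
        target = match.group(1).strip()
        if target in BAD_PAGES:
            warnings.append(f"[[{target}]] - {BAD_PAGES[target]}")
    return warnings

print(check_links("See [[Recieve]] and [[Receive]]."))
# A warning is produced only for the blacklisted [[Recieve]].
```

On save, the engine would show these warnings instead of (or before) committing the edit.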
Technically, unreferable pages could look much like normal redirects, or not be redirects at all. Any user would be able to create one simply by writing
#BADREDIRECT instead of
#REDIRECT in the page's first line, with the explanation in the second line. If there is no target after
#BADREDIRECT, the page does not act as a redirect in search.
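A parser for this proposed page format could look like the sketch below. The `#BADREDIRECT` syntax is the one proposed here, not anything MediaWiki actually supports, and the function name is made up for illustration:

```python
def parse_bad_redirect(page_text):
    """Parse a hypothetical #BADREDIRECT page.

    Line 1: '#BADREDIRECT [[Target]]' (the target is optional);
    line 2: the explanation shown in warnings.
    Returns (target_or_None, explanation), or None if the page
    is not a bad redirect at all.
    """
    lines = page_text.splitlines()
    if not lines or not lines[0].startswith("#BADREDIRECT"):
        return None
    rest = lines[0][len("#BADREDIRECT"):].strip()
    # No target means the page does not act as a redirect in search.
    target = rest.strip("[]") or None
    explanation = lines[1].strip() if len(lines) > 1 else ""
    return (target, explanation)

print(parse_bad_redirect("#BADREDIRECT [[Receive]]\nCommon misspelling."))
```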
The explanation of why there is no need for such a page is shown (unlike the content of a redirect), and it is included in the message shown on an attempt to create a link to the page. A special template could add JS links that insert bits of text into the edit form for convenience.
This warning can be suppressed by an option or simply ignored, and it is of course not shown to bots (though it could be included on demand in some queries). The same trick as for the normal spam list could be applied: if the "bad" pages were already referenced before the edit, they don't trigger warnings.
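The "already referenced before the edit" trick reduces to a set difference between the links in the old and new revisions. A minimal sketch, with the regex and function name assumed for illustration:

```python
import re

WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def new_bad_links(old_text, new_text, bad_pages):
    """Report only bad links introduced by this edit, mirroring the
    spam-list behaviour: links already present before the edit are
    tolerated and do not trigger warnings."""
    old_links = {m.group(1).strip() for m in WIKILINK.finditer(old_text)}
    new_links = {m.group(1).strip() for m in WIKILINK.finditer(new_text)}
    return (new_links - old_links) & set(bad_pages)

print(new_bad_links("[[Recieve]]", "[[Recieve]] [[Teh]]", {"Recieve", "Teh"}))
# Only the newly added bad link is reported.
```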
The disadvantage is that the engine needs to look up an extra database field while resolving links on a saved page.