Extension:ArchiveLinks/Project/UserStories

Theme: Render external links with a "cache" link in MediaWiki articles.
 * Render external links differently. ✅
 * Render external links differently based on configuration in LocalSettings.php ✅
 * Create a sample config that would work with www.archive.org. ✅
 * Create a sample config that would work with wikiwix.org. ✅
 * Create a sample config that would work with a local spidering system.
 * Internationalize any UI that needs it (the word that appears in the "archive" link, anything else?) ✅
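
The rendering stories above could be sketched as a post-processing step over parsed HTML. In the real extension this would live in a PHP parser/linker hook, with the archive URL prefix and the (internationalized) link label coming from LocalSettings.php; the `ARCHIVE_PREFIX` value, CSS class names, and regex below are illustrative assumptions, not the extension's actual code:

```python
import re

# Hypothetical archive service prefix; in the extension this would be
# configurable in LocalSettings.php (e.g. pointing at archive.org or wikiwix).
ARCHIVE_PREFIX = "https://archive.example.org/?url="

def add_cache_links(html, label="cache"):
    """Append a "[cache]" link after every rendered external link.

    `label` stands in for the i18n message the extension would use.
    """
    pattern = re.compile(r'(<a class="external[^"]*" href="([^"]+)">.*?</a>)')

    def repl(match):
        anchor, url = match.group(1), match.group(2)
        cache = '<a class="archive-link" href="%s%s">[%s]</a>' % (
            ARCHIVE_PREFIX, url, label)
        return anchor + " " + cache

    return pattern.sub(repl, html)
```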

Theme: queueing links for spidering
 * On article save, get external links. ✅
 * On article save, get external links, place into a queue. ✅
 * Write another program that consumes links from the queue and prints them to the screen. ✅
 * Ensure that two instances of this program invoked at the same time don't contend with each other.
 * Create a permanent blacklist for domains we don't want to spider. ✅ (the blacklist table is checked but there is no UI to populate it)
 * Ensure that any Wiki administrator can edit this blacklist.
 * Ensure that we never queue blacklisted links for archival. ✅
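
A minimal sketch of the queueing stories, assuming a SQLite-style layout for illustration (the real extension would use MediaWiki's own database tables and maintenance scripts; the `el_archive_queue` and `el_archive_blacklist` names are hypothetical). The compare-and-set UPDATE in `claim_next` is one way to keep two concurrently invoked consumers from working on the same row:

```python
import sqlite3
from urllib.parse import urlparse

# Hypothetical schema for the queue and the permanent domain blacklist.
SCHEMA = """
CREATE TABLE IF NOT EXISTS el_archive_queue (
    queue_id INTEGER PRIMARY KEY,
    url TEXT NOT NULL,
    claimed INTEGER NOT NULL DEFAULT 0
);
CREATE TABLE IF NOT EXISTS el_archive_blacklist (domain TEXT PRIMARY KEY);
"""

def queue_links(conn, urls):
    """On article save: queue external links, skipping blacklisted domains."""
    blacklist = {row[0] for row in
                 conn.execute("SELECT domain FROM el_archive_blacklist")}
    for url in urls:
        if urlparse(url).hostname in blacklist:
            continue  # never queue blacklisted links for archival
        conn.execute("INSERT INTO el_archive_queue (url) VALUES (?)", (url,))
    conn.commit()

def claim_next(conn):
    """Claim one queued link without contending with a second consumer."""
    while True:
        row = conn.execute(
            "SELECT queue_id, url FROM el_archive_queue "
            "WHERE claimed = 0 ORDER BY queue_id LIMIT 1").fetchone()
        if row is None:
            return None  # queue drained
        cur = conn.execute(
            "UPDATE el_archive_queue SET claimed = 1 "
            "WHERE queue_id = ? AND claimed = 0", (row[0],))
        conn.commit()
        if cur.rowcount == 1:  # we won the race; a losing process retries
            return row[1]
```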

Theme: spidering a link and storing HTML.
 * Expand the program above to invoke wget to spider the link.
 * Store the fetched files in a permanent location.
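
One way the consumer might shell out to wget and choose a permanent storage location: hash the URL for a stable directory and add a timestamp so repeated spiderings don't overwrite each other. The `ARCHIVE_ROOT` path and layout are assumptions for this sketch:

```python
import hashlib
import os
import subprocess
import time

ARCHIVE_ROOT = "/var/cache/archivelinks"  # hypothetical storage root

def archive_path(url):
    """Derive a stable, permanent on-disk location for a URL snapshot."""
    digest = hashlib.sha1(url.encode("utf-8")).hexdigest()
    stamp = time.strftime("%Y%m%d%H%M%S")
    return os.path.join(ARCHIVE_ROOT, digest[:2], digest, stamp)

def spider(url):
    """Invoke wget to fetch the page along with its page requisites."""
    dest = archive_path(url)
    os.makedirs(dest, exist_ok=True)
    return subprocess.call(
        ["wget", "--page-requisites", "--convert-links",
         "--timeout=30", "--directory-prefix", dest, url]) == 0
```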

Theme: linking to the stored HTML
 * Create web handler for archived links stored locally.
 * Make it show a header, like Google Cache, with a placeholder for the content.
 * If the local file doesn't exist, use JavaScript to load the URL. (into an iframe?)
 * If JavaScript can't find it, show an error message.
 * Make the "cache" links in articles point to the corresponding locally archived copies.
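
The handler's response could be assembled roughly like this, as a pure-function sketch (the real handler would be a PHP web entry point; the banner markup and the iframe/JavaScript fallback here are illustrative assumptions):

```python
# Google-Cache-style header shown above every archived page.
BANNER = '<div class="archive-banner">Archived copy of %s</div>'

# Fallback when no local file exists: load the live URL into an iframe,
# and reveal an error message if that also fails.
FALLBACK = (
    '<p id="archive-error" hidden>This page could not be archived.</p>'
    '<iframe src="%s" '
    'onerror="document.getElementById(\'archive-error\').hidden=false">'
    '</iframe>')

def render_archive_page(url, cached_html):
    """Build the response body: header banner, then either the stored
    snapshot or the JavaScript/iframe fallback to the live URL."""
    body = cached_html if cached_html is not None else FALLBACK % url
    return BANNER % url + body
```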

Theme: putting it all together
 * Create a way for links to be spidered automatically.
 * Retry with exponential backoff if the URL 404s.
 * Develop heuristics to decide whether to re-spider a link or reuse the previously cached copy.
 * Develop heuristics to avoid spidering and storing links that are determined to contain malware.
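
The backoff and re-spidering heuristics above might look like the sketch below. All constants (one-hour base delay, 30-day cap, weekly refresh) are placeholder assumptions, not decisions the project has made:

```python
# Hypothetical retry/refresh policy constants (all times in seconds).
BASE_DELAY = 3600           # one hour after the first failure
MAX_DELAY = 30 * 86400      # cap retries at once per 30 days
REFRESH_AFTER = 7 * 86400   # re-spider successful snapshots weekly

def next_retry_delay(failures):
    """Exponential backoff for URLs that 404: 1h, 2h, 4h, ... capped."""
    return min(BASE_DELAY * 2 ** (failures - 1), MAX_DELAY)

def should_respider(last_success, last_attempt, failures, now):
    """Heuristic: refresh stale snapshots, but back off on failing URLs."""
    if failures:
        return now - last_attempt >= next_retry_delay(failures)
    return now - last_success >= REFRESH_AFTER
```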