User:BDavis (WMF)/Notes/Possible Work

=Things that I may or may not end up working on=
 * Add thumb.php to Vagrant -- DONE: see 78409, 52756 & 78830
 * Needs to use thumb_handler; that didn't work for me initially, but it will be important.
 * Full version of "old" image not purged from varnish cache when file deleted -- DONE: see 79914
 * Unknown error: "stasherror" for >100 MB PDF file upload to Commons
 * Add PdfHandler to role::multimedia
 * Add a Varnish role to Vagrant
 * Improve Vagrant support in general


 * Requests_for_comment/Refactor_on_File-FileRepo-MediaHandler


 * Use the job queue for metadata updates on ?action=purge for large files (or don't trigger them at all). The automatic metadata updates on version changes should use the queue too (though that's not much of an issue right now).
 * Purge pages that use Commons files when the files change.
 * Fix DatabaseBase::generalizeSQL. This is a bit tricky since it also can't be slow.
 * Reduce/remove toUnsigned usage in the IP class (code cleanup, and it behaves better on 32-bit systems)
 * Use delayed jobs to stagger cache purges instead of ignoring them via $wgMaxBacklinksInvalidate
 * Include a $wgPhpCli field in the installer
 * Make BagOStuff support an initial-value parameter in incr()
 * getDescriptionRenderUrl uses HTTPS for internal GET requests going through the whole stack. It would be nice to split the URL used internally from the one given to users, so that internal requests can at least bypass HTTPS. I think this has contributed to downtime in the past.
 * Audit and document parts of the code that handle large files poorly (like deletion/move/metadata update)
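The generalizeSQL item above is about collapsing literals so that structurally identical queries group under one profiling key, cheaply enough to run on every statement. A minimal sketch of the idea in Python; the function name, placeholder forms, and regexes are illustrative, not MediaWiki's actual implementation:

```python
import re

# Precompiled patterns: compiling once keeps per-query cost low, which
# matters because this would run on every profiled statement.
_SQL_STRING = re.compile(r"'(?:[^'\\]|\\.)*'")
_SQL_NUMBER = re.compile(r"(?<![\w'])-?\d+(?:\.\d+)?")
_SQL_SPACE = re.compile(r"\s+")

def generalize_sql(sql: str) -> str:
    """Collapse literals so structurally identical queries map to one key."""
    sql = _SQL_SPACE.sub(" ", sql.strip())
    sql = _SQL_STRING.sub("'X'", sql)   # string literals -> 'X'
    sql = _SQL_NUMBER.sub("N", sql)     # numeric literals -> N
    return sql
```

Replacing strings before numbers keeps digits inside quoted values from being touched twice, and the lookbehind avoids mangling identifiers like user2.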
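For the $wgMaxBacklinksInvalidate item: instead of silently dropping invalidations past a cap, the backlink set could be chunked into jobs whose delays increase batch by batch. A sketch under assumed names (PurgeJob and stagger_purges are hypothetical, not MediaWiki classes):

```python
from dataclasses import dataclass

@dataclass
class PurgeJob:
    """Hypothetical delayed job: purge a batch of page IDs after `delay` seconds."""
    page_ids: list
    delay: int

def stagger_purges(page_ids, batch_size=100, spacing=60):
    """Split a large backlink set into delayed jobs instead of dropping it.

    Each successive batch is delayed by `spacing` more seconds, so the
    cache/CDN sees a steady trickle of purges rather than one burst.
    """
    jobs = []
    for i in range(0, len(page_ids), batch_size):
        batch = page_ids[i:i + batch_size]
        jobs.append(PurgeJob(page_ids=batch, delay=(i // batch_size) * spacing))
    return jobs
```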
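The BagOStuff incr item amounts to: a miss should seed the counter instead of failing, so callers don't need their own add-then-incr dance. A toy in-memory sketch of those semantics (the exact method name and signature here are assumptions, not BagOStuff's API; real memcached has no atomic "incr or seed", so a backend would typically try incr and fall back to add on a miss):

```python
class DictBagOStuff:
    """Toy dict-backed cache sketching incr-with-initial-value semantics."""

    def __init__(self):
        self._data = {}

    def incr_with_init(self, key, step=1, init=None):
        """Increment `key` by `step`; on a miss, seed it with `init`
        (defaulting to `step`) instead of failing."""
        if key not in self._data:
            self._data[key] = step if init is None else init
        else:
            self._data[key] += step
        return self._data[key]
```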


 * Change thumb caching so that it's all in the CDN and not in the replicated store
 * Use bucket storage so that a single purge covers all variants and we don't need a list of file names


 * Upload-from-URL keeps state in the session store, like chunked upload does; it would be nice to kill that.
 * Likewise, kill the session-store usage in chunked upload too (it lives in memcached).
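One way to drop the session dependency, sketched below: key the upload status on user ID plus upload ID in the shared object cache, so any app server can read it and nothing accumulates in session data. The key scheme and function names are assumptions for illustration, not the actual MediaWiki code:

```python
CACHE = {}  # stand-in for a shared object cache (e.g. memcached)

def _status_key(user_id: int, upload_id: str) -> str:
    # Hypothetical key scheme: any app server can compute it, so upload
    # status no longer needs to live in (and bloat) the user's session.
    return f"upload-status:{user_id}:{upload_id}"

def set_upload_status(user_id, upload_id, status):
    """Record progress for one in-flight upload in the shared cache."""
    CACHE[_status_key(user_id, upload_id)] = status

def get_upload_status(user_id, upload_id):
    """Fetch progress, or None if this upload is unknown."""
    return CACHE.get(_status_key(user_id, upload_id))
```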


 * Extension:GWToolset
 * commons:Commons:GLAMToolset_project