
There is a JavaScript solution that uses the API to iterate over all revisions of a given article. This is a rather heavy process and can only be done on single pages during closer analysis. If it is going to be used on every page, another solution must be found.
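The revision iteration can be sketched as follows. This is a minimal Python outline of the same paging logic the JavaScript solution would use against the MediaWiki API (`action=query`, `prop=revisions`, with `continue`-based paging); the `fetch` callable is an assumption, injected so the paging can be shown without network access.

```python
def iter_revisions(fetch, title):
    """Yield revision dicts for `title`, following API continuation.

    `fetch(params)` performs the HTTP request and returns the parsed
    JSON response; it is injected here so the paging logic stands on
    its own.
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|size",
        "rvlimit": "max",
        "format": "json",
    }
    while True:
        data = fetch(params)
        for page in data["query"]["pages"].values():
            for rev in page.get("revisions", []):
                yield rev
        cont = data.get("continue")
        if not cont:
            break
        params = {**params, **cont}  # carry rvcontinue into the next request
```

Every revision is fetched, which is why the process is heavy: cost grows linearly with the page's revision count.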

During rendering, the results from the contribution analysis are requested from memcached; if they are found and not too old, they are reused on the rendered page. If they are not available, a delayed job is set up to do the analysis. After the analysis is done, the result is stored in memcached for later retrieval and the HTML cache is invalidated. This in turn initiates a new regeneration of the page on the next request.
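The render-time lookup and the completion hook can be sketched like this. The `cache`, `queue_job`, and `invalidate_html_cache` interfaces and the key scheme are assumptions for illustration, not an actual MediaWiki interface.

```python
import time

MAX_AGE = 3600  # seconds before a cached analysis counts as too old (assumed)

def analysis_for_render(cache, queue_job, page_id):
    """Return a fresh cached analysis, or queue one and return None."""
    entry = cache.get(f"contrib-analysis:{page_id}")
    if entry is not None and time.time() - entry["time"] < MAX_AGE:
        return entry["result"]          # fresh enough: reuse on the page
    queue_job(page_id)                  # schedule the delayed analysis
    return None                         # caller renders a placeholder

def on_analysis_done(cache, invalidate_html_cache, page_id, result):
    """Store the finished analysis and force page regeneration."""
    cache.set(f"contrib-analysis:{page_id}",
              {"time": time.time(), "result": result})
    invalidate_html_cache(page_id)      # next request regenerates the page
```

The design choice here is read-through caching with asynchronous fill: rendering never blocks on the analysis, at the price of serving one stale or placeholder view until the job completes.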

If there is no result available from a previous analysis, a placeholder is posted on the HTML page instead, and a delayed API request later pulls an update from the servers. While waiting for the analysis to run, a marker (the job id) is set in memcached so that no new job is initiated through the API until the first one has run.
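The duplicate-job guard can be sketched with memcached's `add` semantics (set only if the key is absent), which makes exactly one caller win the race. The key scheme and job-id format are hypothetical.

```python
def maybe_queue_analysis(cache, queue_job, page_id):
    """Queue an analysis job unless one is already pending for the page.

    `cache.add(key, value)` is assumed to behave like memcached's add:
    it stores the value and returns True only if the key did not exist.
    """
    marker_key = f"contrib-analysis-job:{page_id}"
    job_id = f"job-{page_id}"           # placeholder id scheme
    if cache.add(marker_key, job_id):   # only the first caller wins
        queue_job(job_id)
        return job_id
    return None                         # a job is already pending
```

The marker should also carry an expiry so a crashed job does not block re-queueing forever.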

Some contributors must be identified specifically: at least the five topmost contributors should be listed, and those should include the initial creator of the page and the major contributor. A more complete analysis of the contributors should be available on a separate page. That page should list every contributor that has been the main contributor to any revision kept in the trunk, in addition to the five topmost contributors. One possibility is to list those that were the main contributors at any given time.
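Composing the short list can be sketched as below: the creator and the major contributor are forced into the list, and the remaining slots are filled from the top of the ranking. The input format (`{user: contributed amount}`) is an assumption.

```python
def short_contributor_list(contributions, creator, limit=5):
    """Top `limit` contributors, always including creator and main contributor.

    `contributions` maps user name to contributed amount (assumed format).
    """
    ranked = sorted(contributions, key=contributions.get, reverse=True)
    main = ranked[0] if ranked else None
    picked = []
    for user in [creator, main] + ranked:
        if user is not None and user not in picked:
            picked.append(user)
        if len(picked) == limit:
            break
    return picked
```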

As a page grows in size, with a fixed resolution and a known size of the main contributor's share, it is possible to calculate a minimum allowed change before a more thorough analysis must be done. Depending on whether the user has made previous contributions the limit will be slightly different, but if it is simplified to a single limit, the error will be small most of the time. Even if a more thorough analysis is not initiated because the change is too small, results from a previous analysis might not be available, and the analysis must then be done anyhow.
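One way to express the threshold idea: with a fingerprint of `bins` bins over a page of `page_size` bytes, a change smaller than roughly one bin cannot visibly move the fingerprint, and a dominant main contributor adds further slack. The exact formula below is illustrative, not taken from the source.

```python
def min_change_for_reanalysis(page_size, bins, main_share):
    """Smallest change (in bytes) that warrants a new thorough analysis."""
    bin_size = page_size / bins          # one bin's worth of bytes
    # the further ahead the main contributor is, the larger a change
    # must be before the ordering can plausibly flip (assumed heuristic)
    slack = max(0.0, main_share - 0.5) * page_size
    return bin_size + slack

def needs_reanalysis(change_size, page_size, bins, main_share,
                     have_previous_result=True):
    if not have_previous_result:
        return True   # no cached result: analyse regardless of change size
    return change_size >= min_change_for_reanalysis(page_size, bins, main_share)
```

Note the last branch matches the caveat in the text: a small change still triggers analysis whenever no previous result is available.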

Configuration must be available for methods and properties, such as the number of bins in the fingerprint.
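An illustrative configuration fragment; all key names are assumptions, only the need for such settings (for example the fingerprint bins) comes from the text.

```python
# Hypothetical settings block for the contribution analysis
CONTRIB_ANALYSIS_CONFIG = {
    "fingerprint_bins": 100,      # resolution of the contribution fingerprint
    "cache_max_age": 3600,        # seconds before a cached result is stale
    "top_contributors": 5,        # how many topmost contributors to list
    "method": "token-ownership",  # hypothetical analysis-method selector
}
```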

See also