Topic on Help talk:Extension:Linter

Valerio Bozzolan (talkcontribs)
SSastry (WMF) (talkcontribs)

Can you explain the use case a bit more? We currently provide a list of pages filtered by namespace (via the UI and the API). What use case would be enabled by indexing them?

Valerio Bozzolan (talkcontribs)

It helps bots, e.g. finding pages that contain a given template and have a given lint error, or pages that match something in the source and have a given lint error, etc.
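
As an editorial sketch of that bot workflow (not from the thread): the documented action API modules list=search and list=linterrors each return page lists, and a bot could intersect them client-side. The endpoint, the example queries, and the stubbed response shapes below are assumptions.

```python
# Sketch: combine a CirrusSearch query with Linter results so a bot can
# find pages that both use a template and carry a given lint error.
# Only request parameters are built here; no network call is made.
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

def search_params(query, limit=500):
    """Parameters for list=search, e.g. query='hastemplate:Navboxes'."""
    return urlencode({
        "action": "query", "list": "search", "format": "json",
        "srsearch": query, "srlimit": limit, "srnamespace": 0,
    })

def lint_params(category, limit=500):
    """Parameters for list=linterrors, filtered to one lint category."""
    return urlencode({
        "action": "query", "list": "linterrors", "format": "json",
        "lntcategories": category, "lntlimit": limit, "lntnamespace": 0,
    })

def pages_with_both(search_result, lint_result):
    """Intersect the two (decoded JSON) results by page title."""
    searched = {hit["title"] for hit in search_result["query"]["search"]}
    linted = {err["title"] for err in lint_result["query"]["linterrors"]}
    return searched & linted
```

Until lint categories are indexed by the search engine itself, a bot has to page through both lists and intersect them on its own side, which is exactly the inefficiency under discussion.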

PerfektesChaos (talkcontribs)

It also helps human beings.

If I am interested in category:XYZ or in articles containing the phrase fooBar, I can edit and fix them, but I do not like all this other stuff.

Much appreciated feature.

Izno (talkcontribs)

AWB can load search results.

I don't know that it can presently load from the API or the UI.

Anomalocaris (talkcontribs)

Here is an example of the search that should be available. According to https://en.wikipedia.org/w/index.php?title=Barack_Obama&action=info, there is one "Misnested tags" lint error. (There should be, but is not, a direct way to get more information about this error with one click from that information page.) If the user goes to the list of all articles with misnested tags at https://en.wikipedia.org/wiki/Special:LintErrors/misnested-tag?namespace=0, it should be possible to search there for the article Barack Obama, so one can get more information about the misnested tag and fix it.

Smalyshev (WMF) (talkcontribs)

You can look at GeoData extension which does it.

197.218.84.238 (talkcontribs)

Hmm, does the search API allow the use of generators?

It would be much more future-proof to get a generic list of pages and then direct the search engine to look through those and only those. This would be useful almost everywhere: things like Special:WantedPages, and maybe pages with a specific page prop.

That would probably greatly reduce the need for hacky insource:// stuff. Anyway, allowing the search engine to access lints seems like a great idea.
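
On the generator question: any list module that supports it, including list=search, can run as a generator, feeding its result pages into another module in a single request (generator parameters take a "g" prefix). A small sketch; the prop module here is only an example:

```python
# generator=search: the search hits become the page set that prop=pageprops
# (or any other prop module) runs over, all in one API request.
from urllib.parse import urlencode

def generator_search_params(query, limit=50):
    return urlencode({
        "action": "query", "format": "json",
        "generator": "search",   # g-prefixed params configure the generator
        "gsrsearch": query,
        "gsrlimit": limit,
        "prop": "pageprops",     # example follow-up module
    })
```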

Separately though, the lint page could use its own filter on page titles.

Smalyshev (WMF) (talkcontribs)

Ah, another note - if you follow the route I outlined above, you'd probably need to run some scripts to update index mappings. Ping people on #wikimedia-discovery about how to do it if you need help.

197.218.84.238 (talkcontribs)

> direct way to get more information about this error

It wouldn't help much unless the list was ungrouped first. Right now it says stuff like "p-wrap-bug: 6". Clicking that could only take you to a random lint error.

Fortunately, it is already possible (and trivial) to do what you want using the linter API without even visiting the info page. There should be a userscript that does that already.
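
The per-page Linter API lookup mentioned above could be sketched like this. lnttitle and lntcategories are parameters of the list=linterrors module, though the exact response fields assumed here (title, category, location) should be checked against a live wiki:

```python
# Sketch: build a Linter API URL that lists lint errors for one page,
# optionally narrowed to one category. No network call is made here.
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

def lint_errors_url(title, category=None):
    params = {
        "action": "query", "list": "linterrors", "format": "json",
        "lnttitle": title, "lntlimit": "max",
    }
    if category:
        params["lntcategories"] = category
    return API + "?" + urlencode(params)

def summarize(response):
    """Pull title, category, and (assumed) byte offsets from a response."""
    return [(e["title"], e["category"], e.get("location"))
            for e in response["query"]["linterrors"]]

url = lint_errors_url("Barack Obama", "misnested-tag")
```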

Anomalocaris (talkcontribs)
197.218.84.238 (talkcontribs)

It is coming from:

{{Navboxes
|list1 =
{{US Presidents}}
{{United States presidential election, 2008}}
{{United States presidential election, 2012}}
{{Democratic Party (United States)}}
{{Nobel Peace Prize laureates}}
{{Time Persons of the Year}}
{{United States Senators from Illinois}}
{{Patriot Act}}
{{Grammy Award for Best Spoken Word Album 2000s}}
}}
197.218.84.238 (talkcontribs)

Or more specifically:

{{United States presidential election, 2012}}

Oh, and to get the exact spot I wrote a simple userscript. It is not particularly user-friendly; you'd have to paste it into the browser console.

Anomalocaris (talkcontribs)

Thanks, but I didn't ask where the error was. I asked, "Please show me how to find the misnested tag."

197.218.89.23 (talkcontribs)
SSastry (WMF) (talkcontribs)

Hello anonymous friend, :-) It might be helpful if you could convert that into a friendly gadget / userscript and share it so it is equally easy for others to use.

197.218.89.23 (talkcontribs)

Howdy. It seems like we've gone way off topic... (It is still a good idea to expose lint errors to the search engine).

Anyway, to make it really user-friendly one would need an HTML validator. That is something which probably no Wikimedia API currently provides, and its absence forces one to spend time scratching their head trying to figure out where the error might be. Testing HTML validity with an external tool is certainly possible, but that would be very unreliable.

Also, the lint API still doesn't support retrieving a revision, so one basically has to send the whole wikitext to the lint API, which makes it somewhat slower (although that is quite useful for testing out fixes). Maybe lint errors should be stored in page props, or exposed as a JavaScript variable when the editor loads.
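
The "send the whole wikitext" round trip could look like the sketch below, which assumes the RESTBase transform endpoint /transform/wikitext/to/lint and only builds the request without sending it:

```python
# Building (not sending) a request that POSTs raw wikitext to the lint
# transform endpoint; urlopen(req) would return JSON lint errors with
# offsets into the submitted text. The endpoint path is an assumption.
import urllib.request
from urllib.parse import quote, urlencode

def lint_request(domain, wikitext, title=None):
    path = "/api/rest_v1/transform/wikitext/to/lint"
    if title:  # an optional page title gives the parser page context
        path += "/" + quote(title, safe="")
    body = urlencode({"wikitext": wikitext}).encode()
    return urllib.request.Request(
        "https://" + domain + path, data=body, method="POST")

req = lint_request("en.wikipedia.org", "<b><i>misnested</b></i>")
```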

Arguably those improvements should be in the linter tool itself.