Project:Support desk


About this board

Welcome to MediaWiki.org's Support desk, where you can ask MediaWiki questions!

There are also other places to ask: Communication (IRC chat), mailing lists, Q&A, etc.

Before you post

Post a new question

  1. To help us answer your questions, please always indicate which versions you are using (reported by your wiki's Special:Version page):
    • MediaWiki
    • PHP
    • Database
  2. Please include the URL of your wiki unless you absolutely can't. It's often a lot easier for us to identify the source of the problem if we can look for ourselves.
  3. To start a new thread, click "Start a new topic".

Error creating thumbnail: Unable to save thumbnail to destination

Star Warden (talkcontribs)

Hey. I keep getting this error and I don't know how to fix it. It can be seen here: http://dragon-mania-legends-wiki.mobga.me/Ruins, here: http://dragon-mania-legends-wiki.mobga.me/Habitats and here: http://dragon-mania-legends-wiki.mobga.me/Category:Images:Calendar_Events

I've tried setting a temp folder ($wgTmpDirectory = "$IP/images/temp";) and a debug file ($wgDebugLogFile = "/var/log/mediawiki/debug-{$wgDBname}.log";) as suggested and even made the temp folder writable by the server using chown, but that did not work.

The manual on how to debug says that this log can contain sensitive information, so I won't post all of its contents. What parts of it could I share here that would help in solving this issue?

I should also mention that the console is displaying some weird errors, but sadly I cannot interpret them.

MarkAHershberger (talkcontribs)

Looking at your Special:NewFiles, it looks like you are able to upload images without problem, but those particular images cause a problem.

I can use thumb.php to create thumbnails of various sizes (see also), and it appears to work on some of the images that show the problem.

Meanwhile, thumb.php fails to produce some thumbnails for the problematic images.

Maybe you can see what is produced in the debug log when one of the failing thumb.php links is called?

Star Warden (talkcontribs)

I found no recent instance of the cupcake thumb (at least, no date is displayed when such calls are made, so it might be recent), so I also included something more recent. The cupcake thumb is at the very top; the others are separated by a large space: http://pastebin.com/Yg6V1Zr6 Hope it helps!

MarkAHershberger (talkcontribs)

I'm at a loss. I don't really know how to help you without access to your machine. Sorry. Perhaps you can check the permissions on all the directories in your images folder to see if one or two has funky permissions.
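As a concrete way to hunt for "funky" permissions, here is a minimal sketch, assuming the web server runs as www-data (which may differ on your system) and that you run it from the wiki's base directory:

```shell
# List anything under images/ that is NOT owned by the assumed
# web-server user; an empty result means ownership is consistent.
find images -not -user www-data -ls
```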

Ciencia Al Poder (talkcontribs)

What if you set $wgShowExceptionDetails to true temporarily and open the problematic thumb.php URL?
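For reference, a sketch of the relevant debugging settings at the bottom of LocalSettings.php (remember to turn them off again on a public wiki):

```php
// Show full exception details and database error backtraces
// while debugging; disable these again afterwards.
$wgShowExceptionDetails = true;
$wgShowDBErrorBacktrace = true;
```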

Star Warden (talkcontribs)

That variable was already set to true even before the problem occurred. But here's the strange thing. If you go and check my first two links (ruins and habitats), you'll see that the problem was solved on its own...

Here's a screenshot of the permissions that all folders seem to have: http://prnt.sc/em8bhd I selected all of them, and made sure they are all set to that. Do I need to change anything?

I forgot to mention that when renaming the file, the issue seems to disappear (of course, there's a reason I named them the way I did in the first place, so this isn't a good solution, or the solution for that matter). And in some cases, the issue is triggered only if I go over a certain pixel size while adding it to the article. Would manually moving the images (on the server) from one folder to another solve it? Is it even safe to do that?

MarkAHershberger (talkcontribs)

The permissions look fine if that user is the one the web server runs as. You could also change the permissions on the images directory to the numeric code 777, recursing through it.

You could move the images directory, but you would need to change a few configuration options like Manual:$wgUploadDirectory or Manual:$wgUploadPath. You might also need to use Manual:Img_auth.php to stream images.

Star Warden (talkcontribs)

I am not sure if that user is the one the web server runs as. I was using the root account. How do I find out who it is?

Also, the second option you gave me is the same as the first one. I was asking if I could only move the images that cause trouble, not the whole directory. Seems kinda pointless to move the whole thing just for a few troublemakers.

MarkAHershberger (talkcontribs)
  • I don't recognize the tool you are using for FTP. But what does the full listing of the MediaWiki directory show? Maybe that would have user names in it.
  • If you can physically remove the primary uploaded problematic file, then I think you could re-upload it. That might solve the problem.
Star Warden (talkcontribs)
  • I use FileZilla Client. I am not sure what you're referring to. Is this it: http://prnt.sc/emeu0n ?
  • By physically removing it, you mean to manually delete it from the server itself or from the wiki? Because if it's the latter, it doesn't work.
MarkAHershberger (talkcontribs)
  • That has the information I wanted. It looks like you're on a Debian server and the permissions are set correctly.
  • I meant from the server itself.
  • Could you use FileZilla to get the listing of the images/thumb/9/90 and images/thumb/9/90/The_Winds_of_November_Event_%2816.11.14%29.jpg directory?
Star Warden (talkcontribs)

I managed to get there, and the folder has the image in 3 different sizes: 200px, 300px and 400px. Interestingly enough, this folder seems to have a different owner than the typical one: http://prnt.sc/emrhbx I checked other problematic files, thinking the owner might be at fault, but they had the typical owner, so I guess the different-owner theory can (for now, at least) be ruled out.

Should I delete the image from both its permanent place and the thumb folder?

MarkAHershberger (talkcontribs)

The root owner is definitely going to cause a problem. Delete that directory and its contents.
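A sketch of that cleanup, using the thumbnail path from this thread; MediaWiki regenerates thumbnails on the next page view, so deleting them is safe, and fixing ownership in place is an alternative (www-data is assumed to be the web-server user):

```shell
# Remove the root-owned thumbnail directory; MediaWiki will
# recreate the thumbnails with the web server's ownership.
rm -rf images/thumb/9/90

# Alternative: keep the thumbnails but hand them to the
# web-server user instead of deleting them.
# chown -R www-data:www-data images/thumb/9/90
```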

Star Warden (talkcontribs)

That solved the thumbnail problem, but only for that image.

Star Warden (talkcontribs)

Ehm, I was thinking of something. Sometimes some special pages (usually uncategorised files) won't update no matter what, even though updateSpecialPages.php runs daily on a cron job. So I force it with refreshLinks.php, which in turn creates a ton of jobs that are run by runJobs.php later on. From time to time, some jobs just get stuck and won't run. I posted about it 2 months ago (https://www.mediawiki.org/wiki/Topic:Tjumyamw7pqzvjr1) but got no response. So I looked into the database, found the stuck jobs in the job table, and just manually deleted them (and have been doing that ever since). Could this be the problem behind it, or is it unrelated? I remember this particular issue happened on the 1.26.2 installation; then a failed server upgrade brought the wiki down for a little while, we moved servers, and that issue was fixed (this was last summer).

MarkAHershberger (talkcontribs)

It could be related.

You need to make sure that runJobs.php and refreshLinks.php are running as the www-data user and not the root user.
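For manual runs, one way to do that is sudo; this is a sketch assuming the web-server user is www-data and using a hypothetical wiki path:

```shell
# Run the maintenance scripts as the web-server user rather than
# root, so any files they create are not owned by root.
cd /path/to/wiki                                  # hypothetical path
sudo -u www-data php maintenance/runJobs.php
sudo -u www-data php maintenance/refreshLinks.php
```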

Star Warden (talkcontribs)

I am sorry, I am not good at this. How do I do that? I haven't used any other user besides my personal one and the root one.

MarkAHershberger (talkcontribs)

How do you normally get the jobs to run? How are they being run right now?

Star Warden (talkcontribs)

With cron. When I set it up, I opened PuTTY, logged in as root, entered the command crontab -e, then added this line: */120 * * * * cd (path to wiki) && php maintenance/runJobs.php and saved. But after using refreshLinks, I run it manually. It usually doubles the number of jobs initially, then starts to decrease them.

MarkAHershberger (talkcontribs)

If the only cron jobs you have are those that are running for the wiki, then run this sequence of commands as root:

# crontab -l > cron.txt         # to save the current crontab
# crontab -r                    # to delete the current crontab for root
# crontab -u www-data cron.txt  # to give the crontab entries to the www-data user.

If you have other crontab entries, skip the second step (which removes all the entries for root) and edit cron.txt so that only wiki maintenance scripts are left in the file before running the third command.

In any case, the first command will back up all your root cron commands if there is a mistake made.

Star Warden (talkcontribs)

Before proceeding, I used grep CRON /var/log/syslog to see the other jobs. There are three others that are not in my current crontab (when using crontab -e), but they are all run by root. Two seem to be more server-related: http://prnt.sc/emzniy

Is it safe to still do all the steps?

Also, I did the first step, but I got no confirmation of whether the procedure was successful. How do I check?

Star Warden (talkcontribs)

Ah, also, deleting images from the server and then re-uploading them doesn't get rid of the error (because it uploads them to the same place they were in before). BUT I think it might just be related to root being the owner. So far, all the images I checked in images/thumb that had the issue had root as the owner. It seems this issue occurs only with thumbnails.

MarkAHershberger (talkcontribs)

You can see if the first step worked by looking for cron.txt in the directory where you ran the command.

In fact, since you have other cron jobs running as root, skip the second step and just edit the cron.txt file to have the wiki-related jobs.

After that, erase those jobs from the root crontab using crontab -e and then run the third command to move the cronjobs to the www-data user.

Star Warden (talkcontribs)

I am not exactly sure what I am doing wrong. Technically, I ran the command in the base directory (that is, I didn't change the folder after opening PuTTY), but there is no such file there. Then I looked in /var/spool/cron/crontabs, which has the crontab that I can access through the crontab -e and crontab -l commands, but that's the only file in there (googling also led me to the same location). There's also another crontab in /etc which lists all the cron jobs. But no sign of cron.txt in any of those three spots.

MarkAHershberger (talkcontribs)

If you run "crontab -u root -l > cron.txt" as root, it should create the cron.txt file for you.

Star Warden (talkcontribs)

I did that; it still wasn't in any of those three locations, but I used find / -name and found it in /root. Inside it are the two jobs (runJobs.php and updateSpecialPages.php) and the default comment "# Edit this file to introduce tasks to be run by cron". Is this the right file?

MarkAHershberger (talkcontribs)

That is it. It makes sense that it is in /root since that is the root user's home directory. Probably typing ls after running the command would have shown it.

Star Warden (talkcontribs)

Awesome. So let me get this straight before doing anything. I need to edit that file to have the wiki-related jobs going. It already has two jobs (updateSpecialPages and runJobs). This means I need to add the other three remaining jobs from /etc/crontab?

These are:
17 * * * * root cd / && run-parts --report /etc/cron.hourly
25 6 * * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
47 6 * * 7 root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
52 6 1 * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )

MarkAHershberger (talkcontribs)

Just the first one. And "root" should not be there. You should probably paste your cron.txt before running the command to load it.

Star Warden (talkcontribs)

Is this correct: http://pastebin.com/2QGi7Wjf What do I do with the last 3 cronjobs and with the other two crontabs (from /var/spool and /etc)?

EDIT: There was another job that I missed which was added as custom, apparently:

# OJG CUSTOM JOBS ADDED FOR THE WIKI:
*/15 * * * *	root	cd /(path to wiki)/ && /usr/bin/php maintenance/rebuildFileCache.php >> /var/log/(log folder)/rebuildFileCache.log 2>&1
5 2 * * *     root    cd /(path to wiki)/ && /usr/bin/php maintenance/runJobs.php >> /var/log/(log folder)/runJobs.log 2>&1

It seems there is a second runJobs which is a tad bit different than the first runJobs. Doesn't really seem to be doing its job properly, though. Should I just remove it?

MarkAHershberger (talkcontribs)

First, do not run crontab -r. Use crontab -e to edit the root crontab files and remove only the wiki crontab entries. The rest should remain.

You should have a file, cron.txt, with the following based on the information you've given here:

# OJG CUSTOM JOBS ADDED FOR THE WIKI:
*/15 * * * * www-data  cd (path to wiki) && php maintenance/rebuildFileCache.php   >> (log folder)/rebuildFileCache.log 2>&1
*/15 * * * * www-data  cd (path to wiki) && php maintenance/runJobs.php            >> (log folder)/runJobs.log 2>&1
0    0 * * * www-data  cd (path to wiki) && php maintenance/updateSpecialPages.php >> (log folder)/updateSpecialPages.log 2>&1

Move the cron.txt file to /etc/cron.d/wiki-cron-jobs.

This will allow Ubuntu's cron system to run them. Anytime you need to change them, you can safely edit that file.

I think this will get you where you need to be. Sorry for the confusion.
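To avoid ambiguity: /etc/cron.d/wiki-cron-jobs should end up as a regular file, not a directory (cron ignores subdirectories of /etc/cron.d). A sketch of the move plus a quick check:

```shell
# /etc/cron.d entries are plain files read directly by cron.
mv cron.txt /etc/cron.d/wiki-cron-jobs

# Should list a single file owned by root.
ls -l /etc/cron.d/wiki-cron-jobs
```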

MarkAHershberger (talkcontribs)

And to reduce the repetition, you can use shell variables in that file, so you could do the following:

LOGDIR=/var/log/(log folder)
WIKIDIR=(path to wiki)

# OJG CUSTOM JOBS ADDED FOR THE WIKI:
*/15 * * * * www-data  cd $WIKIDIR && php maintenance/rebuildFileCache.php   >> $LOGDIR/rebuildFileCache.log 2>&1
*/15 * * * * www-data  cd $WIKIDIR && php maintenance/runJobs.php            >> $LOGDIR/runJobs.log 2>&1
0    0 * * * www-data  cd $WIKIDIR && php maintenance/updateSpecialPages.php >> $LOGDIR/updateSpecialPages.log 2>&1
Star Warden (talkcontribs)

I did that and moved the .txt to where you suggested (I had to manually create the wiki-cron-jobs folder). Here's the entire paste: http://pastebin.com/jY6sfM5f

Now, one question. What do I do with the remaining cronjobs that I wrote here (https://www.mediawiki.org/w/index.php?title=Topic:Tn0u0v07qa9cb9v8&topic_showPostId=tn9kgkwpt0b8dob9#flow-post-tn9kgkwpt0b8dob9)? They weren't in the crontab file that I could open with crontab -e while logged in on root. They are in the crontab that's found in /etc. This is its contents: http://pastebin.com/vDZtEWML

MarkAHershberger (talkcontribs)

I'm sorry, I should have been clearer. I wanted you to move cron.txt to a file called /etc/cron.d/wiki-cron-jobs. If you've created a directory, I'm not sure it will work.

Remove these lines from the file you found under /etc:

# OJG CUSTOM JOBS ADDED FOR THE WIKI:
*/15 * * * *    root    cd /(wiki path) && /usr/bin/php maintenance/rebuildFileCache.php >> /var/log/wiki_logs/rebuildFileCache.log 2>&1
5 2 * * *     root    cd /(wiki path) && /usr/bin/php maintenance/runJobs.php >> /var/log/wiki_logs/runJobs.log 2>&1

The others should be left as-is.

Star Warden (talkcontribs)

Well, there was no such file there. cron.txt currently rests in the /etc/cron.d/wiki-cron-jobs folder. I used grep CRON /var/log/syslog to see the recent jobs, but no jobs from cron.txt seem to have been executed... what should I do?

MarkAHershberger (talkcontribs)

Run the following commands:

$ mv /etc/cron.d/wiki-cron-jobs/cron.txt /etc/cron.d/wiki-jobs
$ rmdir /etc/cron.d/wiki-cron-jobs

Both should run without problem and you should see cron jobs executing shortly after that.

Star Warden (talkcontribs)

Hey, thanks! But there seems to be a problem: the jobs aren't being run. Using grep CRON /var/log/syslog, it does show me that runJobs.php has been executed, but the number does not go down unless I manually run it. What might be the cause?

Star Warden (talkcontribs)

Actually, it seems none of the maintenance scripts are working, not only runJobs.php.

MarkAHershberger (talkcontribs)

Could you run the following and tell me the output:

$ grep wiki-jobs /var/log/syslog
Star Warden (talkcontribs)

This is what I get: http://prnt.sc/eo1iyo

MarkAHershberger (talkcontribs)

So, it is loading properly. I was worried that I had given you a bad one or that it had been corrupted in some way. For example, I created a file with the wrong owner and got the following:

$ grep test /var/log/syslog
Mar 24 12:17:01 web cron[369]: (*system*test) WRONG FILE OWNER (/etc/cron.d/test)
$

I have set up a wiki with a file similar to what I asked you to set up now and when I look for www-data in my syslog, I see the following:

$ grep www-data /var/log/syslog
...
Mar 24 12:30:02 web CRON[25865]: (www-data) CMD ( cd $WIKIDIR && php maintenance/rebuildFileCache.php   >> $LOGDIR/rebuildFileCache.log 2>&1)
Mar 24 12:30:02 web CRON[25867]: (www-data) CMD ( cd $WIKIDIR && php maintenance/runJobs.php            >> $LOGDIR/runJobs.log 2>&1)
...

Do you see anything similar?

Star Warden (talkcontribs)

It seems I do: http://prnt.sc/eo7t6b

Star Warden (talkcontribs)

If the command seems to be executed correctly, that means WIKIDIR and LOGDIR are set correctly, right? I am not sure if it's safe to post them here (that's why I always censor them), but I wrote them as something like:

LOGDIR=/var/log/foldername/
WIKIDIR=/mainfolder/foldername/

I didn't add /maintenance/ at the end of WIKIDIR since the command already refers to maintenance/runJobs.php. Plus, the command is basically the same as how I had it in the root crontab (minus appending output to the log, and with the actual wiki path written instead of WIKIDIR).

I take it that the above steps are correct?

MarkAHershberger (talkcontribs)

Yep, it looks like all the steps are correct. Check your log files to see what is being put in them.

Star Warden (talkcontribs)

From the outside, looking at 'last modified', it seems that rebuildFileCache.log was last modified on the 23rd and runJobs.log on the 18th. Those are the only two logs in there. The owner/group is root root. The first log is huge; I copy-pasted the last pages (132 in Word) and replaced the path to the wiki with 'wikidir': http://pastebin.com/quGJUa50

As for runJobs = http://pastebin.com/rWejzRu7

MarkAHershberger (talkcontribs)

So, for rebuildFileCache, it looks like it is failing. As the message says, "Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information." so we can see what is going on.

runJobs looks fine.

MarkAHershberger (talkcontribs)

Oh, and make both log files owned by www-data so that the cron jobs will be able to update them. You should also change the ownership of the log directory.

Star Warden (talkcontribs)

Both log files and the folder itself are owned by www-data now. I am not sure how runJobs looks fine, since I can see that the jobs aren't being run.

Maybe the wiki-jobs file itself has to be owned by www-data? Most of the folders on the server are owned by root.

Off-topic: most of the wiki folder is owned by www-data, but some folders are owned by root. Should I pass ownership to www-data?

And this is what rebuildcache log shows now: http://pastebin.com/SPeztVnt

MarkAHershberger (talkcontribs)

You still haven't set the variables to show exceptions.

Star Warden (talkcontribs)

I actually did. This is what the bottom of LocalSettings.php looks like: http://pastebin.com/Sej3nm9g

Also, it's owned by root.

MarkAHershberger (talkcontribs)

Interesting... it doesn't seem to have made a difference for your rebuildFileCache log.

Star Warden (talkcontribs)

Maybe I should give those files different permissions? Their current permissions are -rw-r--r--

Also, what should I do to get the runJobs script running? Or all of the scripts, for that matter. None seems to be working.

MarkAHershberger (talkcontribs)

Here are the permissions on my file:

$ ls -l /etc/cron.d/test 
-rw-r--r-- 1 root root 442 Mar 24 12:15 /etc/cron.d/test
$

The content is what I gave above.

My log files show that the cron jobs are running.

$ zgrep maintenance /var/log/syslog.1 | tail -53 | head -3
Mar 26 00:00:01 web CRON[20974]: (www-data) CMD ( cd $WIKIDIR && php maintenance/rebuildFileCache.php   >> $LOGDIR/rebuildFileCache.log 2>&1)
Mar 26 00:00:01 web CRON[20977]: (www-data) CMD ( cd $WIKIDIR && php maintenance/runJobs.php            >> $LOGDIR/runJobs.log 2>&1)
Mar 26 00:00:01 web CRON[20979]: (www-data) CMD ( cd $WIKIDIR && php maintenance/updateSpecialPages.php >> $LOGDIR/updateSpecialPages.log 2>&1)
$

I'm not sure how I can help you in this forum.

Star Warden (talkcontribs)

So I fixed it. All I did was replace WIKIDIR and LOGDIR with their respective folders. Strange fix.

Star Warden (talkcontribs)

Now, all that remains is the stuck jobs. Do I manually remove them from the database or what should I do? So far, that's the only solution I found to get rid of them.

MarkAHershberger (talkcontribs)

If the jobs are running and the stuck ones are not clearing, then, yeah, I think that is what you have to do.

Hrm... I am thinking I should have put export in the cron file. *sigh*...

Star Warden (talkcontribs)

Then I guess this settles it, finally.

Thank you very much for all the help and the patience! Much appreciated.

Off-topic: is there a MediaWiki-recommended list of permissions and owners/groups that folders on a server should have?

MarkAHershberger (talkcontribs)

We can certainly start creating such a list. Seems like something that should be in an FAQ.

Star Warden (talkcontribs)

That would be great. Thanks for now! Your help is really appreciated. Going to mark this as resolved now.

129.236.228.255 (talkcontribs)

Hi, I'm really new to using APIs and I want to make sure I don't accidentally overload the system with requests and get suspended. The program I am creating will most likely be making a lot of requests; how can I prevent overloading the system? If I make the requests in sequence and not all at once, is this enough? Thank you so much for the help!

MarkAHershberger (talkcontribs)

There are a TON of API users. While I won't say there are no rules about respectful use -- a user agent with your info is a good start -- I would say that at this point you shouldn't worry about overloading the system. Make sure your API user has a connection to you (user-agent info and the account used) and you should be good for now. Parallel requests are (probably) fine. You can learn more as you go.
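As a sketch of a polite request (the User-Agent string is a placeholder you would replace with your own contact info; maxlag is a documented MediaWiki API parameter that makes the servers refuse work when replication lag is high):

```shell
# Identify yourself in the User-Agent and ask the API to back off
# when the database replicas are more than 5 seconds behind.
curl -s \
  -H 'User-Agent: MyTool/1.0 (https://example.org; me@example.com)' \
  'https://www.mediawiki.org/w/api.php?action=query&meta=siteinfo&format=json&maxlag=5'
```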

Reply to "Using the API respectfully"

What is recommended way to maintain version wise document in Mediawiki?

182.69.186.19 (talkcontribs)

We have version-wise product documentation, and a new version of the document is released with every software version. We want to know the best way to do this in MediaWiki; as things stand, with every release we would have to create a copy of the database and war folder, which will eventually result in a huge database.

Thanks in advance.

83.135.225.118 (talkcontribs)

MediaWiki only stores references to uploaded files inside the database. E.g. a page inside the Image: namespace will be created on the first upload of a new image, and there will also be a row in the image table. These database records do not really have a notable size; creating an article with a long text will increase database size more.

The files themselves will be saved inside the file system of the server, most likely inside the images/ folder. You can upload new versions "over" the old version of a file. Or you can upload them as "new" files - under a new name.

In any case, both the new and the old file will stay inside the file system. What you could do is use the eraseArchivedFile.php maintenance script on archived files. Archiving alone does not actually remove a file from the file system, but this maintenance script will permanently remove the old version of an archived file, at which point the file is removed from the file system. This process is not recoverable, so the size of the installation will go down, but the old version of the file will be gone.

182.69.186.19 (talkcontribs)

Thanks for reply!

The way we want to work with versions is:

If the URL is v6/somepage, then it goes to the documentation of software V6.

If the URL is v9/somepage, then it goes to the documentation of software V7.

MarkAHershberger (talkcontribs)

I heard a discussion at EMWCon Spring 2017 about how a company was using PonyDocs (an extension to MW with their customizations) to maintain references to different versions. Perhaps they could help you with what you want?

(They mentioned they wanted to make their changes available, but needed a use case. You seem to have one.)

Reply to "What is recommended way to maintain version wise document in Mediawiki?"

How to convert many structures to PNG from a sdf file?

Summary by MarkAHershberger

This is the MediaWiki Support Desk and your question doesn't look like it is related to MediaWiki.

42.109.136.143 (talkcontribs)

Hello all,

I wish to know how to convert an .sdf file which has 40,000 structures to individual PNG files using the Open Babel GUI.

When I tried converting, I got 40,000 PNG images which show "invalid image" written in red on a black background.

Open Babel goes into a not-responding state once I click convert, and it won't recover even after the conversion is finished.

Thanks in advance

MarkAHershberger (talkcontribs)

This is the MediaWiki Support Desk and your question doesn't look like it is related to MediaWiki.

Cannot download VisualEditor for 1.23, 404 not found

193.105.201.11 (talkcontribs)

Cannot download VisualEditor for 1.23, 404 not found

MarkAHershberger (talkcontribs)

Reported: https://phabricator.wikimedia.org/T161689

Reply to "Cannot download VisualEditor for 1.23, 404 not found"
143.116.116.91 (talkcontribs)

Hi,

I can't get any images to load. When I try, I only get a spinning wheel and then the whole system freezes. Thanks.

MarkAHershberger (talkcontribs)

That does not sound like a problem with MediaWiki. If you think it is, then can you please explain why you think MediaWiki is the culprit?

Reply to "Images"
Poppisue (talkcontribs)

When I try to embed an image and click upload, nothing happens. This is while writing in MediaWiki Talk, using the Chrome browser on Windows 10.

MarkAHershberger (talkcontribs)

It isn't clear what you are trying to do, or where you are trying to do it.

Could you provide more details of what you're doing and point to the wiki if it is public?

Reply to "Image Embed"
74.58.149.224 (talkcontribs)

Hi, I got the musician infobox to display correctly after exporting it from Wikipedia and importing it into my installation. Unfortunately, I just realized it's in English, so it's totally useless for my project. Fortunately, there's a French version of all the templates I need on fr.wikipedia!

https://fr.wikipedia.org/wiki/Mod%C3%A8le:Infobox_Musique_(artiste)

I've imported it to my site and it doesn't show up correctly.

Link to my site for example: http://archives.musixplore.ca/index.php?title=Champignons

I've tried the given example on the Infobox Musique (artiste) page and also a customized one from another band page, neither of which works.

I've tried importing a bunch of templates and it's still not working, and I'm out of ideas.

Your help is greatly appreciated!

MarkAHershberger (talkcontribs)

It looks like everything is correct, but the layout is wrong. If that is the case, you should probably copy what you need (or all of) https://fr.wikipedia.org/wiki/MediaWiki:Common.css

Reply to "Infobox display issue"

Unable to connect to gerrit.wikimedia.org with ssh

Goldengide1 (talkcontribs)

I had been able to connect to gerrit.wikimedia.org in the last two weeks, but early last week I had to reinstall my WampServer, so I tried to connect again based on the instructions on Gerrit/Tutorial. This time it did not connect; it kept bringing up Permission denied (publickey). Please, someone tell me what I am not doing right.

OS: Windows 10

Goldengide1 (talkcontribs)

This is my debug message:

$ ssh -p 29418 myusername@gerrit.wikimedia.org -v
OpenSSH_7.3p1, OpenSSL 1.0.2j  26 Sep 2016
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Connecting to gerrit.wikimedia.org [208.80.154.85] port 29418.
debug1: Connection established.
debug1: identity file /c/Users/Gideon/.ssh/id_rsa type 1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_rsa-cert type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_dsa type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_dsa-cert type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_ecdsa type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_ecdsa-cert type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_ed25519 type -1
debug1: key_load_public: No such file or directory
debug1: identity file /c/Users/Gideon/.ssh/id_ed25519-cert type -1
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_7.3
debug1: Remote protocol version 2.0, remote software version GerritCodeReview_2.13.4-13-gc0c5cc4742 (SSHD-CORE-1.2.0)
debug1: no match: GerritCodeReview_2.13.4-13-gc0c5cc4742 (SSHD-CORE-1.2.0)
debug1: Authenticating to gerrit.wikimedia.org:29418 as 'Goldengide1'
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: algorithm: ecdh-sha2-nistp256
debug1: kex: host key algorithm: ssh-rsa
debug1: kex: server->client cipher: aes128-ctr MAC: hmac-sha2-256 compression: none
debug1: kex: client->server cipher: aes128-ctr MAC: hmac-sha2-256 compression: none
debug1: sending SSH2_MSG_KEX_ECDH_INIT
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ssh-rsa SHA256:j7HQoQ6fIuEgDHjONjI2CZ+2Iwxqgo2Ur5LbPqBgxOU
debug1: Host '[gerrit.wikimedia.org]:29418' is known and matches the RSA host key.
debug1: Found key in /c/Users/Gideon/.ssh/known_hosts:1
debug1: rekey after 4294967296 blocks
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: rekey after 4294967296 blocks
debug1: SSH2_MSG_NEWKEYS received
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey
debug1: Next authentication method: publickey
debug1: Offering RSA public key: c:/wamp/www/mediawiki/.ssh/id_rsa
debug1: Authentications that can continue: publickey
debug1: Offering RSA public key: /c/Users/Gideon/.ssh/id_rsa
debug1: Authentications that can continue: publickey
debug1: Trying private key: /c/Users/Gideon/.ssh/id_dsa
debug1: Trying private key: /c/Users/Gideon/.ssh/id_ecdsa
debug1: Trying private key: /c/Users/Gideon/.ssh/id_ed25519
debug1: No more authentication methods to try.
Permission denied (publickey).

MarkAHershberger (talkcontribs)

It looks like you are using a different key than the one you provided before. I think you need to go to https://gerrit.wikimedia.org/r/#/settings/ssh-keys and upload your key.
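A sketch of checking which public key to paste there (the paths are taken from the debug output above; the ssh-keygen flags are standard OpenSSH options):

```shell
# Print the public half of the key ssh actually offered, then paste
# its contents into the gerrit SSH keys settings page.
cat ~/.ssh/id_rsa.pub

# Or, if you would rather start fresh, generate a new key pair
# and upload the resulting .pub file instead:
# ssh-keygen -t ed25519 -C "you@example.com"
```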

Reply to "Unable to connect to gerrit.wikimedia.org with ssh"

Implement interactive customer evaluation on a wiki page

Ufoldager (talkcontribs)

Hello,

I am interested in knowing whether it is possible to implement an interactive customer evaluation on a wiki page.

Like a normal wiki page with different form-field elements that users can fill out to assess themselves and their performance in different aspects?

Then they click a submit button, and the answers get sent to a specific administrator/email account?

Thank you for your help!

Ciencia Al Poder (talkcontribs)

Possible? Yes, if you know PHP and program it yourself, or if you hire a developer. MediaWiki doesn't provide that functionality, and there's no extension that I'm aware of that provides something similar.

Keep in mind that MediaWiki is software for documentation. What you want is probably provided by CRM software.

Ufoldager (talkcontribs)

Yes, this is what I expected. I know the intention behind MediaWiki as a platform purely for documentation purposes.

I was hoping, however, that there might be some way to do this manually inside the wiki pages using special commands/functions/whatever (I am no programmer myself).

I guess the only other option is to simply create a wiki page, put a description inside it and add a link to a PDF file that the user can then fill out, save locally, and then email the document.

Do you have an idea for a more interactive solution perhaps? Just curious. :)

Thank you for your response!

Ciencia Al Poder (talkcontribs)

That's the only option, I guess...

Reply to "Implement interactive customer evaluation on a wiki page"