Extension talk:Cargo

== Where are errors defined for creating a new page with a property that is set to be unique? ==
I'm trying to create short IDs that are more user-friendly for use in a URL. However, the likelihood of a collision is more of a concern than with RFC-compliant UUIDs, so I need to handle that error, but I'm not sure how the error is expressed. How can I check, after creating a page with a potential collision, what the error array looks like?


 * I would say that the easiest and most foolproof way is to just check the value of that field and see if it's null or not - assuming it would only be null in the event of an error. Yaron Koren (talk) 02:29, 1 June 2018 (UTC)
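For example, such a check as a Cargo query might look like this (a sketch: `Pages` and `Short_id` are hypothetical table and field names, and it assumes `Short_id` is null only when the unique constraint rejected the store):

```wikitext
{{#cargo_query:
tables=Pages
|fields=_pageName
|where=Short_id IS NULL
|format=ul
}}
```

This would list the pages whose short ID failed to store, which could then be re-generated.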

== What would be the best way to keep track of the page's revision id such that I don't overwrite the page when my extension's local revision id is less than the page's current revision id. Magic words don't work. ==

I'd like to get the revision data when I do my initial  so I can reduce the overall number of api queries if possible. However, if it's not doable I can just make a revisions api request on page load.


 * Nice subject line. :) Is this a Cargo question? It sounds like a general MediaWiki question. Yaron Koren (talk) 16:40, 3 June 2018 (UTC)


 * Well, there is this  but I wasn't sure if that was the only approach to determining potential conflicts. Also, it says it's a date, but is it actually a datetime?


 * Yes, it's actually a datetime. Yaron Koren (talk) 18:15, 4 June 2018 (UTC)

== Updating Cargo table's columns causes "no database exists" error ==
I'm using Cargo with the PageForms extension. After making my forms I decided to add some fields. After modifying the template associated with a Cargo table and recreating the table from the template's page, I can see the new fields' columns in the Cargo table in PhpMyAdmin. However, when I try to create a new page using a form referencing the template tied to the updated Cargo table, I get an "Error: No database table exists named x" message. Here's the odd part: When I go into PhpMyAdmin and manually change the Cargo main_table's name to "cargo__x", the form works! However, when I try to access the table using the Special:CargoTables page on my wiki, I get this error: "This table is registered, but does not exist! (not defined by any template)." How can I fix this error?


 * Could the problem be with the #cargo_store call in the template? Is the "_table" value set to the same thing in #cargo_declare and #cargo_store? If that's not the issue - could there be something unusual in the table name, like non-Latin characters, or punctuation? Yaron Koren (talk) 20:27, 3 June 2018 (UTC)
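For reference, the two calls must name the same table (a sketch with a hypothetical table name):

```wikitext
{{#cargo_declare:_table=Books
|author=String
|year=Integer}}
...
{{#cargo_store:_table=Books
|author={{{author|}}}
|year={{{year|}}}}}
```

If the `_table` values differ, Cargo declares one table but tries to store into another that was never created, which produces a "no database table exists" error.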


 * Strangely enough, it went away and works fine now. It may have been a caching error.

== Why does HOLDS implement RIGHT OUTER JOIN? ==
I'm using the very helpful HOLDS command, but I'm confused about why it uses a RIGHT OUTER JOIN rather than a LEFT OUTER JOIN: when I join against an empty list of IDs, I would still expect the row from the first table to be returned.

I have a table Action_Items, and I'm joining it with Updates because an action item can have any number of updates. However, if I do HOLDS with an Action_Items.updates that is empty, the overall result is an empty array instead of the originally queried Action_Items information.

UPDATE: I changed it to a   by just editing   and got the result I expected.
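For anyone else hitting this, the difference in a nutshell (the table and column names below are illustrative, modeled on Cargo's helper-table layout for list fields):

```sql
-- LEFT OUTER JOIN: every Action_Items row survives; items with no
-- updates simply get NULLs in the update columns.
SELECT ai._pageName, u._value
FROM cargo__Action_Items ai
LEFT OUTER JOIN cargo__Action_Items__updates u ON u._rowID = ai._ID;

-- RIGHT OUTER JOIN: every row of the updates table survives instead,
-- so an item whose updates list is empty drops out of the result.
SELECT ai._pageName, u._value
FROM cargo__Action_Items ai
RIGHT OUTER JOIN cargo__Action_Items__updates u ON u._rowID = ai._ID;
```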

== _pageData (and other tables) not storing values for a page ==
Hello, we're running into an issue with this page in that no Cargo tables are being stored for it, even _pageData. Tried null editing and even deleting and recreating the page with no change. Meanwhile other pages are working fine. Any ideas on how to resolve? --pcj (talk) 17:38, 6 June 2018 (UTC)


 * Is it only that page for which the problem is happening? If I had to guess, I would guess that the massive amount of data in that page is causing the database connection to time out while the code tries to insert all the values. Or are there pages that are succeeding with even more data? Yaron Koren (talk) 01:32, 7 June 2018 (UTC)


 * Haven't found any other page with this problem (yet). We have a page where 793 rows are stored in a single table, which is much more than this particular page should create (besides page data: 1 row in items, 1 row in skill_gems, 40 rows in skill_levels, 4 in skill_stats_per_level, 1 row in item_purchase_costs, 1 row in item_sell_prices, 1 row in skill).
 * There are other pages with the same template that work as well.
 * On the other hand, it seems to stop somewhere in between, which would support the timeout theory: it creates various rows but doesn't prune the old ones, probably because it never really finishes the process (so I assume, anyway); see this for example.
 * Not sure if it is related, but I've also noticed that the intermediate tables for list fields suffer from the duplication/data-not-being-pruned problem that was fixed in Cargo a while ago. Perhaps that is somehow related? OmegaK2 (talk) 11:41, 7 June 2018 (UTC)


 * Alright. I didn't know about the problem with duplication in the helper/intermediate tables - that's very good to know. I don't think that's the cause of this issue, but who knows. What I would suggest is removing parts of that problem page until you can get it to save its data, to try to isolate what exactly is the cause of the error. I wouldn't be surprised if it's some character or other unusual value. Sorry to make you do that work, if you end up doing it - I can't think of a better approach. Yaron Koren (talk) 16:16, 7 June 2018 (UTC)


 * It seems I might have figured out the cause of this particular issue. There is a unique field in one of the tables - if I remove the data and then add it back in, it works. I'm guessing the underlying problem is that Cargo tries to insert a row whose unique value already exists on the same page before deleting the old row, gets some kind of error as a result, and stops all subsequent actions. I'm not sure why deleting/restoring the entire page didn't work, though (I assume it should have nuked all the entries in the tables for the page). OmegaK2 (talk) 22:34, 7 June 2018 (UTC)


 * Oh... that's a great find. It's not totally surprising: there was a fairly recent change in Cargo's data storage to make it one big DB transaction, and that's probably causing unrelated inserts to not get run when one insert runs into a "unique" problem. I'll have to look into that. Yaron Koren (talk) 23:21, 7 June 2018 (UTC)

== Issues querying fields with extended characters in the name? ==
We're running into some issues querying data from a table whose fields have extended characters in the column name. For this table, querying for "funções" or "jogabilidade_de_heróis.descrição" fails, while just "descrição" or "jogabilidade_de_heróis.complexidade" succeeds. The issue is perhaps more apparent when used on this page. Any thoughts on a resolution or workaround? --pcj (talk) 13:50, 7 June 2018 (UTC)

== Special:CargoExport JSON output has footer warning ==
Back with another weird error. I'm using Cargo's Special:CargoExport feature to get JSON output for a javascript widget. Ever since I deleted some tables via PhpMyAdmin, when I try to manually get the JSON output using variables in the URL, I get the JSON output, but then also this bit at the end:

It's annoying because I've been using $.getJSON to parse the output, and that doesn't work anymore. I know I could use a JavaScript workaround (i.e., convert the output to a string and strip away everything from the  onward), but I'd rather treat the cause, not the symptoms, to avoid other weirdness. Any ideas?


 * Is there any other error message, or is that it? Usually that "headers already sent" error comes after something else has been printed on the screen. Yaron Koren (talk) 23:30, 7 June 2018 (UTC)


 * No, the JSON prints perfectly right before it, in its usual [, { bracket format. It's just these annoying break tags and the error after that. Another update: the problem seems to be related to the number of Cargo fields included in the export. When I remove one of the fields, the error described above goes away and the JSON loads smoothly. When I try to add a field back, it breaks again.


 * Oh, that's very interesting. Is it one specific field, or does it happen if you remove any field? If it's the latter, how many fields are you querying? Yaron Koren (talk) 23:21, 8 June 2018 (UTC)


 * It seems to be 4-5 fields max, unless I include a field that contains a lot of text -- then the number of fields doesn't matter; I get the error regardless. The issue seems to be the length of the text inside the field, or possibly special characters inside it. Is there a way to increase this limit?


 * I'm guessing that this is somehow due to one or more specific values... is this on a public site? Yaron Koren (talk) 18:29, 10 June 2018 (UTC)


 * Unfortunately not -- it's an internal site for a team. I made a test Cargo table to try to reproduce the problem. With a string length of about 120 words (700 characters), I could only export one field at a time -- any more would produce the error. Could this be an export limitation within MediaWiki itself?
 * I went with the workaround, using $.get and then stripping away the Cargo errors. If anyone finds a proper solution in the future, I'd be grateful.
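For anyone in the same spot, the stripping workaround can be as small as this (a sketch; it assumes the valid JSON payload ends at the last `]` and everything after it is the warning text):

```javascript
// Strip trailing warning text appended after the JSON array, then parse.
// Assumes the payload is a JSON array ending at the last ']' in the response.
function stripTrailer(raw) {
  return JSON.parse(raw.slice(0, raw.lastIndexOf(']') + 1));
}

// With jQuery, fetch as plain text and parse manually instead of $.getJSON:
// $.get(exportUrl, function (txt) { var data = stripTrailer(txt); /* ... */ });
```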

== Parsing email addresses and dates ==
Hello all

I'm using Cargo tables to store users' email addresses and dates. I'd like to parse these to show the email addresses as links, and to change the format of the dates from YYYY/MM/DD. Is this possible?

This is my query showing email addresses:

and for the dates, it's simply this:

| Date

Any help or pointers would be much appreciated! --Melat0nin (talk) 12:52, 10 June 2018 (UTC)


 * For emails, how do you want to change their display? And for dates, are you talking about displaying them in a query, or just displaying the template parameter (as you have in your example)? Yaron Koren (talk) 18:03, 10 June 2018 (UTC)


 * Thanks for the reply. I managed to implement what I was looking for using the CONCAT and DATE_FORMAT SQL functions. Now I'm trying to implement a recursive LEFT OUTER JOIN... --Melat0nin (talk) 13:52, 11 June 2018 (UTC)
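For anyone finding this thread later, a sketch of that approach (the table and field names here are hypothetical; Cargo passes functions such as CONCAT and DATE_FORMAT through to the underlying database):

```wikitext
{{#cargo_query:
tables=Members
|fields=CONCAT('mailto:',Email)=Email,DATE_FORMAT(Join_date,'%e %M %Y')=Joined
|format=table
}}
```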

== PostgreSQL and replacement tables ==
When the checkbox to use a replacement table is enabled:

I think there may be concurrency errors, or attempts by Cargo to manually assign IDs instead of using sequences.

 Jun 12 06:31:02 db postgres[90423]: [7-1] ERROR: duplicate key value violates unique constraint "cargo_tables_template_id_key"
 Jun 12 06:31:02 db postgres[90423]: [7-2] DETAIL: Key (template_id)=(9908) already exists.

New __NEXT tables are created in the cargo database:
 public | cargo__ships__NEXT         | table | azurlane_wiki
 public | cargo__ships__NEXT___files | table | azurlane_wiki

But Cargo throws exceptions:

 [exception] [4ff8ae73bec7d8b9f2d2807b] /Special:CargoTables/ships?_replacement  MWException from line 196 of /var/www/azurlane.koumakan.jp/w/extensions/Cargo/includes/CargoUtils.php: Error: Table ships__NEXT not found.

The only way to fix this is to manually drop these __NEXT tables, as Cargo is unable to do anything with the table in question. Trying to delete the table and recreate the data without dropping the __NEXT tables obviously fails.
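The manual cleanup looks something like this in psql (a sketch using the table names from the listing above; note that mixed-case table names must be double-quoted in PostgreSQL):

```sql
DROP TABLE "cargo__ships__NEXT";
DROP TABLE "cargo__ships__NEXT___files";
```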

== Cargo Admin User Class? ==
Would it make sense for the extension to create one? It would grant  and any other rights that Cargo currently gives only to sysops. --RheingoldRiver (talk) 11:42, 13 June 2018 (UTC)


 * I don't think it makes sense for MediaWiki extensions to define their own user groups, except in rare cases, but you could do it yourself on your wiki (or have the admins do it) by just adding something like the following to LocalSettings.php:
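Something like the following (a sketch; "cargoadmin" is a hypothetical group name, and 'recreatecargodata' is one of the rights Cargo restricts to sysops by default):

```php
# Define a "cargoadmin" group and grant it Cargo's 'recreatecargodata'
# right; users can then be added to the group via Special:UserRights.
$wgGroupPermissions['cargoadmin']['recreatecargodata'] = true;
```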


 * That by itself is enough to define a new user group in MediaWiki. Yaron Koren (talk) 16:15, 13 June 2018 (UTC)

== unique field not working? ==
I was trying to recreate a table on Gamepedia with a unique param, and instead of recreating, it just got hung up on the spinning wheel indefinitely. I tried recreating on a sandbox wiki here, and the same thing happened - regardless of how small we set the size, it still won't create. You can check the revision history to see what we tried; basically, every iteration without a unique param recreated with no issue, but with unique it failed with an infinite loading circle. Do you know what might be going on? Thanks --RheingoldRiver (talk) 16:03, 13 June 2018 (UTC)