Extension talk:External Data

Invalid JSON on #get_web_data
I plan on doing some upgrades to MediaWiki and possibly PHP as a shot in the dark, but before doing that I figured I would ask here.

Here is the exact json of the table I'm trying to display. The data itself is junk just to play with.

[{"IPAddress":"192.168.000.001","Type":"Server","Device":null,"Location":null,"Comment":null,"HostnameDNS":null,"OSFirmware":null,"LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.002","Type":"Server","Device":"Virtual Server","Location":"ERA - Server Room","Comment":"File and Print Server","HostnameDNS":"fileserver","OSFirmware":"Server 2016","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.003","Type":"Server","Device":"Virtual Server","Location":"ERA - Server Room","Comment":"Video Server","HostnameDNS":"videoserver","OSFirmware":"Linux","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.004","Type":"Server","Device":"Physical Server","Location":"ERA - Server Room","Comment":"Web Server","HostnameDNS":"webserver","OSFirmware":"IBM OS\/400 7.2","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.005","Type":"Server","Device":"Physical Server","Location":"ERA - Server Room","Comment":"Mainframe","HostnameDNS":"mainframe","OSFirmware":"IBM OS\/400 7.2","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"}]

I'm getting this from a really generic json_encode( $data ); function in select_ipam.php, which queries a Microsoft SQL server database. I have already verified that my header information is set correctly as another user pointed out.

header("Content-Type: application/json; charset=utf-8");
echo( json_encode( $data ) );
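For anyone comparing setups, a minimal endpoint along these lines should produce valid JSON for #get_web_data (the connection details and table name below are hypothetical stand-ins, not the poster's actual select_ipam.php):

```php
<?php
// Hypothetical minimal JSON endpoint for #get_web_data.
// DSN, credentials and table name are placeholders.
$pdo  = new PDO('sqlsrv:Server=localhost;Database=ipam', 'user', 'pass');
$rows = $pdo->query('SELECT * FROM IPAddresses')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json; charset=utf-8');
// JSON_UNESCAPED_SLASHES avoids the "1\/7\/2021" escaping seen above;
// both forms are valid JSON, so this is purely cosmetic.
echo json_encode($rows, JSON_UNESCAPED_SLASHES);
```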

I also tried using the "use jsonpath" approach but it just gives me the same thing.


 * I'm not sure what the exact problem you're seeing is, but I would recommend switching to the latest version of External Data - the version you're running is rather old. Yaron Koren (talk) 00:46, 8 January 2021 (UTC)


 * I was able to get it working with some different syntax not using a template, plus a version update. Now if I can just figure out how to get it to update more frequently. When I click edit and save the page without making any other changes, it updates the table, but otherwise it doesn't want to grab new data from the PHP endpoint. Are there any plans to integrate an "ajax mode" that would allow the table data to be refreshed, say, every 10 seconds without a full page reload?


 * Okay, that's good. No... that sounds like it would require a whole different approach - JavaScript-based instead of PHP-based. Yaron Koren (talk) 20:18, 8 January 2021 (UTC)

get_web_data error after updating mediawiki
Updating to MediaWiki 1.35.1 and using the latest External Data results in my template erroring with:

Could not get URL http://localhost/api.php?action=query&list=categorymembers&cmtitle=Category:CHOPs&format=xml&cmlimit=500 after 3 tries.

Clicking the URL, though, returns the expected XML.

--Wuestenarchitekten (talk) 21:47, 13 January 2021 (UTC)


 * Silly question, but did this work before? Yaron Koren (talk) 14:00, 14 January 2021 (UTC)

#get_db_data with 'join on' statement returning blank table
Apologies in advance, I am somewhat new to SQL but I have some experience. I am attempting to use #get_db_data to pull data from a local server DB to display sports statistics. One of the tables has a field (Position) that I need for the wikitable, so I was hoping to join these two tables using the 'PlayerID' field. I can't do this server-side because it would add considerable time to processing and, again, I don't have enough experience.

Using the aliases throughout and executing the query returns no syntax issues or DB errors, but #for_external_table returns an empty table. I get the appropriate number of rows in the table, so I assume it's returning nulls or default values. Without the ability to debug there's no way to track down where in the process it's losing the values.

Here's the code I wrote:

Documentation is limited for the 'join on' statement for this extension, and there aren't even any reported issues from the past (I've checked the archives). Should I create a template and use #display_external_table instead? Any help would be appreciated.
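For reference, the 'join on' syntax documented for #get_db_data looks roughly like this (the database ID, table and field names are hypothetical stand-ins, since the original code was not preserved here):

```wikitext
{{#get_db_data: db=stats
 |from=Players,SeasonStats
 |join on=Players.PlayerID=SeasonStats.PlayerID
 |data=name=Players.Name, position=Players.Position, goals=SeasonStats.Goals}}
{{#for_external_table: {{{name}}} - {{{position}}} - {{{goals}}}<br/> }}
```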

2603:8001:4948:DF00:7D69:EB88:902:704 09:50, 26 January 2021 (UTC)gplehner


 * I don't know. My guess is that the problem is in the "data" parameter, somehow - that External Data is failing to find the external variables listed. I would try taking out those table aliases, i.e. keeping the original table names... there's a chance that that will fix the problem. Yaron Koren (talk) 15:11, 26 January 2021 (UTC)


 * Thanks, I had time to come back to this today. Removing the table aliases from the "data" parameters returned Error 1052 (ambiguous column). I incrementally removed each alias until I narrowed the issue down to two parameters, team and playerID. Both tables have fields named 'Team' and 'PlayerID'. It doesn't seem to matter which one I request in the "data" parameter; both print out the default/'else' values I set. The 'data' parameter certainly does understand the alias names.


 * I should also mention that I switched to using '#display_external_table' in an attempt to pass data parameters that way but it didn't work either. Thanks for your response & suggestions. 45.51.157.21 08:21, 28 January 2021 (UTC) -gplehner


 * If you have database server access you should create a VIEW on the database and then do the select on that view.
 * In the view you can join whatever table you want without touching the tables.
 * See: https://www.w3schools.com/sql/sql_view.asp
 * In your case you create a view called PlayerSeasonStats. --Felipe (talk) 13:14, 28 January 2021 (UTC)
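Felipe's view-based approach might be sketched like this (column names are hypothetical):

```sql
-- Join the two tables once, server-side; #get_db_data can then
-- select from the view as if it were a plain table.
CREATE VIEW PlayerSeasonStats AS
SELECT p.PlayerID, p.Name, p.Position, s.Team, s.Goals
FROM Players p
JOIN SeasonStats s ON s.PlayerID = p.PlayerID;
```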


 * I meant removing table aliases from the entire query, by the way, not just from "data". Yaron Koren (talk) 16:26, 28 January 2021 (UTC)

Alex Mashin (talk) 06:30, 3 August 2021 (UTC)
 * I'm seeing the same issue in the current version. "data" doesn't seem to be able to access fields when they are prefixed with their table alias. This behavior seems to have been introduced with REL1_31.
 * Can you reproduce undesired behaviour on this database?

function to check whether any line of external data matches a string
My use case for External Data is reading a single CSV file and then displaying different tables for lines that contain video/PDF/image etc., indicated by a type column. While I can use {{#ifeq:{{#external_value:type}} to check that the first line of data (the header) has been read, I can't choose to display the video table only when videos are available. How hard would it be to add a function that checks all entries in a particular column against a search string, returning true if any match? At the moment I have lots of tables with headers and no content. Vicarage (talk) 16:48, 7 March 2021 (UTC)


 * If I understand this question correctly, you may be able to accomplish this with the Variables extension - instead of displaying the table directly, set it in a variable, and then only display that variable (and its header) if it's non-empty. Yaron Koren (talk) 02:12, 8 March 2021 (UTC)
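If it helps, the approach Yaron describes could be sketched like this (template and variable names are hypothetical; requires the Variables extension):

```wikitext
{{#vardefine: videorows | {{#display_external_table: template=VideoRow | data=title=title,type=type}} }}
{{#if: {{#var:videorows}}
| <!-- table header markup here, then: --> {{#var:videorows}} <!-- table footer -->
| <!-- no video rows: output nothing -->
}}
```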

Thanks, I'll look at that. My initial workaround was to create a counts.csv file, with no header, and check against {{#ifexpr:{{#external_value:imagecount}} > 0, where imagecount=column 4 of the file. Vicarage (talk) 11:57, 8 March 2021 (UTC)

retrieving a template reference from external_data.
I'd like to be able to retrieve a json object that contains one key ('data') and a string that contains mediawiki formatted text. Is there a way to parse that retrieved string, so that it displays on the page as formatted data, rather than plain text?

Example: I'd like to retrieve  from an external source, and have it render the template

173.16.242.134 04:23, 8 March 2021 (UTC) Alex Mashin (talk) 14:54, 24 July 2021 (UTC)
 * Try using Lua bindings.

display_external_table should do nothing if its template function returns nothing
In http://wiki.johnbray.org.uk/Problem1 I read a 2 column file, and want to put items from the second column in a table ONLY when the first column has value "good". The template function has a switch so nothing is returned, but display_external_table adds a blank line to the last table box, which spoils the format. Using delimiter= just makes the table wider.

As this is a key use case for display_external_table, it would seem to make sense to just do nothing if there was nothing to display. Vicarage (talk) 09:52, 10 March 2021 (UTC)

Fatal exception of type "InvalidArgumentException" When trying to access mysql database.
Recently updated to MediaWiki 1.35 from 1.31, and updated the extension to the latest version for 1.35. Previously the get_db_data queries were working fine, but now they are throwing an Invalid Argument Exception.

When I run a debug log I get the following output, related to one of the exceptions.

[exception] [YGRqrMgm-gbpiKaSuVS3ewABaww] /index.php?title=Template:CementKiln&action=submit InvalidArgumentException from line 68 of /home1/wikiwast/public_html/includes/libs/rdbms/database/domain/DatabaseDomain.php: Prefix must be a string.

I have re-loaded the extension several times and tried a couple of different databases.
#0 /home1/wikiwast/public_html/includes/libs/rdbms/database/Database.php(291): Wikimedia\Rdbms\DatabaseDomain->__construct(string, NULL, NULL)
#1 /home1/wikiwast/public_html/includes/libs/rdbms/database/DatabaseMysqlBase.php(111): Wikimedia\Rdbms\Database->__construct(array)
#2 /home1/wikiwast/public_html/includes/libs/rdbms/database/Database.php(433): Wikimedia\Rdbms\DatabaseMysqlBase->__construct(array)
#3 /home1/wikiwast/public_html/extensions/ExternalData/includes/EDUtils.php(238): Wikimedia\Rdbms\Database::factory(string, array)
#4 /home1/wikiwast/public_html/extensions/ExternalData/includes/EDParserFunctions.php(377): EDUtils::getDBData(string, string, array, string, array, NULL, array)
#5 /home1/wikiwast/public_html/includes/parser/Parser.php(3340): EDParserFunctions::doGetDBData(Parser, string, string, string, string)
#6 /home1/wikiwast/public_html/includes/parser/Parser.php(3047): Parser->callParserFunction(PPFrame_Hash, string, array)
#7 /home1/wikiwast/public_html/includes/parser/PPFrame_Hash.php(253): Parser->braceSubstitution(array, PPFrame_Hash)
#8 /home1/wikiwast/public_html/includes/parser/Parser.php(2887): PPFrame_Hash->expand(PPNode_Hash_Tree, integer)
#9 /home1/wikiwast/public_html/includes/parser/Parser.php(1556): Parser->replaceVariables(string)
#10 /home1/wikiwast/public_html/includes/parser/Parser.php(651): Parser->internalParse(string)
#11 /home1/wikiwast/public_html/includes/content/WikitextContent.php(374): Parser->parse(string, Title, ParserOptions, boolean, boolean, NULL)
#12 /home1/wikiwast/public_html/includes/content/AbstractContent.php(590): WikitextContent->fillParserOutput(Title, NULL, ParserOptions, boolean, ParserOutput)
#13 /home1/wikiwast/public_html/includes/EditPage.php(4282): AbstractContent->getParserOutput(Title, NULL, ParserOptions)
#14 /home1/wikiwast/public_html/includes/EditPage.php(4187): EditPage->doPreviewParse(WikitextContent)
#15 /home1/wikiwast/public_html/includes/EditPage.php(2965): EditPage->getPreviewText()
#16 /home1/wikiwast/public_html/includes/EditPage.php(701): EditPage->showEditForm()
#17 /home1/wikiwast/public_html/includes/actions/EditAction.php(71): EditPage->edit()
#18 /home1/wikiwast/public_html/includes/actions/SubmitAction.php(38): EditAction->show()
#19 /home1/wikiwast/public_html/includes/MediaWiki.php(527): SubmitAction->show()
#20 /home1/wikiwast/public_html/includes/MediaWiki.php(313): MediaWiki->performAction(Article, Title)
#21 /home1/wikiwast/public_html/includes/MediaWiki.php(940): MediaWiki->performRequest()
#22 /home1/wikiwast/public_html/includes/MediaWiki.php(543): MediaWiki->main()
#23 /home1/wikiwast/public_html/index.php(53): MediaWiki->run()
#24 /home1/wikiwast/public_html/index.php(46): wfIndexMain()
#25 {main}

The LocalSettings.php entries are:

$edgDBServer['engy'] = "localhost";
$edgDBServerType['engy'] = "mysql";
$edgDBName['engy'] = "wikiwast_energy";
$edgDBUser['engy'] = "wikiwast_energy";
$edgDBPass['engy'] = " ";

The query I used is called from a template, where the value of EPR is passed to it through the template call. All the data values are correct and contained within the MySQL database.
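With that configuration, the template-side call would look something along these lines (table and field names are hypothetical, since the actual query was not included in the post):

```wikitext
{{#get_db_data: db=engy
 |from=kilns
 |where=EPR='{{{EPR|}}}'
 |data=name=KilnName, capacity=Capacity}}
```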

I am at a loss as to what the problem is, I hope that someone might see this and be able to help.


 * For External Data - and really any extension whose compatibility policy is "Master maintains backwards compatibility", which includes all of my extensions - you should not use a REL_ branch, but instead just use the latest code - either the truly latest code, or the latest released version. Using either one may fix this specific problem. Yaron Koren (talk) 13:12, 31 March 2021 (UTC)
 * Thanks a million, Yaron, that has fixed the issue. Will remember that in the future.

Usage in DE:WP
Hi! This extension seems not to be installed on any Wikipedia, not even on MediaWiki itself! I tried a use case on DE:WP, without success. Any possibility to get it enabled on a local Wikipedia? Regards, Uwe Martens (talk) 20:44, 25 April 2021 (UTC)


 * That's outside of my domain. But it would be great if this extension could be installed on a Wikimedia site! Yaron Koren (talk) 12:26, 26 April 2021 (UTC)


 * Thanks for your response! I'll discuss it with a responsible admin. I hoped one of them is active here, but anyway, best regards, Uwe Martens (talk) 13:07, 26 April 2021 (UTC)

No data with XML namespace
Hi Yaron, I bumped into something weird. It seems that External Data is not always able to fetch data when the XML file contains a URL for xmlns. I guess the reason could be, in some cases, that the URL is not accessible (you'd be surprised how often that happens) or in others, that the URL redirects to a new one. Lots of sites still use xmlns="http://www.w3.org/1999/xhtml", with that URL redirecting to the more secure https://www.w3.org/1999/xhtml/. I've created a snippet that you can try out for yourself:

   Jo Nesbø 

Cavila 11:13, 19 August 2021 (UTC)


 * Use  and add it to the XPath query:
 * res: Alex Mashin (talk) 11:43, 19 August 2021 (UTC)
 * Alternative:
 * res: Alex Mashin (talk) 11:52, 19 August 2021 (UTC)
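A generic workaround for default-namespace problems, independent of the answers above, is to match on local-name() so the XPath query ignores the xmlns entirely (URL and element name are hypothetical):

```wikitext
{{#get_web_data: url=https://example.org/record.xml
 |format=xml
 |use xpath
 |data=author=//*[local-name()='author']}}
```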


 * Thanks, that seems to work! And thanks also for the patch. Cavila 17:39, 19 August 2021 (UTC)

Extension Page
So, the great dismemberment of this page is afoot. But I don't think that merely moving its sections to subpages is enough. The structure of the documentation needs refactoring, whether it is stored in one page or split between several. Alexander Mashin (talk) 14:05, 30 September 2021 (UTC)

Alexander Mashin (talk) 16:35, 30 September 2021 (UTC)
 * What kind of things specifically do you recommend? Yaron Koren (talk) 16:17, 30 September 2021 (UTC)
 * I think, a more logical structure of the manual, regarding text formats, would be: fetching - preparsing - parsing - mapping - displaying data. At this point, everything about parsing text is told within the section about, although it also applies to and . The  is also unbalanced. The Lua part should be better integrated. Anyway, refactoring the manual will not be an easy task, due to the complexity and "non-linearity" of the subject, but I think it is necessary.


 * Oh, I forgot that #get_program_data also uses those same formats. Yes, it does seem to make sense to move all the information about the different formats into a new page, including related information like using XPath and so on. I don't know if that's what you refer to as preparsing or parsing, or both (I'm guessing both). Similarly, it probably makes sense to move the information about caching data into a new page - right now it's in the documentation for #get_web_data, for no strong reason. And it probably makes sense to create a page for mapping as well, i.e. the handling of the "data" parameter - I thought at first that this was too simple to justify a separate page, but I see now that quite a few features have been added to it. What do you mean by the #get_db_data part being unbalanced? Was it the incorrect header sizes? (I just fixed that.) I disagree with integrating Lua better - I think, at least for now, that the use of Lua is rare enough that it's fine to require anyone who wants to use Lua for External Data to learn about the parser functions first, even if they're not planning to use the parser functions. Yaron Koren (talk) 17:58, 30 September 2021 (UTC)


 * Well, I've now made a few changes to the documentation, based on this discussion; let me know what you think. Yaron Koren (talk) 21:49, 1 October 2021 (UTC)
 * The structure makes more sense now. Alexander Mashin (talk) 17:53, 2 October 2021 (UTC)

Call database function
Is it possible, or will it ever be implemented, to call a database function? I'm actually working on PostgreSQL. Alexander Mashin (talk) 12:11, 18 December 2021 (UTC) Alexander Mashin (talk) 15:11, 19 December 2021 (UTC)
 * The MediaWiki RDBMS library quotes all table names; the parentheses get inside double quotes and are treated by the database server as a part of the identifiers; so, currently, no.
 * If you update the extension now, you can define a prepared statement for your connection to the PostgreSQL database, as described here. It does not get quoted.

One missing XML element means no data is retrieved at all
I'm using something along these lines:

If the XML contains elements xxx, yyy and zzz, all works fine.

If the XML is missing anything, for example xxx, it'll say:
 * For the variables:  etc.

Is there any way to ensure that yyy and zzz are set even when xxx is missing from the XML source? gets rid of the first error message but has no effect on whether yyy and zzz are set. Thanks. Alexander Mashin (talk) 04:11, 15 August 2022 (UTC)
 * Try the solution described here.

Substitution of Output?
Hello! Is there a way to get the display parser functions to work with substitution? Normally, if I wanted to write the output of a parser function to an article, I would include  or something. When I do this with #display_external_table or #for_external_table there is no output. I figure I'm either doing it wrong or it's not possible on account of how things naturally operate. Thanks - Lbillett (talk) 13:00, 10 February 2022 (UTC) Alexander Mashin (talk) 04:21, 15 August 2022 (UTC)
 * The documentation page you referred to says:
 * Substitution is a separate process that is performed before expansion of any non-substituted templates, parser functions, variables or parameters.
 * Therefore, your substitution of  will be carried out before  is executed, that is, when there is no data yet. You can try to safe-substitute a template that includes both  and .
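A sketch of that idea (the wrapper template name is hypothetical): put both the fetch and the display into one template, then safe-substitute the wrapper, so both parser functions run together during the substitution pass.

```wikitext
<!-- Template:FetchAndShow contains both calls: -->
{{#get_web_data: url=... | format=csv with header | data=... }}
{{#for_external_table: ... }}

<!-- In the article, substitute the wrapper: -->
{{safesubst:FetchAndShow}}
```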

File path works fine, Directory not working
My issue seems exactly the same as this one mentioned in 2014. Using the file path works fine, but switching to the directory mode (for CSV files) fails completely with "Directory does not have file".

Screenshot: https://i.imgur.com/nRLRWxr.png

My page source:

LocalSettings.php:

$wgExternalDataSources['CSV']['path'] = 'CSV/';

(I've tried with and without the / after CSV)

weapons.csv, which is inside my CSV folder:

Name,Type,Ammunition,Capacity
"Revolver","Weapon","Revolver Round","6"
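For comparison, the directory-mode call for this setup would read roughly as follows (parameter names as in the Local files documentation; not verified against this exact version):

```wikitext
{{#get_file_data: directory=CSV
 |file name=weapons.csv
 |format=csv with header
 |data=name=Name, type=Type, ammo=Ammunition, capacity=Capacity}}
```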

Thoughts:


 * Could XAMPP be the culprit? Do I need some obscure PHP extension enabled to make this work? I had to enable "extension=intl" in my php.ini just to get Mediawiki installed. GregariousJB (talk) 03:39, 4 March 2022 (UTC)

Database configuration style changed from 2.x to 3.x
After much digging, I noticed that a revision of Extension:External Data/Databases removed the information about how to configure External Data 2.x. There's no mention of this in the revision history (edit summaries), or in the article body itself. It was a long shot that I found any mention of the old style at all, in this Git commit.

Was the removal of the old-style config from /Databases just accidental, or is External Data 2.x now non grata and you don't want to clutter up the documentation with any mention of it?

I'd be happy to restore the missing examples, something like

"Note: Version 2.x of External Data used a configuration style like $edgDBName['ID'] = '…' …"

unless there's a reason not to. I'm stuck at 2.x for now (for reasons), and there must be other users in the same predicament. --Ernstkm (talk) 21:05, 9 March 2022 (UTC)
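For the record, the two configuration styles side by side (source ID and credentials are hypothetical; the 3.x key names follow the current /Databases page and should be double-checked there):

```php
// External Data 2.x: one global array per setting, keyed by source ID.
$edgDBServer['mydb']     = 'localhost';
$edgDBServerType['mydb'] = 'mysql';
$edgDBName['mydb']       = 'wikidata';
$edgDBUser['mydb']       = 'wikiuser';
$edgDBPass['mydb']       = 'secret';

// External Data 3.x: one entry in $wgExternalDataSources per source.
$wgExternalDataSources['mydb'] = [
    'server'   => 'localhost',
    'type'     => 'mysql',
    'name'     => 'wikidata',
    'user'     => 'wikiuser',
    'password' => 'secret',
];
```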


 * Sorry for not responding before. It's tricky to try to handle different versions in the same documentation, especially as the documentation gets more complex. Yes, do feel free to add notes about the old ways of doing things - I think that would be helpful. Yaron Koren (talk) 15:56, 28 March 2022 (UTC)

data from mysql not get refreshed on a page
Hi,

Mediawiki 1.35.2

ExternalData 3.0 (70116a9)

Mariadb / mysql database source

After I modify the content in the database, it does not get refreshed on the wiki page. New content is presented ONLY after I re-save the page (without making any changes to it). Is this expected?

I would imagine the current content of the database would be shown on the page every time it is opened or refreshed.

2A01:110F:E4E:6A00:482A:A928:A8EF:E396 16:06, 21 March 2022 (UTC)


 * My experience with MediaWiki so far is that it's fairly aggressive about caching (that is, it is conservative about re-rendering pages if it doesn't have to). I assume this stems from its primary duty as a worldwide encyclopedia, where page rendering performance is a big concern. So what you're describing sounds like normal MediaWiki behavior to me. It will eventually update the document, just not when you were expecting.


 * As a workaround
 * if you are already using Extension:Cargo on your wiki, you should see a "Purge cache" option in the "More" menu/tab at the top
 * it may be called something else in your language
 * Semantic MediaWiki provides the same, but it's called "Refresh" in the menu
 * you can also add  to the end of the URL in the address bar and press Enter.
 * In any case, this "purge" action should re-render the view(s) of your data, updating the page for all visitors to your wiki. See Manual:Purge for more information. Hope that helps! —Ernstkm (talk) 17:44, 21 March 2022 (UTC)
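Concretely, the URL form of that purge action is (domain and page title hypothetical):

```
https://example.org/index.php?title=Some_Page&action=purge
```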

Getting data from oracle db
Hi,

My Setup:


 * MediaWiki 1.37.1
 * PHP 7.4.28 (fpm-fcgi)
 * MariaDB 10.3.34-MariaDB-0ubuntu0.20.04.1
 * External Data 2.4.1

I'm trying to get data from an Oracle database with the #get_db_data function. Now I'm very confused by the explanation in the configuration description.

It says to put all the values in one array (Extension:External Data/Databases), but if I do so, I get errors saying that values like $edgDBServer, $edgDBServerType and $edgDBName are missing. So I configured them individually, with $edgDBServerType['mydb'] = 'oracle', ... But now I always get the error

Error: Unknown database type oracle

My question now is whether the type 'oracle' is supported, and what the configuration must look like.

I have also installed the Oracle instantclient_21_5, which works with my Oracle DB.

Another question would be whether there are plans to provide MariaDB support.

Best regards Role-end (talk) 15:18, 24 March 2022 (UTC)


 * I'm pretty sure that querying Oracle no longer works with MediaWiki 1.34 and higher - sorry that the documentation is not clearer on this. You can connect to MariaDB, though - just use "mysql" as the DB type. Yaron Koren (talk) 19:44, 24 March 2022 (UTC)
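For reference, an External Data 2.x MariaDB connection would then be configured like any MySQL source (ID and credentials are hypothetical):

```php
// MariaDB speaks the MySQL wire protocol, so "mysql" is the correct type.
$edgDBServer['mydb']     = 'localhost';
$edgDBServerType['mydb'] = 'mysql';
$edgDBName['mydb']       = 'mydatabase';
$edgDBUser['mydb']       = 'wikiuser';
$edgDBPass['mydb']       = 'secret';
```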
 * Hi,
 * Thank you very much for your quick help.
 * I've tried with MariaDB and it works perfectly. This is even a better solution than Oracle, because we already have the MariaDB, so there is no extra effort.
 * Best Regards Role-end (talk) 08:07, 28 March 2022 (UTC)

syntax error, unexpected '?', expecting variable (T_VARIABLE) in ...bootstrap.php on line 29
When I enable the extension, trying to access any page gives me just this error:

Parse error: syntax error, unexpected '?', expecting variable (T_VARIABLE) in /var/lib/mediawiki/extensions/ExternalData/vendor/symfony/polyfill-php80/bootstrap.php on line 29

This is line 29 in that bootstrap.php file:

I'm running Mediawiki 1.31.10 on Debian 10 with Apache and PostgreSQL, and with PHP version 7.4.3. Albert25 (talk) 14:52, 1 April 2022 (UTC)


 * Are you sure you're running PHP 7.4.3? That "nullable types" feature seems to have been added to PHP in version 7.1. Yaron Koren (talk) 15:15, 1 April 2022 (UTC)
 * Right! Thank you! While PHP on the command line gave me version 7.4.3, Apache was actually still using 7.0! So the solution was:
 * Albert25 (talk) 15:30, 1 April 2022 (UTC)
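For anyone hitting the same mismatch on Debian with Apache and mod_php, the switch usually amounts to something like the following (module names depend on the installed packages; these commands are an assumption, not necessarily what Albert25 ran):

```shell
# Disable the stale PHP module, enable the current one, restart Apache.
sudo a2dismod php7.0
sudo a2enmod php7.4
sudo systemctl restart apache2
```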


 * Ah! That makes sense. Yaron Koren (talk) 15:36, 1 April 2022 (UTC)

#get_file_data : examples? Cannot get it to work.
I'm trying to use #get_file_data, but cannot make it work. Are there some working examples somewhere?

In "LocalSettings.php" I have:

wfLoadExtension( 'ExternalData' ); $wgExternalDataSources['mypath']['path'] = '/docs/www-wiki/';

In a page, I have

If I try, it is ignored; I just get it back as if it were normal text.

If I try, I get an error saying "no local variable "mycnt" has been set".

Before that, I had tried, in the hope of getting just the raw content of the file, but got an error saying "No "data" parameter specified".

I don't know what else to try. The documentation in https://www.mediawiki.org/wiki/Extension:External_Data/Local_files seems very incomplete for someone new to this extension. Albert25 (talk) 18:09, 1 April 2022 (UTC)


 * I finally got it working with a change in "LocalSettings.php":
 * And in the page, to display the full file content as text:
 * Or as CSV in a table:
 * Albert25 (talk) 09:23, 4 April 2022 (UTC)

No luck with start line
Thanks for adding the start line/end line feature! This would be really helpful for relatively bulky datasets (say, 50,000+ records) that cannot be loaded in their entirety. I tried it out with #get_web_data and csv with header, but while I can set end line to limit the number of results, any value for start line higher than 1 results in no data at all. Is there something I overlooked? Cavila 07:18, 21 April 2022 (UTC)
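For context, the kind of call being described is along these lines (URL and field names are hypothetical):

```wikitext
{{#get_web_data: url=https://example.org/big.csv
 |format=csv with header
 |start line=1001
 |end line=2000
 |data=id=ID, value=Value}}
```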


 * Cannot reproduce. Can you give an example? Alexander Mashin (talk) 03:42, 16 August 2022 (UTC)

Creating a page per entry in a DB table
I'm new to the External Data extension. It looks like it's basically what I want for a certain use case. I am wondering, though: As part of what I'm doing, I want a Wiki page per entry in a certain DB table. New entries can be added to the DB table at any time.

Am I correct in thinking that the best (perhaps only) way to do this would be externally (to External Data)? For example, having an external program that periodically runs, checks the DB for new entries, and creates new Wiki pages for any that are found? Or is there some way to do this from within External Data?

Thanks. Rwv37 (talk) 18:27, 18 July 2022 (UTC)


 * You might be able to do it with a combination of External Data and the #formredlink parser function from Page Forms, with the "create page" parameter set - in other words, have #display_external_table generate one big page of links, and then Page Forms would create new page(s) every time a user (or script) went to that big page and new links were there. How workable this would be, though, I'm not sure, especially if there were thousands or more entries in the DB table. A custom script may be the safer option. Yaron Koren (talk) 19:11, 18 July 2022 (UTC)
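A rough sketch of that combination (template, form, source and field names are all hypothetical): the row template wraps each title in #formredlink with "create page", so Page Forms creates any missing pages when the listing page is visited.

```wikitext
<!-- In Template:DBRow: -->
{{#formredlink: target={{{name|}}} | form=Entry | create page}}

<!-- On the listing page: -->
{{#display_external_table: db=mydb
 |from=entries
 |data=name=EntryName
 |template=DBRow}}
```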

Fix for @ in node names with JSONPath
I recently ran into an issue similar to https://github.com/json-path/JsonPath/issues/798. If you select a node beginning with an at-sign, External Data will not exit gracefully. From that bug report, it seems a fix is available in that it now lets you escape that character (Java only, I guess), although no commit is referenced. Do you think this is fixable? Cavila 11:48, 13 August 2022 (UTC)

Alexander Mashin (talk) 04:18, 16 August 2022 (UTC)
 * This works as expected, returning "@@@AAA":
 * Other variants display an error message about either invalid JSONpath or absent value, also as expected. I was unable to cause an unhandled exception or error 502. Please give an example, if you can.