User:Leucosticte/Junkyard


 * Could some database settings be changed to arrays, in order to store different wikis' stuff on different db servers while maintaining the structure of one big database? Or is there some other way to mitigate/get around that problem? E.g. task-based servers? Leucosticte (talk) 20:01, 14 August 2012 (UTC)

<!-- Hmm, is this Pestergaines, logged out? Well, the way I see it, the main point of the Essay namespace would be for points of view that people have vetoed from appearing in mainspace. So, for example, if someone wrote an article about why libertarians should oppose abortion, that would probably be a prime candidate for essayspace. Then again, every point raised in such an essay could also be raised in an article on arguments against abortion or the like.

It seems to me that at some point, maintaining an Austrian point of view demands biasing articles, which is not much different from creating an essay. For example, suppose someone were to show up and post, say, the Wikipedia article on. We would edit it in ways that the Wikipedia community would deem biased.

But really, they're just saying "Our bias is better than your bias." Their bias is the mainstream bias. If the mainstream were Austrian, then Mises Wiki and Wikipedia would be one and the same; we could just mirror their articles without modification and that would be the end of it. There would be no need to edit this wiki because all of those edits would be accepted at Wikipedia. Every Wikipedia article is basically an essay; it's just that we don't happen to like their bias, and they don't like ours. I put a response to the one question at MisesWiki talk:Namespaces.

Granted, I do post a lot of original thought, too. But that which seems "original" is usually just a synthesis of what already exists. And in any event, libertarian philosophy has to keep moving forward; someone has to explore uncharted areas that haven't yet been covered by others in the Austrian school.

And some stuff would seemingly be off-topic, but not necessarily! Human action encompasses quite a large field, hence Mises' wide-ranging commentary, touching on biology, psychology, sociology, etc. and providing his own interpretations of history, which again were probably mostly syntheses of what he heard and read elsewhere. Really, an Austrian school encyclopedia could cover everything in the universe (or multiverse, as the case may be) since everything is interconnected. There were people who had such hopes for Libertapedia, although regrettably that plan flopped due to insufficient labor.

I guess what I should ask is: what are some examples of the kind of content that's good Mises Wiki material? The featured articles? The Ludwig von Mises article is quite biased; e.g. it says "Ludwig von Mises was one of the most notable economists and social philosophers of the twentieth century. In the course of a long and highly productive life, he developed an integrated, deductive science of economics based on the fundamental axiom that individual human beings act purposively to achieve desired goals" and "Effectively barred from any paid university post in Austria and later in the United States, Mises pursued his course gallantly. As the chief economic adviser to the Austrian government in the 1920s, Mises was single-handedly able to slow down Austrian inflation". That's quite biased, and the latter statement is inaccurate unless Mises had the power to single-handedly create economic policy. I thought that he merely had the ear of the government, rather than actually being in control of the levers of power. Indeed, according to one account, the halt in inflation was "Thanks to Seipel." ~

When server access was granted, no parameters were established for what would be allowed or disallowed, so I assume that the typical be bold principle applies; i.e., take whatever actions seem to be in furtherance of the site mission, unless a higher authority vetoes them. Higher authority would be, presumably, the Mises Institute, as represented by David Veksler, who is in charge of the wiki (actually, he says he put himself in charge of it); the community, to whom those rights were delegated; etc. It's not really clear that those with server access have any particular place in the power scheme; according to this account, in Wikipedia's early days, "Developers formerly had an important role in the Wikipedia power structure", but that was done away with after bureaucrats and stewards were established. In any event, those with server access typically aren't particularly active in that role, because their focus is more on the technical side of things; they tend not to get into wiki-politics much. Leucosticte (talk) 20:12, 15 October 2012 (UTC)-->

Hi Jonathan, I'm experiencing some problems trying to get Extension:SacredText to work on MW 1.19. I noted them at Extension talk:SacredText. I get the impression there's some sort of minor breaking change in MediaWiki that caused this extension to stop working, but I wasn't able to figure out exactly what it might be. I looked at $parser->setHook; is parameter 2 supposed to be an array, or a callback to a public static function like SacredTextLookup::hookBible? I made those changes and it still didn't work, and in reviewing the rest of the code, I couldn't figure out where the glitch might be. I've tried this on a few different v1.19 MediaWiki installations and it hasn't worked properly on any of them. Leucosticte (talk) 23:30, 15 October 2012 (UTC)
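For comparison, here is roughly what tag registration looks like on MW 1.19 — a sketch, not the extension's actual code. The ParserFirstCallInit hook is the standard registration path, and parameter 2 of setHook is any valid PHP callback, such as an array( 'Class', 'method' ) pointing at a public static method like SacredTextLookup::hookBible. The setup-function name and tag name below are made up:

```php
// Rough sketch of tag registration on MediaWiki 1.19. The function and
// tag names here are hypothetical; SacredTextLookup::hookBible is the
// callback mentioned above.
$wgHooks['ParserFirstCallInit'][] = 'efSacredTextSetup';

function efSacredTextSetup( $parser ) {
	// Parameter 2 is a PHP callback; an array( 'Class', 'method' ) or the
	// string 'Class::method' both work if the method is public static.
	$parser->setHook( 'bible', array( 'SacredTextLookup', 'hookBible' ) );
	return true; // hook functions must return true on 1.19
}
```

One common cause of 1.19 breakage was extensions calling $wgParser->setHook directly at file scope instead of registering through ParserFirstCallInit; it may be worth checking whether that's the issue here.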

// Note: tableName() is for table names, not columns; bare column names are fine here.
if ( $chapternum == $secondChapterNum ) {
	$where = array( "st_chapter_num=$chapternum"
		. " AND st_verse_num>={$versenums[0]}"
		. " AND st_verse_num<=$secondVerseNum" );
} else {
	// Verses from the start verse to the end of the first chapter, plus
	// verses from the start of the second chapter to the end verse...
	$cond = "((st_chapter_num=$chapternum AND st_verse_num>={$versenums[0]})"
		. " OR (st_chapter_num=$secondChapterNum AND st_verse_num<=$secondVerseNum)";
	// ...plus every whole chapter strictly between the two endpoints
	foreach ( range( $chapternum, $secondChapterNum ) as $thisChapter ) {
		if ( $thisChapter != $chapternum && $thisChapter != $secondChapterNum ) {
			$cond .= " OR st_chapter_num=$thisChapter";
		}
	}
	$cond .= ')';
	$where = array( $cond );
}
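For what it's worth, the same range condition can be handed to MediaWiki's query builder instead of assembled by hand; DatabaseBase's makeList() (available in 1.19) takes care of quoting, IN-lists, and parenthesization. A rough sketch, assuming the st_chapter_num/st_verse_num columns and the variables used above:

```php
// Sketch using $dbr->makeList() instead of manual string concatenation.
// Column and variable names are taken from the surrounding code.
if ( $chapternum == $secondChapterNum ) {
	$where = array(
		'st_chapter_num' => $chapternum,
		"st_verse_num >= {$versenums[0]}",
		"st_verse_num <= $secondVerseNum"
	);
} else {
	$conds = array(
		$dbr->makeList( array(
			'st_chapter_num' => $chapternum,
			"st_verse_num >= {$versenums[0]}"
		), LIST_AND ),
		$dbr->makeList( array(
			'st_chapter_num' => $secondChapterNum,
			"st_verse_num <= $secondVerseNum"
		), LIST_AND )
	);
	if ( $secondChapterNum - $chapternum > 1 ) {
		// An array value produces an IN (...) list for the middle chapters
		$conds[] = $dbr->makeList(
			array( 'st_chapter_num' => range( $chapternum + 1, $secondChapterNum - 1 ) ),
			LIST_AND );
	}
	$where = array( $dbr->makeList( $conds, LIST_OR ) );
}
```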

if ( $res ) {
	$numRows = $dbr->numRows( $res );
	// Loop while $count < $numRows, not $numRows - 1, or the last row is dropped
	for ( $count = 0; $count < $numRows; $count++ ) {
		$retArray[$count] = $dbr->fetchRow( $res );
	}
}

// Alphabetize it
$alphaRetArray = array();
foreach ( $retArray as $thisRet ) {
	$alphaRetArray[] = $thisRet->getPrefixedText();
}
natsort( $alphaRetArray ); // natsort() sorts in place and returns a bool, so don't assign its result
// 'pageid': List deleted revs for certain page IDs (4)
// 'arid': List deleted revs for certain archive IDs (5)
// 'logid': List deleted revs for certain log IDs (i.e. the log IDs of the
//	deletion actions) (6)

if ( count( $pageids ) > 0 ) {
	$mode = 'pageid';
}
if ( !is_null( $params['arid'] ) ) {
	$mode = 'arid';
}
if ( !is_null( $params['logid'] ) ) {
	$mode = 'logid';
}

// The log_* output fields should come from the ar_log_* columns, not ar_user/ar_user_text/ar_comment
'ar_id'           => isset( $row->ar_id ) ? $row->ar_id : null,
'log_id'          => isset( $row->ar_log_id ) ? $row->ar_log_id : null,
'log_timestamp'   => isset( $row->ar_log_timestamp ) ? $row->ar_log_timestamp : null,
'log_user'        => isset( $row->ar_log_user ) ? $row->ar_log_user : null,
'log_user_text'   => isset( $row->ar_log_user_text ) ? $row->ar_log_user_text : null,
'log_comment'     => isset( $row->ar_log_comment ) ? $row->ar_log_comment : null,

$row['ar_log_id'] = $logid;
$row['ar_log_timestamp'] = $dbw->timestamp( $logEntry->getTimestamp() );
$row['ar_log_user'] = $user->getId();
$row['ar_log_user_text'] = $user->getName();
$row['ar_log_comment'] = $wgContLang->truncate( $reason, 255 );

This could be useful information, if for no other reason than as a sanity check. Sometimes my wikis get a bunch of linkspam and I wonder, "Are these people manually getting through my CAPTCHAs, or is there some sort of configuration problem that's letting them avoid the CAPTCHAs?" I wonder, though, how useful it is to see how many users give up after receiving the CAPTCHA, since a lot of those users might be spambots. To differentiate the two, it would be helpful to also record the attempted revisions somewhere, to see what they are giving up on adding. Who knows, there might be some gems among those attempts that one could recover and post to the articles.

CREATE TABLE archive (
  ar_id int unsigned NOT NULL PRIMARY KEY AUTO_INCREMENT,
  ar_namespace int NOT NULL default 0,
  ar_title varchar(255) binary NOT NULL default '',
  ar_text mediumblob NOT NULL,
  ar_comment tinyblob NOT NULL,
  ar_user int unsigned NOT NULL default 0,
  ar_user_text varchar(255) binary NOT NULL,
  ar_timestamp binary(14) NOT NULL default '',
  ar_minor_edit tinyint NOT NULL default 0,
  ar_flags tinyblob NOT NULL,
  ar_rev_id int unsigned,
  ar_text_id int unsigned,
  ar_deleted tinyint unsigned NOT NULL default 0,
  ar_len int unsigned,
  ar_page_id int unsigned,
  ar_parent_id int unsigned default NULL,
  ar_sha1 varbinary(32) NOT NULL default '',
  ar_content_model varbinary(32) DEFAULT NULL,
  ar_content_format varbinary(64) DEFAULT NULL,
  ar_log_id int unsigned NOT NULL default 0,
  ar_log_timestamp binary(14) NOT NULL default '',
  ar_log_user int unsigned NOT NULL default 0,
  ar_log_user_text varchar(255) binary NOT NULL default '',
  ar_log_comment varchar(255) NOT NULL default ''
);

CREATE TABLE /*$wgDBprefix*/archive_tmp (
  ar_id INT NOT NULL PRIMARY KEY clustered IDENTITY,
  ar_namespace SMALLINT NOT NULL DEFAULT 0,
  ar_title NVARCHAR(255) NOT NULL DEFAULT '',
  ar_text NVARCHAR(MAX) NOT NULL,
  ar_comment NVARCHAR(255) NOT NULL,
  ar_user INT NULL REFERENCES /*$wgDBprefix*/[user](user_id) ON DELETE SET NULL,
  ar_user_text NVARCHAR(255) NOT NULL,
  ar_timestamp DATETIME NOT NULL DEFAULT GETDATE(),
  ar_minor_edit BIT NOT NULL DEFAULT 0,
  ar_flags NVARCHAR(255) NOT NULL,
  ar_rev_id INT,
  ar_text_id INT,
  ar_deleted BIT NOT NULL DEFAULT 0,
  ar_len INT DEFAULT NULL,
  ar_page_id INT NULL,
  ar_parent_id INT NULL,
  ar_log_id INT,
  ar_log_timestamp DATETIME NOT NULL DEFAULT GETDATE(),
  ar_log_user INT NULL REFERENCES /*$wgDBprefix*/[user](user_id) ON DELETE SET NULL,
  ar_log_user_text NVARCHAR(255) NOT NULL,
  ar_log_comment NVARCHAR(255) NOT NULL
);

$wiki->url = "http://meta.wikimedia.org/w/api.php";
// 1) $wiki->url = "http://en.wikipedia.org/w/api.php";

/* All the login stuff. */
$user = 'LeucosticteBot';
$pass = 'RE3Uwreh';
$wiki->login( $user, $pass );
unset( $pass );

// Two options: -q log, -q rc
$options = getopt( 'q:' );
// getopt() returns an array keyed by option letter, so switch on $options['q'], not $options
$query = isset( $options['q'] ) ? $options['q'] : '';
switch ( $query ) {
	case 'log':
		$ret = $wiki->query( '?action=query&list=logevents&letype=newusers&lelimit=5'
			. '&leprop=ids%7Ctitle%7Ctype%7Cuser%7Cuserid%7Ctimestamp%7Ccomment%7Cdetails%7Ctags'
			. '&ledir=newer&format=php&lestart=2012-09-07T22:16:49Z', true );
		$logevents = $ret['query']['logevents'];
		break;
	case 'rc':
		$ret = $wiki->query( '?action=query&list=recentchanges&rcstart=2011-08-09T20:02:37Z'
			. '&rcdir=newer&rcprop=user|userid|comment|timestamp|title|ids|sizes|redirect|loginfo'
			. '|flags|patrolled|loginfo|tags&rclimit=500', true );
		$logevents = $ret['query']['recentchanges']; // recentchanges results live under this key, not 'logevents'
		break;
	default:
		die( "The options are log or rc\n" );
}

$logevents = $ret['query']['logevents'];
$dbFields = array( 'mblq_log_id', 'mblq_page_id', 'mblq_page_namespace', 'mblq_page_title',
	'mblq_type', 'mblq_action', 'mblq_user', 'mblq_user_id', 'mblq_timestamp',
	'mblq_comment', 'mblq_tags' );
$userRow = array( 'logid', 'pageid', 'ns', 'title', 'type', 'action', 'user', 'userid',
	'timestamp', 'comment', 'tags' );
$stringFields = array( 'title', 'type', 'action', 'user', 'comment', 'tags' );
$undesirables = array( '-', ':', 'T', 'Z' );
$row = 'insert into mb_log_queue ( ' . implode( ', ', $dbFields ) . ' ) values ';
$isFirstInEvent = true;
// For each user creation event in that result set
foreach ( $logevents as $thisLogevent ) {
	if ( $isFirstInEvent === false ) {
		$row .= ', ';
	}
	$isFirstInEvent = false;
	$row .= '( ';
	$isFirstInItem = true;
	// Get rid of dashes, colons, Ts and Zs in the timestamp
	$thisLogevent['timestamp'] = str_replace( $undesirables, '', $thisLogevent['timestamp'] );
	// Iterate over those database fields
	foreach ( $userRow as $thisRowItem ) {
		if ( $isFirstInItem === false ) {
			$row .= ', ';
		}
		$isFirstInItem = false;
		// If it's an array (e.g. tag array), implode it
		if ( is_array( $thisLogevent[$thisRowItem] ) ) {
			$thisLogevent[$thisRowItem] = implode( $thisLogevent[$thisRowItem] );
		}
		// If it's a string field, escape it
		if ( in_array( $thisRowItem, $stringFields ) ) {
			$thisLogevent[$thisRowItem] = "'" . $con->real_escape_string( $thisLogevent[$thisRowItem] ) . "'";
		}
		$row .= $thisLogevent[$thisRowItem];
	}
	$row .= ' )';
}
$row .= ';';
echo $row;

$con->query( $row );
$con->close(); // close() is a method call, so the parentheses are required
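Manually escaping each field works, but a mysqli prepared statement avoids the escaping-and-quoting bookkeeping entirely. A rough sketch under the same assumptions as above (the mb_log_queue table and the $dbFields/$userRow lists); binding every value as a string is a simplification, and the `...$values` argument unpacking needs PHP 5.6+:

```php
// Sketch: insert one log event per execute() with a prepared statement.
// Assumes $con is a mysqli connection and $logevents, $dbFields, and
// $userRow are as defined above.
$placeholders = implode( ', ', array_fill( 0, count( $dbFields ), '?' ) );
$stmt = $con->prepare( 'INSERT INTO mb_log_queue ( ' . implode( ', ', $dbFields )
	. ' ) VALUES ( ' . $placeholders . ' )' );
foreach ( $logevents as $thisLogevent ) {
	$values = array();
	foreach ( $userRow as $thisRowItem ) {
		$val = isset( $thisLogevent[$thisRowItem] ) ? $thisLogevent[$thisRowItem] : '';
		$values[] = is_array( $val ) ? implode( $val ) : $val;
	}
	// Bind everything as 's' (string); MySQL coerces the numeric columns.
	$stmt->bind_param( str_repeat( 's', count( $values ) ), ...$values );
	$stmt->execute();
}
$stmt->close();
```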

rc_id int NOT NULL PRIMARY KEY AUTO_INCREMENT, -- -> rcid
rc_timestamp varbinary(14) NOT NULL default '', -- -> timestamp
rc_user int unsigned NOT NULL default 0, -- -> userid
rc_user_text varchar(255) binary NOT NULL, -- -> user
rc_namespace int NOT NULL default 0, -- -> part of title
rc_title varchar(255) binary NOT NULL default '', -- -> part of title
rc_comment varchar(255) binary NOT NULL default '', -- -> comment
rc_minor tinyint unsigned NOT NULL default 0, -- -> minor (either there or not)
rc_bot tinyint unsigned NOT NULL default 0, -- -> bot
rc_new tinyint unsigned NOT NULL default 0, -- -> new
rc_cur_id int unsigned NOT NULL default 0, -- -> pageid
rc_this_oldid int unsigned NOT NULL default 0, -- -> revid
rc_last_oldid int unsigned NOT NULL default 0, -- -> old_revid
rc_type tinyint unsigned NOT NULL default 0,

-- These may no longer be used, with the new move log.
rc_moved_to_ns tinyint unsigned NOT NULL default 0,
rc_moved_to_title varchar(255) binary NOT NULL default '',

-- If the Recent Changes Patrol option is enabled,
-- users may mark edits as having been reviewed to
-- remove a warning flag on the RC list.
-- A value of 1 indicates the page has been reviewed.
rc_patrolled tinyint unsigned NOT NULL default 0,

-- Recorded IP address the edit was made from, if the
-- $wgPutIPinRC option is enabled.
rc_ip varbinary(40) NOT NULL default '',

-- Text length in characters before
-- and after the edit
rc_old_len int,
rc_new_len int,

-- Visibility of recent changes items, bitfield
rc_deleted tinyint unsigned NOT NULL default 0,

-- Value corresponding to log_id, specific log entries
rc_logid int unsigned NOT NULL default 0,
-- Store log type info here, or null
rc_log_type varbinary(255) NULL default NULL,
-- Store log action or null
rc_log_action varbinary(255) NULL default NULL,
-- Log params
rc_params blob NULL
) /*$wgDBTableOptions*/;

$namespace = array(
	'Main:',
	'Talk:',
	'User:',
	'User talk:',
	'Wikipedia:',
	'Wikipedia talk:',
	'File:',
	'File talk:',
	'MediaWiki:',
	'MediaWiki talk:',
	'Template:',
	'Template talk:',
	'Help:',
	'Help talk:',
	'Category:',
	'Category talk:'
);
$namespace[100] = 'Portal:';
$namespace[101] = 'Portal talk:';
$namespace[108] = 'Book:';
$namespace[109] = 'Book talk:';
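Since the array's sequential indices 0–15 line up with Wikipedia's namespace numbers (Main=0, Talk=1, and so on), prefixing a bare title is just an index lookup. A small sketch; the function name is made up, and treating namespace 0 as having no prefix (rather than 'Main:') is my assumption:

```php
// Prepend the namespace prefix for a given namespace number, using the
// $namespace array defined above. Mainspace (ns 0) pages carry no prefix,
// so that case is special-cased here.
function prefixedTitle( $namespace, $ns, $title ) {
	if ( $ns == 0 ) {
		return $title;
	}
	return isset( $namespace[$ns] ) ? $namespace[$ns] . $title : $title;
}

echo prefixedTitle( $namespace, 3, 'Example' ); // User talk:Example
```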

case 'log':
	$ret = $wiki->query( '?action=query&list=logevents&letype=newusers&lelimit=5'
		. '&leprop=ids%7Ctitle%7Ctype%7Cuser%7Cuserid%7Ctimestamp%7Ccomment%7Cdetails%7Ctags'
		. '&ledir=newer&format=php&lestart=2012-09-07T22:16:49Z', true );
	$events = $ret['query']['logevents'];
	$table = 'mb_log_queue';
	$fields = array(
		'mblq_log_id' => 'logid',
		'mblq_page_id' => 'pageid',
		'mblq_page_namespace' => 'ns',
		'mblq_page_title' => 'title',
		'mblq_type' => 'type',
		'mblq_action' => 'action',
		'mblq_user' => 'user',
		'mblq_user_id' => 'userid',
		'mblq_timestamp' => 'timestamp',
		'mblq_comment' => 'comment',
		'mblq_tags' => 'tags'
	);
	$stringFields = array( 'title', 'type', 'action', 'user', 'comment', 'tags' );
	break;