Extension talk:SphinxSearch/LQT Archive 1

Old discussion points relevant only to older versions of the extension, Sphinx, or MediaWiki have been moved to the archive page.

feature request: show categories in result list
It would be great to be able to list the categories of articles in search results. --Nilsja 01:29, 1 June 2010 (UTC)

Working well in MW1.5!
Everyone I have spoken to that uses our internal Wiki has nothing but positives to say about this. I really think MediaWiki should adopt Sphinx as the DEFAULT search, as the bundled one is so bad. --195.75.83.25 08:45, 6 July 2009 (UTC)


 * Thanks! Note that the next release will not use ExtensionFunctions, and it will require at least MW 1.7 (and PHP5 in general.) Also note that it was never intended for MW <1.9 - no idea how you got it to work with 1.5 :-) Svemir Brkic 19:45, 12 December 2009 (UTC)

Grouping results by Namespace
I have the extension running but since it doesn't support the weighting of results, is there a way to group the results by the namespace they belong to? -- 20:48, 14 April 2009 (UTC)


 * Version 0.7 supports the $wgSphinxSearch_index_weights array, which lets you specify a weight for each index you have. You still need to set up those indexes manually, as it does not make much sense to have that as the default setup. I will try to prepare a sample sphinx.conf for such a setup - unless somebody else beats me to it (HINT: you can have the main query include only namespace 0, and a supplemental index for other namespaces, or even each namespace separately. You will also need to have more than one incremental index, but maybe you will decide other namespaces do not need to get updated as frequently...) Svemir Brkic 04:08, 18 February 2010 (UTC)
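A minimal sketch of the weighting side of such a setup in LocalSettings.php; the index names here (wiki_main, wiki_other) are hypothetical and must match whatever you define in your sphinx.conf:

```php
// Hypothetical index names - use the ones defined in your sphinx.conf.
// A higher weight makes matches from that index rank higher in results.
$wgSphinxSearch_index_weights = array(
    'wiki_main'  => 10,  // e.g. namespace 0 (articles)
    'wiki_other' => 1,   // e.g. all other namespaces
);
```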

Running on separate machines
The installation directions seem to assume that MediaWiki and sphinxd are on the same machine. How should I configure Sphinx and the extension if MediaWiki (and its database) are hosted separately from where Sphinx is installed? --Emufarmers 04:43, 5 November 2007 (UTC)
 * You are correct. I should update the main page with these instructions. To make this extension work in your case you will need to configure sphinx.conf. In that file, modify the src_wiki_main section to specify the correct sql_host = hostname, where hostname is the name of the machine running the MySQL database for your wiki (default is localhost). This way, you can install Sphinx on the same machine as the web server (as opposed to the MySQL server), and all instructions as listed on the main page are still valid.
 * Please let me know if you have any more questions, or, for that matter, if this worked for you. --Gri6507 13:05, 5 November 2007 (UTC)
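As a sketch, the relevant sphinx.conf change might look like the fragment below; db.example.com and the credentials stand in for your actual MySQL host and wiki database settings:

```ini
source src_wiki_main
{
    type     = mysql
    # Point Sphinx at the remote MySQL server instead of localhost
    sql_host = db.example.com
    sql_user = wikiuser
    sql_pass = secret
    sql_db   = wikidb
}
```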
 * Hi, sorry for the delay. I should have been more clear: My wiki is on shared hosting, so I can't install Sphinx on its server. I run the search backend (presently Lucene, but Sphinx sounds promising) on a machine in my home. With Lucene, my backend machine SSHs into the webserver, grabs a dump of the wiki, indexes it, and then runs search queries from the webserver through the index it generates and sends the results back to the webserver. It's a rather convoluted setup (and it's even messier when it comes to updating the index), but I'm wondering if I can do anything along the same lines here. --Emufarmers 20:08, 11 November 2007 (UTC)
 * Ok, I understand your setup. The sphinx.conf file can be configured to make sphinxd run on a different machine (let's call it Machine S, for sphinx) from the machine running MySQL (let's call it Machine M, for mysql). However, in that case Machine S has to have network access to Machine M. My guess is that something similar to your present setup with SSH tunnels could be done here as well. If you are interested in trying this out, please let me know via email (see extension credits) and we could work through these questions then. --Gri6507 22:17, 11 November 2007 (UTC)

I am very interested to know how this configuration has worked for users. I have read many articles that state the problem of "I run a wiki, but it's through a shared hosting and installing the Sphinx daemon is out of the question". This is also my case - so in an effort to get better search capabilities than the standard search, this appears to be one of my only options. A few questions I have: how much traffic is involved when the indexing is performed? Is it a problem to have the daemon going across the wire to access the SQL database for indexing? I have concerns that it will drastically increase my bandwidth usage. Second, I think it would be a great addition to the extension to redirect/use the standard search if the sphinxd cannot be found (if the daemon machine goes offline). Just some ramblings but I am interested in anyone's thoughts - Blac0177 06:04, 12 December 2008 (UTC)
 * Daemon does not do the indexing. That is done with a separate process which you run on a schedule. Daemon simply searches the index and returns the results. Search requests are small, and search results depend on the actual data being searched. You could have a replica of your database on the machine that runs the indexer and the daemon - just as it is described above in the Lucene example. You just need to make sure that your web server can communicate on the specified host and port to your sphinx daemon (and that nobody else can, as a security precaution.) Your bandwidth usage will depend on two things - the way you use to replicate the database, and the amount of search queries you get.


 * For replication, if it is MySQL and you can turn on binary logs, you can just replay those logs on your local copy. This is in case the database is too big to copy entire thing over for every indexer run. You could also dump just the records modified since the last run, since indexer does not really need your entire database - it only needs those tables that it actually indexes. Svemir Brkic 18:57, 17 January 2009 (UTC)
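As an illustration of that replay-the-logs approach (the log file, database, and config paths here are made up), the sync-and-index step could look like:

```shell
# Replay changes recorded in the binary log onto the local replica,
# then rebuild the Sphinx index against it (names/paths are hypothetical).
mysqlbinlog binlog.000042 | mysql -u wiki -p wikidb
indexer --config /usr/local/etc/sphinx.conf wiki_main --rotate
```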

Showing X of Y documents in search results, but document links not showing

 * "Displaying 0-0 of 0 matches for query specifi* retrieved in 0.000 sec with following stats: specifi* found 8 times in 8 documents".


 * Your query indicates that you are using a "star" search, which is not enabled by default in sphinx (it is a new, almost undocumented feature.) Our default sphinx.conf does not enable it either. Maybe you are using a different config file for command line search? To enable it in the config file used by the extension, you need to add this to the main index section:

min_infix_len = 1
enable_star = 1
 * After this, you need to do a full reindex and restart the searchd (--rotate alone is not enough when you change the config file.) Svemir Brkic 20:57, 2 February 2008 (UTC)
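The reindex-and-restart step described above might look like this on the command line; the config path is an assumption, so adjust it to your install:

```shell
# Full rebuild of all indexes defined in sphinx.conf
indexer --config /usr/local/etc/sphinx.conf --all
# --rotate alone is not enough after a config change:
# stop searchd and start it again so it reads the new settings
searchd --config /usr/local/etc/sphinx.conf --stop
searchd --config /usr/local/etc/sphinx.conf
```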

Searching multiple wikis
I currently have 3 wikis indexed with Sphinx. The search works well, but it is returning results for all of them. I've set the "$wgSphinxSearch_index = X"; line in the SphinxSearch.php. Am I missing something? --N0ctrnl 20:28, 24 March 2008 (UTC)


 * I think this can be achieved. You need to modify SphinxSearch_body.php to search only the specified index. Line 294 should change to:

$res = $cl->Query($search_term, $wgSphinxSearch_index);

Make sure the $wgSphinxSearch_index variable is set to the main index in LocalSettings.php or somewhere. Also, assuming you are using the main+delta scheme outlined on the main page, you need to roll the two indexes into one. Something like (stripped paths for brevity-haha):

0 9,15,21 * * * indexer --quiet --config sphinx.conf wiki1_incremental wiki2_incremental --rotate && indexer --quiet --config sphinx.conf --merge wiki1_main wiki1_incremental && indexer --quiet --config sphinx.conf --merge wiki2_main wiki2_incremental

The sphinx docs say that merging the indexes does take time, but is normally faster than reindexing. --UnwashedMeme 22:41, 22 October 2009 (UTC)


 * Playing around with it a bit more: it works better if you sleep for a few seconds (I'm using 5) between running the indexer on the 2 incremental indexes and then running the merge command. Otherwise the .new version of the index isn't there yet when the indexer sends searchd the SIGHUP that causes it to rotate, and you don't actually get the updated index until the next time cron rotates the indexes. I don't think that sleep is related to index size; i.e. a large db that takes longer to index would still only need to sleep long enough to give the kernel a chance to get things synced again? UnwashedMeme 19:28, 23 October 2009 (UTC)


 * New in 0.7: $wgSphinxSearch_index_list lets you specify the list of indexes to search. You can have multiple main and delta indexes on the machine, and each wiki can define its own pair to search. You can also combine several index files and assign different weight to each of them, using $wgSphinxSearch_index_weights array. Svemir Brkic 01:06, 18 February 2010 (UTC)

Search through various indexes are failing to display combined results
On our 1.15.1 wiki system we are using SphinxSearch version 0.7, and the query log shows that both indexes are searched:
 * 0.023 sec [ext/1/rel 7 (0,15)] [wiki_main,wiki_incremental,research_main,research_incremental] Cotterrell
However, the search on one wiki system only shows results from [wiki_main,wiki_incremental]; results from [research_main,research_incremental] are not shown, even though:
 * [research_main,research_incremental] are indexed, and
 * SphinxSearch.php has been maintained with $wgSphinxSearch_index_list = "wiki_main,wiki_incremental,research_main,research_incremental"; and $wgSphinxSearch_index_weights has been set.

Changing to the other wiki system (1.16beta) and searching for 'kotler' results in "Displaying 61-75 of 87 matches for query kotler retrieved in 0.008 sec with these stats: kotler found 200 times in 107 documents. Above numbers may include documents not listed due to search options." but does not show any result in the list (as those terms only exist on the 1.15.1 system). We conclude that the search term is found in the index files, but something hinders results from one wiki being displayed on the other wiki's result page.

Is there an option to set a split or combined display of search results, and in case of a combined display, also render the right URL to an article depending on the server?

Any suggestions how to solve this would be appreciated --MWJames 01:35, 27 May 2010 (UTC)

^^ **I can confirm this problem with my setup as well** --01:35, 13 June 2010 (UTC)

Facing the same issue: I have about 5 wikis, and only the results of the main wiki are being displayed as links. --AutoStatic 15:10, 23 September 2010 (UTC)


 * Sphinx is not storing all the information needed to display the results. It only stores IDs and some other specific attributes, such as the namespace. String content is only indexed - stored in a special manner that allows fast searching. When displaying search results, Sphinx must have access to the original database to be able to look up the titles and show the links. Svemir Brkic 17:25, 23 September 2010 (UTC)


 * What do you mean exactly by 'have access to the original database'? Where can I set these access rights? So it probably has nothing to do with the way I set up my wikis (http://www.steverumberg.com/wiki/index.php?title=WikiHelp_-_Method_Two)? --AutoStatic 12:26, 21 October 2010 (UTC)


 * The SphinxSearch extension asks for search results from the index. It does not know how those results got into the index; it assumes they all came from the wiki database the extension currently runs on. In order to display search results, the extension needs to make additional database queries to the database the search results originally came from. This is not a matter of access rights. The extension was not designed to support multiple databases in this way. Such support would have to be added by someone - perhaps by adding an array that maps each index to a specific database. Svemir Brkic 15:44, 13 November 2010 (UTC)

match all words as default
Not sure if this is something that is controlled by Sphinx or not, but it would be better for me if "match all words" were the default, i.e. matching mode set to SPH_MATCH_ALL. By default it is SPH_MATCH_EXTENDED, but that does not work properly, as http://www.sphinxsearch.com/doc.html#extended-syntax explains --1 April 2008


 * You can set things such as $wgSphinxSearch_mode in LocalSettings. The reason we use SPH_MATCH_EXTENDED is to be able to modify the query internally and do "match all" vs. "match any" with a radio box instead of teaching users the Sphinx query syntax. Svemir Brkic 03:08, 6 April 2008 (UTC)
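As a sketch, the override in LocalSettings.php might be the fragment below; the setting name comes from the reply above, and SPH_MATCH_ALL is a constant defined by the bundled sphinxapi.php:

```php
// Make "match all words" the default matching mode instead of the
// extended syntax (trades the extension's radio-box handling for
// Sphinx's built-in all-words matching).
$wgSphinxSearch_mode = SPH_MATCH_ALL;
```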

@page_title DOES NOT WORK
For some reason @old_text works, but @page_title never returns any results... maybe it has something to do with the new releases of Sphinx. --207.96.208.130 22:22, 1 April 2008 (UTC)


 * New in 0.7: 'match titles only' checkbox on the search page. Svemir Brkic 01:06, 18 February 2010 (UTC)

Is there any way to prevent Sphinx from indexing particular pages?
I realize this runs counter to what most people would want, but some pages don't need to be indexed. I've made some reasonable searches here and on the Sphinx site, and believe this is more relevant to a MediaWiki discussion than Sphinx in general. Jon Doran, 9 May 2008


 * You could modify the query in sphinx.conf to filter out any pages you do not want. It could be done based on namespace, a join with some other table (e.g. categorylinks,) or some new field or table you would create yourself. Svemir Brkic 01:23, 10 May 2008 (UTC)
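A sketch of the categorylinks variant of that filter in sphinx.conf; the category name "Private" is just an example, and the base query is the one from the default setup:

```ini
source src_wiki_main
{
    # ... connection settings as before ...
    # Exclude pages in the "Private" category (example name) by
    # filtering against MediaWiki's categorylinks table
    sql_query = SELECT page_id, page_title, page_namespace, old_id, old_text \
                FROM page, revision, text \
                WHERE rev_id=page_latest AND old_id=rev_text_id \
                AND page_id NOT IN \
                    (SELECT cl_from FROM categorylinks WHERE cl_to='Private')
}
```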


 * Thanks for the suggestions. I did not consider the query, but now that you mention it, there is a lot I can do with it.  Jon Doran, 10 May 2008

More Windows Install Issues
Help for windows users.

Step 5 - Start Sphinx Daemon
To create the windows service ... C:\sphinx-0.9.9-win32\bin> searchd.exe --install  --config C:\Sphinx\sphinx.conf --servicename SphinxSearch

Start the service c:\> sc start SphinxSearch

If it fails, double check your sphinx.conf file. Make sure the paths are set properly in the searchd section.

Step 6 - Configure Incremental Updates
Use the Windows Task Scheduler, found under Accessories | System Tools | Scheduled Tasks

Create two jobs. One to run once a day and one to run as needed for incremental updates. If you use the scheduler there will not be any command pop ups.

If you get "ERROR: index 'wiki_main': column number 1 has no name." when trying to index, copy libmysql.dll from MySQL 5.0.37 into Sphinx bin directory. For some reason 5.1 version does not work with Sphinx on Windows.

Incremental Update and Windows Task Scheduler
For the index update, the Windows Task Scheduler has to run indexer.exe, so one solution to avoid unnecessary command pop-ups in Windows is:
 * 1) Create one batch file wiki_main.bat that contains  \indexer.exe --quiet --config \sphinx.conf wiki_main --rotate
 * 2) Create one batch file wiki_incremental.bat that contains  \indexer.exe --quiet --config \sphinx.conf wiki_incremental --rotate
 * 3) Using the Invisible Batch File trick, create a " \bin\invisible.vbs" file that contains CreateObject("Wscript.Shell").Run """" & WScript.Arguments(0) & """", 0 - this runs the batch file invisibly
 * 4) After this, the task scheduler can run a command like  \system32\wscript.exe " \bin\invisible.vbs" " \Sphinx\bin\wiki_incremental.bat"

Problems on Windows Vista / Windows 7
We are testing searchd on Vista and Windows 7, where the searchd daemon does not return any search results. Since the test PHP scripts that come with the standard Sphinx distribution are not working either, we are guessing it is a problem inherent in Sphinx itself and not related to the MediaWiki/SphinxSearch interface. We created a forum post (Sphinx forum post/searchd and Windows) for any follow-ups. --MWJames 18:00, 4 May 2010 (UTC)

127.0.0.1 instead of localhost
As mentioned elsewhere (on the talk page?), it is important to use IP address 127.0.0.1 instead of localhost in the sphinx.conf files. Found this hint after lengthy trials. It should be mentioned more prominently! Only this change made my installation successful and got it working with MediaWiki (on Windows XP).

On the command line localhost works fine.

Question when searching for IP's
We use the wiki here in an IT setting, so many of our articles refer to IP addresses. The default search does not find any variation of IPs when searched (for example 102., 102.160.2.2, 106., etc.) Can anyone tell me if this search does a better job with this? Thanks. --Comalia 19:37, 15 July 2008 (UTC)
 * It would certainly do a better job than MySQL full-text index - even in default configuration. You could also tweak it further, but I am not sure I fully understand what exactly you need. If you provide some specific examples of data and search strings that should match it, I can test it. Svemir Brkic 22:45, 15 July 2008 (UTC)

Sure. Say that I have a few articles that have the line of text 192.165.1.0 in them. So, if searching for 192.165.1.0, would it return any results? Or variations of it, such as "192.165"? --Comalia 13:41, 18 July 2008 (UTC)
 * Yes, both searches will match that article. It will consider 192, 165, 1, and 0 as separate "words". You can tell it whether to search for all those words or any of them (it is an option on the search page, but you can also change the default.) Since proximity of the matched words is an important factor, you will get the articles that have entire IP in them first. Svemir Brkic 16:46, 18 July 2008 (UTC)

Installing issues
I am trying to install Sphinx 0.9.8 on Linux RHEL with MySQL. I did the ./configure and everything seemed fine. Then when building the binaries with make, I get the following:

sphinx.h:54:19: error: mysql.h: No such file or directory --Comalia 19:50, 22 July 2008 (UTC)


 * SOLUTION
 * I had a similar issue on FC9. I did "yum install mysql-devel" and that fixed it. Try installing the mysql-devel version for your mysql install and then building sphinx. --5 August 2008


 * SOLUTION
 * Same here. I'm a debian user. aptitude install libmysql++-dev libmysqlclient15-dev checkinstall corrected the problem. Much Love

init.d script for FC users
Here is a chkconfig-compatible script I created for FC users. It is a modification of a script by Vladimir Fedorkov. This script assumes you've put the pid file (configured in sphinx.conf) in /var/run for SELinux purposes. Speaking of SELinux, you'll need to add port 3312 to the http port context.

Keyword Priority in Query String
It seems that the order of keywords actually changes the results. In my case, if I send a space-delimited list of keywords, I get different search results depending on the position of my most important keywords. Am I missing a setting that prioritizes keywords based on their position in the query string? Cedarrapidsboy 13:45, 26 September 2008 (UTC)


 * Yes, the order of keywords matters, as well as the order and proximity of the matches in the searched text. That is a function of Sphinx itself, but you can affect it by changing the matching mode in SphinxSearch.php. Svemir Brkic 17:01, 26 September 2008 (UTC)


 * Thanks. Is the order of keywords documented?  I mean to say, where can I find information on where to put my most important words, and then my least important words?  I understand the matching modes, but haven't read any mention of keyword position (with no operators joining them) affecting priority.  I'm sure I'm likely blind. 12.207.221.230 00:23, 27 September 2008 (UTC)

Feature Request: Excluding Selected Categories from search
It would be useful to filter results not only by pointing to desired categories, but also by setting undesired categories: $cl->SetFilter('category', $categories_to_exclude, true);

The Search Form will be like this:

            Include   Exclude
Category1    [x]       [ ]
Category2    [ ]       [ ]
...
Category7    [ ]       [x]
Category8    [ ]       [ ]

--StasFomin 14:59, 10 November 2008 (UTC)


 * New in 0.7: $wgUseExcludes Svemir Brkic 01:59, 18 February 2010 (UTC)
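Going by the reply above, enabling it should be a one-line LocalSettings.php setting (a sketch; the boolean value is an assumption):

```php
// Enable the category-exclude option on the search form (extension 0.7+)
$wgUseExcludes = true;
```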

Wikipedia
Can anyone tell me why Wikipedia has not installed this extension? According to the main article, it works with Wikipedia. --Robinson Weijman 09:59, 21 January 2009 (UTC)


 * Wikipedia already uses a Lucene search engine. —Emufarmers(T 11:53, 21 January 2009 (UTC)


 * OK, thanks. So when and why would SphinxSearch be better than Lucene-Search - and vice versa?  --Robinson Weijman 07:34, 22 January 2009 (UTC)


 * Lucene has more features and is a more stable, mature product. It also needs more resources and is harder to install and maintain. Sphinx is still evolving - both the search engine itself and the MediaWiki extension. It may not have all the features of Lucene yet, but it is much easier to set up and try out. If it does not do something you need, by all means go for Lucene. Svemir Brkic 14:00, 22 January 2009 (UTC)


 * Thanks both of you for your feedback.--Robinson Weijman 08:57, 23 January 2009 (UTC)

Searching multiple namespaces as a default
I've had problems getting this working. The following has been specified in LocalSettings.php with no luck:

$wgNamespacesToBeSearchedDefault = array(
    NS_MAIN  => true,
    NS_HowTo => true,
);

Will SphinxSearch pick this up? I've tried applying this before and after the SphinxSearch extension loading lines in LocalSettings.php. CaliVW78 06:07, 8 February 2009 (UTC)


 * SphinxSearch will pick it up the same way the default MediaWiki search picks it up - for new and for anonymous users. Existing users already have their own default set of namespaces saved in their user preferences and have to change it manually. See Manual:$wgNamespacesToBeSearchedDefault for details (and how to update existing user preferences with a maintenance script.) Svemir Brkic 12:58, 8 February 2009 (UTC)
 * Thanks for the reply; I clearly had missed some items when looking at the $wgNamespacesToBeSearchedDefault page earlier. I am, however, still running into an issue where those who are not logged in are not getting search results from the additional namespace that I want. Below is the code I have in my LocalSettings.php; the new namespace I want to be searched is "HowTo".

$wgNamespacesToBeSearchedDefault = array(
    NS_MAIN  => true,
    NS_HowTo => true,
);

In addition to that, this is what was placed in LocalSettings.php to create that namespace...

$wgExtraNamespaces = array(
    100 => "HowTo"
);

define('NS_HowTo', 100);

Thanks again for the help. CaliVW78 13:12, 17 February 2009 (UTC)
 * Ok, here's how I got this to work. Adding a line to $wgNamespacesToBeSearchedDefault was clearly NOT working.  Instead, I added the following to LocalSettings.php to define the default user option.  "100" is the ID that was given to my newly created namespace that I wanted to have included in the search results. CaliVW78 09:16, 18 March 2009 (UTC)

$wgDefaultUserOptions = array(
    'searchNs100' => 1,
);
 * This $wgDefaultUserOptions worked with the search option but caused trouble with our SMW (Semantic MediaWiki). We had to rewrite the statement as follows (namespaces 100-109 are reserved in SMW, so we started at 150 respectively) --MWJames 11:24, 25 June 2010 (UTC)

$wgDefaultUserOptions['searchNs154'] = 1;

Orphaned pages not indexed?
It seems SphinxSearch is unable to find orphaned pages in my wiki. Only things linked somehow (also indirectly) from the Main Page can be found. Is that something that can be configured away? I am using sphinx-0.9.9-rc1, SphinxSearch extension 0.6.1, MediaWiki 1.13.3. Thanks! — User:Trohlfing Feb 8, 2009


 * SphinxSearch extension is not using wiki links in any way. It is running direct database queries, as specified in sphinx.conf. You need to check the namespace of those orphaned pages. Svemir Brkic 21:44, 8 February 2009 (UTC)


 * There is a case in which it only finds linked-to pages. If a page title has spaces in it, these are stored as underscores in the MediaWiki database (in the "page" table).  Sphinx doesn't know about this, so if you search for a page called "Sphinx search engine", by typing "Sphinx search engine" instead of "Sphinx_search_engine", you'll only see the links to that page -- so, if your page is orphaned (no pages reference your page title using spaces), you won't get the result.  User:mphasak 23:30, 9 February 2009 (UTC)


 * Check your sphinx.conf file. If it has "_-> ," in the charset_table, you should remove it and reindex. Svemir Brkic 02:34, 10 February 2009 (UTC)
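In other words, the fix is dropping the underscore mapping from charset_table and then reindexing; a trimmed before/after sketch (your actual charset_table will contain more ranges than shown here):

```ini
# Before (problematic): underscores folded to spaces, so "Sphinx_search_engine"
# as stored in the page table never matches a search for "Sphinx search engine"
# charset_table = 0..9, A..Z->a..z, _-> , a..z

# After: remove the "_-> ," mapping, then run a full reindex
charset_table = 0..9, A..Z->a..z, a..z
```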


 * Perfect!! Thanks, Svemir.  User:mphasak 19:25, 10 February 2009 (UTC)

How to Get the Did you mean feature working
I'm trying to get the Did you mean feature of Sphinx working; I have Aspell installed on Ubuntu. But when I add the lines that tell Sphinx to use it, I see no difference. I have re-indexed, still to no avail. Does anybody have any troubleshooting advice? I can't seem to find much online about how to get it working. Is there anything I can do to find where the problem lies? --Trickedicky 11:51, 9 February 2009 (UTC)

Confirm that Aspell command access is working in Windows?
I've got a similar issue. I'm trying to use the Aspell command line with SphinxSearch (which I love, btw). Aspell tests fine on its own, but it doesn't offer any suggestions for spelling on my wiki. Is there a special format for specifying the path to Aspell when hosting on Windows? 130.234.189.190 16:49, 23 February 2009 (UTC)

Solution
To enable Aspell to work within Windows, open the LocalSettings.php file and do this before you include SphinxSearch.php:

P. S. Above solution did not work, until I added the php_pspell.dll to the php.ini and installed the aspell-15.dll in the WINDOWS/system32 dir. See this guide. Then I used the config from this guide, and then Did You Mean finally worked.

More info
I got this working on Windows 2008 on 8/27/2010. I wanted to mention a few things I uncovered.

The php_pspell.dll appears to no longer be supported, and they are moving towards the Enchant library instead. When I downloaded the PHP 5.3.3 NTS package, the php_pspell.dll file was not included. So you can't use that DLL to call Aspell from PHP, and must use the command-line calls to Aspell instead.

When using the personal dictionary, it is not just a list of words. It needs to have a header line. http://aspell.net/man-html/Format-of-the-Personal-and-Replacement-Dictionaries.html So this is my test personal dictionary:
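A minimal personal dictionary in that format looks like the sketch below: the mandatory header line gives the format version, language, and word count, and the words here are just placeholder examples:

```text
personal_ws-1.1 en 2
mediawiki
sphinxsearch
```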

For whatever reason, I had to specify the $wgSphinxSuggestMode and $wgSphinxSearchPersonalDictionary settings both before and after the require_once. So here is my complete config:
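A sketch of such a LocalSettings.php fragment, with the two settings duplicated around the require_once as described; the dictionary path and boolean value are assumptions:

```php
// Settings before the include (values here are illustrative)
$wgSphinxSuggestMode = true;
$wgSphinxSearchPersonalDictionary = "C:/Sphinx/personal.dict";

require_once("$IP/extensions/SphinxSearch/SphinxSearch.php");

// ...and repeated after the include, which was necessary in this setup
$wgSphinxSuggestMode = true;
$wgSphinxSearchPersonalDictionary = "C:/Sphinx/personal.dict";
```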

If you use stopwords, they will not fall out of the index until you re-index. Might be obvious to some, but it doesn't hurt to say it.

Also, the extension page mentions this: "SphinxSearch will create a new restricted access special page called Wiki-specific Sphinx search spellcheck dictionary." ... but I have no idea what this is talking about. The restricted special page never showed up, and I didn't see any code that looked like it extended SpecialPage besides the special Sphinx search page.

Case Sensitivity
Is SphinxSearch case sensitive? --Robinson Weijman 11:42, 10 February 2009 (UTC)


 * Yes, if you use default sphinx.conf that comes with it. If you want to change that, remove the "A..Z->a..z, " part from the charset_table setting. Svemir Brkic 13:53, 10 February 2009 (UTC)
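A trimmed charset_table sketch of that change (your full mapping will contain more ranges; note that A..Z must stay listed as valid characters, or uppercase letters become separators):

```ini
# Default (case-insensitive): uppercase folded to lowercase at index time
charset_table = 0..9, A..Z->a..z, _, a..z

# Case-sensitive variant: drop the "A..Z->a..z, " mapping, keep A..Z
charset_table = 0..9, A..Z, _, a..z
```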


 * Thanks for the prompt response! --Robinson Weijman 15:16, 10 February 2009 (UTC)

Search Results without Wikicode/Wiki markup
Does anyone have experience in excluding wikicode / wiki markup from being displayed in the search results? I couldn't find anything on the Sphinx site. Thanks in advance, labalena 149.211.153.96


 * There is no easy way. Sphinx does not know anything about wiki markup - it can only be told to strip HTML when indexing. You could in theory keep a separate copy of active revisions, with wiki markup removed with some script, and index that instead of the real content. Svemir Brkic 02:32, 18 March 2009 (UTC)

REDIRECT
Q: How do I configure Sphinx so that redirect pages do not show up in my search results? I have a "Redirect page" containing #REDIRECT pointing to:
 * Page title

A: You can modify the query in sphinx.conf to filter out any pages you do not want. Svemir Brkic 02:30, 18 March 2009 (UTC)

Q2: How can I filter out all pages containing this: "#REDIRECT [["

A2: Change this:

sql_query = SELECT page_id, page_title, page_namespace, old_id, old_text \
            FROM page, revision, text WHERE rev_id=page_latest AND old_id=rev_text_id

To something like this:

sql_query = SELECT page_id, page_title, page_namespace, old_id, old_text \
            FROM page, revision, text WHERE rev_id=page_latest AND old_id=rev_text_id AND page_is_redirect=0

Svemir Brkic 17:39, 19 March 2009 (UTC)

No Result Page
I'm running on Windows and MediaWiki 1.14. The daemon is started, and I can use the command-line search, but when I try to search using the special page, nothing happens. The page simply appears to reload. There are no messages in Event Viewer, the query.log file shows my query terms, and there are no messages in searchd.log. I have no idea why I'm not getting any result pages. Anyone able to give me a clue?

Also, the service appears to be listening since I was able to connect via telnet to localhost at 3312. I haven't changed any of the properties in the SphinxSearch.php page.

Running through the options in SphinxSearch.php I noticed that the $wgSphinxSearch_index value didn't match my Sphinx.conf. I've corrected that but still no search results. The URL in the address bar doesn't even change after hitting the search button.


 * Is Sphinx set to be the default search? Whatever it is now, try the other option. Also make sure php.ini is set to display or log errors and watch for PHP errors or warnings. Svemir Brkic 02:36, 18 March 2009 (UTC)


 * 207.126.248.6 23:54, 22 April 2010 (UTC) I ran into this issue too with Windows 2008 Server R2. I was able to resolve it by changing 'localhost' to '127.0.0.1' in SphinxSearch.php. See this thread for more info
 * --80.216.253.12 22:00, 21 July 2010 (UTC) Had the same issue running an XAMPP setup on a Windows Vista machine, and this resolved my problems as well.

Special Pages broken
The SVN version 73342 of SphinxSearch.php has been extended to give a choice of whether to use SphinxSearch, SphinxMWSearch, or not to replace the MediaWiki search. This breaks the Special:SpecialPages list. The following error is returned: Fatal error: Class 'SphinxSearch' not found in [install path of mediawiki]\includes\SpecialPage.php on line 390

To fix this, add the $wgAutoloadClasses['SphinxSearch'] expression just below the } else {, which should be line 40 (Rev. 73342, 19-Sep-2010):

30: if ( $wgSearchType == 'SphinxMWSearch' ) {
31:     $wgAutoloadClasses['SphinxMWSearch'] = $dir . 'SphinxMWSearch.php';
32: } else {
33:     if ( $wgSearchType == 'SphinxSearch' ) {
34:         $wgAutoloadClasses['SphinxSearch'] = $dir . 'SphinxSearch_body.php';
35:         $wgDisableInternalSearch = true;
36:         $wgDisableSearchUpdate = true;
37:         $wgSpecialPages['Search'] = 'SphinxSearch';
38:         $wgDisableSearchUpdate = true;
39:     } else {
-->         $wgAutoloadClasses['SphinxSearch'] = $dir . 'SphinxSearch_body.php';
40:         $wgExtensionAliasesFiles['SphinxSearch'] = $dir . 'SphinxSearch.alias.php';
41:         $wgSpecialPages['SphinxSearch'] = 'SphinxSearch';
42:     }
43: }

This error cannot be observed when replacing MediaWiki's search with SphinxSearch.


 * Thank you very much. Fixed in revision 73790 by moving the line just before that second if. Svemir Brkic 19:53, 26 September 2010 (UTC)

Handling of HTML tags
There are problems with the current handling of HTML tags: The  tags that highlight the match are inserted into the result before the result is run through , requiring   to exclude the   tag. This has potential to cause problems, when  tags are used in Wiki pages.

Furthermore, strip_tags() gets confused (and removes a lot of wanted content) by input like the following, which is likely to appear on wiki pages. --Patrick Nagel 09:48, 8 April 2009 (UTC)
 * 3<4
 * Run mail who@ev.er < text on the shell to send an e-mail containing the contents of the file text to who@ev.er.

In LocalSettings.php try adding $wgSphinxSearch_host = "127.0.0.1";

sphinxapi.php - Permission Denied
I've installed extension 0.6.1 and Sphinx 0.9.8.1 with MediaWiki 1.14 on CentOS 5.2. When I uncomment the line require_once( "$IP/extensions/SphinxSearch/SphinxSearch.php" ); in LocalSettings.php I get the following error:

Warning: require_once(/var/www/mediawiki/extensions/SphinxSearch/sphinxapi.php) [function.require-once]: failed to open stream: Permission denied in /var/www/mediawiki/extensions/SphinxSearch/SphinxSearch.php on line 20

Fatal error: require_once [function.require]: Failed opening required '/var/www/mediawiki/extensions/SphinxSearch/sphinxapi.php' (include_path='.:/var/www/mediawiki:/var/www/mediawiki/includes:/var/www/mediawiki/languages') in /var/www/mediawiki/extensions/SphinxSearch/SphinxSearch.php on line 20

Permissions and ownership are the same as for all the other extensions, which work fine. When I comment out require_once( dirname( __FILE__ ) . "/sphinxapi.php" ); in SphinxSearch.php, the extension is loaded and listed on the special pages, although it can't be used to search. Help is much appreciated.


 * Are you sure permissions are exactly the same on sphinxapi.php file as on the other files in the same folder? There has to be some difference. 17:01, 12 May 2009 (UTC)

They are the same, as noted below:

-rwxr--r-- 1 root root  2217 May  4 08:09 ExtensionFunctions.php
-rwxr--r-- 1 root root 34191 May 11 13:54 sphinxapi.php
-rw-r--r-- 1 root root 14081 May  7 08:50 sphinx.conf.bak
-rwxr--r-- 1 root root 22472 Nov 10 18:59 SphinxSearch_body.php
-rwxr--r-- 1 root root  1758 Nov 10 19:00 SphinxSearch.js
-rwxr--r-- 1 root root  8616 Nov 10  2008 SphinxSearch_PersonalDict.php
-rwxr--r-- 1 root root  4222 May 11 13:59 SphinxSearch.php
-rw-r--r-- 1 root root  4287 May 11 13:11 SphinxSearch.php.bak
-rwxr--r-- 1 root root  5534 Nov 10  2008 SphinxSearch_spell.php
-rw-r--r-- 1 root root  1456 Apr  6  2008 spinner.gif

Thanks, J

Sorting by namespace
Is this possible? I'm not able to find a clear way to do it in the documentation, but am looking for some way to put one of our existing namespaces at the top of all the other hits. Any ideas? As always, thanks in advance. CaliVW78 13:39, 18 May 2009 (UTC)
 * See above --Svemir Brkic 19:58, 12 December 2009 (UTC)

Tweaking the Search Excerpts?
Does anyone know how I might adjust the excerpts displayed with the search returns? I find 5 lines to be a bit too long, and would like to experiment with fewer. Thanks! --John Thomson 03:13, 11 June 2009 (UTC)


 * In SphinxSearch_body.php, line 343 or so, change "limit" => 400 to a smaller number. I guess we should make it configurable, but there it is for now. Svemir Brkic 13:22, 11 June 2009 (UTC)


 * Ah! Many thanks, Svemir, that did the job nicely!  (I went to 250, and am very pleased!) --John Thomson 22:14, 11 June 2009 (UTC)

Pages made from templates/transcluded pages do not rank well
So far Sphinx search produces the best results of all the search engines I have tried. Recently I have noticed that some of my templates and sub-pages appear higher in the search results than the main page that includes them. I understand that the indexer does not parse any of the wiki text and only looks at a single page entity. Is it possible to specify groups of related pages so that my main page contains all of the text of its sub-pages?

For example, if Page-main includes Page-info and Page-index, then I want all the text on Page-info and Page-index to count towards the results for Page-main. I can even go as far as guaranteeing that pages follow a strict naming convention: -main, -info, -index.

Or is there some way for the indexer to know that -index is 'linked to' by -main and include the results for -index in main.

Any suggestions are welcome.

Result weighting
Hello,

I have a little problem with the sortings in the result page.

For example, if I search for mysql I get every entry, but the sorting is terrible. I have several pages with mysql in the text and as part of the page_title.

I would like title matches (page_title) to rank ahead of body-text matches, so I set the corresponding weights in SphinxSearch.php.

I also tried sql_attr_uint = page_title in my sphinx.conf, but it didn't help at all.

Settings in sphinxapi.php: matching mode is set to extended, sort mode to relevance. I tried every type, but this is the best so far. Group mode is set to SPH_GROUPBY_ATTR and the ranker is the default one.

I would be very pleased if someone could help me.

Greetings,

Tom


 * sql_attr_uint is only for numeric values; it is used for filtering. After you adjust this and the weights, make sure to rebuild the index and restart the daemon, just in case. In our case, we always get the title matches first. Our weights are 'old_text'=>1, 'page_title'=>100, with extended match mode, and we leave sorting and grouping at their defaults (we do not set them at all.) Svemir Brkic 13:16, 8 September 2009 (UTC)
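As a sketch of that weight setup in configuration form (version 0.7+ style, set after including SphinxSearch.php; check the extension page for the exact variable name on your version):

```php
// Title matches rank far above body matches.
$wgSphinxSearch_weights = array( 'old_text' => 1, 'page_title' => 100 );
```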


 * Hi, thanks for the quick answer. I turned it back to default and rebuilt the index after the changes, but I still get, e.g., a page titled Statistics in front of a page called MySQL5. I guess it has something to do with the weight of occurrences of the word in the body, but since I defined the weight of page_title higher than old_text, I would expect it to be the other way around. Greetings, Tom.


 * What versions of sphinx, the extension, etc. do you use? Svemir Brkic 03:12, 9 September 2009 (UTC)

Hi,

The base Sphinx install is sphinx-0.9.8.1 and the extension is SphinxSearch-0.6.1. PS: MediaWiki is version 1.15.0. Tom

SQLite Configuration
First off, many thanks for the extension. For me it is a real problem-solver.

With a few changes to the sphinx config file and a couple of helper scripts (php), SphinxSearch works with SQLite based MediaWikis, too. Thought this might be of interest, seeing as vanilla search does not work with SQLite in the current stable MediaWiki (1.15.1).

My setup:


 * MediaWiki 1.15.1
 * PHP 5.1.6
 * SphinxSearch 0.6.1
 * Sphinx 0.9.8.1

First, the helper scripts. These are php scripts that are run by the sphinx indexer and do the following:


 * Connect to the MediaWiki SQLite database file using php_pdo (which must be available to run MW on SQLite anyhow)
 * Run the indexer queries
 * Translate the results into XML for sphinx to process as an xmlpipe2 type source

These can go anywhere you want - I put mine in the ./data/ directory alongside the wikidb.sqlite file. They should probably go in ./maintenance where all the other command-line PHP scripts live.

This is the main update script:

/path/to/wiki/data/sphinx_sqlite_main.php:

This is the incremental update script. The only difference is the extra AND clause in the query. Probably these should be combined, but I didn't want to bother with command line options. I'm lazy.

/path/to/wiki/data/sphinx_sqlite_incremental.php:

You can test these with php -e script.php and they should spit out XML.

We will use these to feed the xmlpipe2 sources in sphinx.conf. I chose to define the fields and attributes in the config file, though you can also do this in the XML itself. Here is what your source containers look like. These replace the MySQL ones of the same names.

sphinx.conf:

You will likely need to specify the full path to the PHP command-line executable in the xmlpipe_command directives if it isn't in cron's execution path (i.e., xmlpipe_command = /usr/bin/php -e [...]).
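The actual source containers were lost when this page was archived; purely as an untested sketch of the general shape of an xmlpipe2 source (the path and script name come from the post above; the field names are hypothetical):

```
source wiki_main
{
	type            = xmlpipe2
	xmlpipe_command = /usr/bin/php -e /path/to/wiki/data/sphinx_sqlite_main.php

	# declared here instead of in the XML stream itself
	xmlpipe_field     = page_title
	xmlpipe_field     = old_text
	xmlpipe_attr_uint = page_namespace
}
```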

That's it. All else is as it appears on the Extension Page.

'''NB: This is functional on a dev box with a ten-page wiki. It has not been production tested. That is your job.'''

-Jef (jef at lfaccess dot net - checked infrequently)

Strange output
I'm running SphinxSearch on a standard LAMP setup (CentOS 5.3, Apache 2.2.14, MySQL 5.0.77, PHP 5.2.11) with MediaWiki 1.15.1. Searches keep returning well-formatted and readable page titles, but garbage for the excerpts. Has anyone else seen this, or have an idea of where to look?


 * Do you have $wgCompressRevisions enabled? It looks like by default this extension sets Sphinx up to read text.old_text directly out of the database, which will not give useful results if you have any of our fancier storage features enabled. (Compressed revisions, batch compression, external storage, 'cur' table back-compat entries, legacy encoding back-compat entries, etc.) In this case you'd need to feed updates into Sphinx over an xmlpipe source or something... --brion 08:55, 6 November 2009 (UTC)


 * Excellent catch. I do have $wgCompressRevisions enabled. As a result, I had tried altering the sql_query statements in sphinx.conf to account for that, to no avail. What puzzles me is that the page titles come out just fine, but the full-text excerpts don't. How would I go about feeding updates into Sphinx over an xmlpipe, since it's all in a MySQL DB?


 * Unfortunately, it is not that kind of compression that Brion is talking about. There are some comments above about using xmlpipe, but I did not play with those myself. There are also other ways you could try working around this. I would consider using the Manual:CompressOld.php script periodically, instead of enabling $wgCompressRevisions. That way current revision text will not be compressed and sphinx will work just fine, but you will still have old revisions compressed. Svemir Brkic 18:43, 5 December 2009 (UTC)

Issues with SphinxSearch on Ubuntu 9.10
I get a blank page when I add the extension to my MediaWiki installation.

Recently I moved from a Windows XP server to an Ubuntu (desktop) PC where I have Apache, PHP, and MySQL installed. I can get Sphinx search to work in the terminal, but when I have it listening from within the terminal and then change LocalSettings.php to let me test it, the special pages load blank, with no source code whatsoever. I am kind of lost on how to fix this.

Any thoughts? Jacebenson 03:06, 15 December 2009 (UTC)


 * It is not clear how you have it set up. Did you follow the instructions on the main extension page? What versions of MediaWiki, PHP, and Sphinx do you run? What do you have in your LocalSettings? Svemir Brkic 03:13, 15 December 2009 (UTC)


 * Sphinx = 0.9.9
 * Php = 5.2.10-2ubuntu6.3
 * Apache 2.2
 * We followed the instructions and installed extension version SphinxSearch-0.6.1.
 * Currently it is installed; we were able to compile the code, create the index, and search from the command line. But once we added 'require_once ("$IP/extensions/SphinxSearch/SphinxSearch.php");' to LocalSettings.php, the special pages load blank. We enabled it as our main search and it seems to work, but the special pages still return a blank page.
 * I added My LocalSettings.php if you want to see the the full text. Jacebenson 14:31, 15 December 2009 (UTC)
 * Your local settings seem fine, would you please also post your SphinxSearch.php? From next release, there will be no need to change SphinxSearch.php, and all customization will be done in LocalSettings, but for now I still need to see SphinxSearch.php. Also, you may want to see what your php error log says - it usually tells you why there was a blank page. Svemir Brkic 14:54, 15 December 2009 (UTC)
 * Added the SphinxSearch.php
 * We tried to enable the php error log, but couldn't find it. Thanks.
 * Do you have ExtensionFunctions.php in the same folder as SphinxSearch.php? Also, to find where your error log is, you may try to run this command:
 * If that does not help, create a php file in your web folder, put phpinfo call there and check the output (sometimes a specific web site will have different error_log setting than what command-line php has.) Svemir Brkic 16:24, 15 December 2009 (UTC)
 * I couldn't get that to go. It was something weird. Reinstalled Ubuntu, Apache, MySQL, PHP, and Sphinx. Now I am able to get it to go. Thanks for your support. Jacebenson 22:44, 16 December 2009 (UTC)

Search Results within Interlinks counted as double
A rather philosophical question, but if one uses interlinks (normal interlink, semantic interlink) such as Cateora, Philip R., then a search for the term Cateora is counted twice on the search result screen.

The system configuration: Sphinx 0.9.9, SphinxSearch extension 0.6.1, LightTPD 1.4.22, MediaWiki 1.15.1, PHP 5.2.9-1 (cgi-fcgi), MySQL 5.0.77-community-nt.


 * Default setup indexes the wiki source. You could use xml pipe or some other approach to index parsed articles. That would solve this and some other issues. I may look into that in the future. Svemir Brkic 14:52, 5 January 2010 (UTC)

Installing a different language (morphology) for sphinx
I'm trying to install a french stemmer: morphology=libstemmer_french as suggested here: http://www.sphinxsearch.com/forum/view.html?id=11#9507

Of course I downloaded libstemmer.c.tgz and extracted it to libstemmer.c.

I added --with-libstemmer to ./configure. All seems to have gone well (got lots of verbose when make entered this directory):

Making all in libstemmer_c make[1]: Entering directory `/home/inmdev/Downloads/sphinx-0.9.9/libstemmer_c' gcc -DHAVE_CONFIG_H -I. -I../config -I/usr/local/include -I/usr/include/mysql -Wall -g -D_FILE_OFFSET_BITS=64 -O3 -DNDEBUG -MT stem_ISO_8859_1_danish.o -MD -MP -MF .deps/stem_ISO_8859_1_danish.Tpo -c -o stem_ISO_8859_1_danish.o `test -f 'src_c/stem_ISO_8859_1_danish.c' || echo './'`src_c/stem_ISO_8859_1_danish.c mv -f .deps/stem_ISO_8859_1_danish.Tpo .deps/stem_ISO_8859_1_danish.Po gcc -DHAVE_CONFIG_H -I. -I../config -I/usr/local/include -I/usr/include/mysql -Wall -g -D_FILE_OFFSET_BITS=64 -O3 -DNDEBUG -MT stem_UTF_8_danish.o -MD -MP -MF .deps/stem_UTF_8_danish.Tpo -c -o stem_UTF_8_danish.o `test -f 'src_c/stem_UTF_8_danish.c' || echo './'`src_c/stem_UTF_8_danish.c mv -f .deps/stem_UTF_8_danish.Tpo .deps/stem_UTF_8_danish.Po [...]

After all this I try ./indexer --all --config ../../SphinxSearch-0.6.1/sphinx.conf and I get the following warning: WARNING: index 'wiki_main': invalid morphology option 'libstemmer_french' - IGNORED

According to libstemmer_c/libstemmer/modules.txt, the french module can be referred to by french, fr, fre, or fra: french UTF_8,ISO_8859_1 french,fr,fre,fra

Anyways, don't know where to go from here. Fabricebaro 19:01, 6 January 2010 (UTC)

PS: I posted this matter here too: http://www.sphinxsearch.com/forum/view.html?id=19#22615


 * What did you put in your sphinx.conf instead of "morphology = stem_en"? I would try "morphology = stem_fr" --Svemir Brkic 04:46, 7 January 2010 (UTC)


 * Sorry for the omission (I removed this info by accident). I tried:
 * * morphology = libstemmer_french
 * * morphology = libstemmer_fr
 * * morphology = stem_fr
 * * morphology = french
 * It should be libstemmer_french according to http://www.sphinxsearch.com/forum/view.html?id=11

Activating "Did you mean" with a French wiki
I installed SphinxSearch on a French wiki. When I activate "Did you mean" I get an error displayed at the top of the search result page. Also, I do have some English in the wiki. Is it possible to use both languages for spelling suggestions? Fabricebaro 18:42, 13 January 2010 (UTC)


 * You need to find out how to install a French dictionary for pspell. If you actually have it installed, but the language code is not "fr" for some reason, you can edit that line in SphinxSearch_spell.php and change it to whatever it should be - even if you have to hard-code the string. Spelling suggestions in both languages would require some additional work in the same file - if you can find someone who knows some PHP... Svemir Brkic 01:07, 17 February 2010 (UTC)

Automatically add a wildcard to every search term?
The SphinxSearch-Extension is running nicely here on openSUSE with MW 1.15.1. I would love to be able to do the following:

Define a number x as a variable. If a search for "string" returns <= x results, the search is automatically changed to "string*" and those results are shown.

Alternatively: being able to switch on always appending a * to every search term, so that entering "string" searches for "string*".

Is it feasible?


 * One way to do this yourself is to find the place in SphinxSearch_body.php where the search term is passed to the query and append * to it (of course, it would be better to first check if * is already in there, or if there are multiple words, or to use a preg_replace to do it conditionally, but I am kind of pressed for time right now...) Svemir Brkic 01:14, 17 February 2010 (UTC)
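The conditional rewrite hinted at above could be sketched like this in plain PHP (standalone, not tied to the extension's internals; the function name is made up for illustration):

```php
<?php
// Append * to every bare word of a query so "string" becomes "string*",
// but leave words that already end in a wildcard untouched.
function addWildcard( $term ) {
	// Only words followed by whitespace or end-of-string get a *,
	// so "foo*" (next char is *) is left as-is.
	return preg_replace( '/(\w+)(?=\s|$)/', '$1*', $term );
}
```

For example, addWildcard( "foo* bar" ) leaves "foo*" alone and turns "bar" into "bar*".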

Cannot Find Wiki Main
I installed SphinxSearch but when I do:

$> /path/to/sphinx/search --config /path/to/sphinx.conf "search string"

I get an error saying:

index 'wiki_main': search error: failed to open /var/data/sphinx/wiki_main.sph: No such file or directory

Did I miss something?


 * Perhaps some of the installation steps had errors you did not notice. You need to check whether the wiki_main index actually exists. Maybe something is wrong in the sphinx.conf file. Hard to say without more information. Svemir Brkic 01:16, 17 February 2010 (UTC)


 * I am getting a very similar error to the one above. I am using exactly what other successful users have used in the sphinx.conf file, but I believe I must be missing whatever this wiki_main.sph is. I know I don't have a file by this name (maybe I am too new to know I should), but figured it must come from somewhere. So either my setting of /var/data/sphinx/wiki_main is wrong or it isn't, and I'm at a loss for how to fix it. Can anyone help?


 * The error is misleading; all you need to do is create the mentioned directory, and the indexer will take care of the rest.
 * Jahângir

PHP Warning: AutoLoader::require
I have SphinxSearch running on a WAMP setup with MediaWiki 1.15.0. The search daemon is running fine, the indexer is running fine, and we can execute searches from the command line. When we try to run it from the wiki, however, we get the following error:

PHP Warning: AutoLoader::require(\\leia\wiki/\\leia\wiki\extensions\SphinxSearch/SphinxSearch_body.php) [function.AutoLoader-require]: failed to open stream: No such file or directory in \\leia\wiki\includes\AutoLoader.php on line 582

PHP Fatal error: AutoLoader::require [function.require]: Failed opening required '\\leia\wiki/\\leia\wiki\extensions\SphinxSearch/SphinxSearch_body.php' (include_path='\\leia\wiki;\\leia\wiki/includes;\\leia\wiki/languages;.;C:\php5\pear') in \\leia\wiki\includes\AutoLoader.php on line 582

Leia is the name of our server, and wiki is the shared path to the MediaWiki install in the Apache/htdocs folder. The issue seems to be that the path is getting doubled up (\\leia\wiki/\\leia\wiki\), but I am not sure where that is coming from. If we remove the $IP from the path to the extension in LocalSettings.php, the error message changes to this:

PHP Warning: require_once(/extensions/SphinxSearch/SphinxSearch.php) [function.require-once]: failed to open stream: No such file or directory in \\leia\wiki\LocalSettings.php on line 30

PHP Fatal error: require_once [function.require]: Failed opening required '/extensions/SphinxSearch/SphinxSearch.php' (include_path='\\leia\wiki;\\leia\wiki/includes;\\leia\wiki/languages;.;C:\php5\pear') in \\leia\wiki\LocalSettings.php on line 30

Can anyone offer any help on this?


 * Do not remove $IP. This is probably caused by ExtensionFunctions. We are getting rid of that for the next release. In the meantime, you will need to play with this line in SphinxSearch.php:

extAddSpecialPage( dirname(__FILE__) . '/SphinxSearch_body.php', ...


 * Try changing it to:

extAddSpecialPage( 'extensions/SphinxSearch/SphinxSearch_body.php', ...


 * If that does not work, check the error message again and try to adjust accordingly. If nothing else helps, new release should be ready soon. Svemir Brkic 01:30, 17 February 2010 (UTC)

Version 0.7 released
A new release of this extension has been uploaded to SourceForge. The most important change is that you are no longer supposed to edit SphinxSearch.php. All your modifications (host, port, mode, etc.) go into LocalSettings.php AFTER you include SphinxSearch.php. The only exception is this line, which has to appear before SphinxSearch.php is included (if you want Sphinx to completely replace the default MW search):

$wgSearchType = 'SphinxSearch';

Also, if you use the "personal dictionary" option, it needs to be set before the inclusion, as explained in SphinxSearch.php. The default port has been changed to the new official Sphinx port (9312). All the fixes discussed on the talk page have been included as well. Both pages will be updated soon; I'm just making sure the breaking changes are listed immediately.
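Putting these notes together, the intended LocalSettings.php ordering looks roughly like this (a sketch; adjust the path and values for your install):

```php
// Must come BEFORE the include if Sphinx is to replace the default MW search:
$wgSearchType = 'SphinxSearch';

require_once( "$IP/extensions/SphinxSearch/SphinxSearch.php" );

// All other customization goes AFTER the include (version 0.7+):
$wgSphinxSearch_host = 'localhost';
$wgSphinxSearch_port = 9312; // new default, matching the official Sphinx port
```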

Is there a publicly available wiki with SphinxSearch installed?
I am looking into both SphinxSearch and LuceneSearch. I would like to try out both before installing one or the other. Is there a publicly available wiki with SphinxSearch installed on which I can try out some queries? Also, I need the following capabilities and wonder if SphinxSearch supports them: 1) the ability to search pages for specific tags (e.g., ); 2) the ability to index a site with on the order of 150,000 articles; and 3) exporting a list of pages (identified either by page name or some internal identifier like page_id) to an external file. Dnessett 18:07, 25 February 2010 (UTC)


 * NWE uses Sphinx. It has 107,378 total pages in the database, but fewer than 20,000 are actual articles. I see no reason it could not handle 150,000 or more, given enough disk space and memory; the Sphinx engine itself is used on some very big sites. You can search for tags, but special characters are ignored by default, so it will find the alphanumeric part only. Dealing properly with tags would require some customization, but it should not be too hard if you know PHP. As for exporting, the extension now supports the MediaWiki search API, so you could provide a link that opens the API request in XML or text format, for example (or point it to a script that formats it further.) Svemir Brkic 18:30, 25 February 2010 (UTC)
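For the export question, a request via the standard MediaWiki search API (which the extension hooks into, per the reply above) would look like this; the host is a placeholder:

```
http://example.com/w/api.php?action=query&list=search&srsearch=sphinx&format=xml
```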


 * Thanks. Dnessett 18:37, 25 February 2010 (UTC)


 * A follow-up question. I have looked through the sphinx and sphinx extension documentation, but could not find the answer to the following question. Suppose I want to index the same db twice, once using the standard character set and a second time using an enhanced character set including characters that are used in wikitext markup (e.g., "{", "}", "<", ">"). Is this possible? Would I need to run two instances of the sphinx daemon, each with a different configuration file and different directory targets to separate their activity? Or is there some way to utilize the same daemon to run two indexes? Dnessett 20:59, 25 February 2010 (UTC)

Create the page "I want to create a new page" on this wiki!
Can anyone tell me if it's possible to have SphinxSearch provide a link to create a new page if it doesn't find an existing page in the search results? I liked this feature in the default search engine. Thanks! --WilkBoy 14:43, 31 March 2010 (UTC)


 * If you press enter or click Go, a red link "create this page" will appear. If you click search, it will not. Svemir Brkic 18:55, 31 March 2010 (UTC)

Moving to MediaWiki repository
SphinxSearch is moving from SourceForge to MediaWiki SVN repository. Links in the infobox already point to the latest code. Some bugs have been fixed and a dozen or so translations added to MW version already. Changelog will be updated once all the cleanup and tweaks are done. Svemir Brkic 22:48, 10 April 2010 (UTC)

in MW 1.16
Upgraded my wiki to 1.16 (in DEV), and now when I don't get an exact match, I see an unresolved message at the top of the search results instead of "There is no page titled "Foo Bar". You can create this page."

Any ideas? Thanks! --John Thomson 21:17, 14 April 2010 (UTC)


 * Thanks for the report. That message was removed from MW in 1.16. The fix for this will be implemented in the next release of the extension. Svemir Brkic 00:23, 15 April 2010 (UTC)

Temp Fix Between Releases
In the meantime I fixed this in my installation by modifying SphinxSearch_body.php:

$wgOut->addWikiText( wfMsg( 'noexactmatch', wfEscapeWikiText( $term ) ) );

needs to be changed to

$wgOut->addWikiText( wfMsg( 'searchmenu-new', wfEscapeWikiText( $term ) ) );

Gomeztogo 00:00, 22 June 2010 (UTC)

Semantic Wiki and Sphinx
This topic is surely not on your urgent list, but semantic abilities for a wiki are becoming more and more important, and we use them extensively throughout our wiki to build ontologies and to give pages characteristics beyond the standard ones. Do you have any plans to think about integrating select statements for the Semantic MediaWiki extension? We assume that pages with specific properties and keywords should rank higher in the search results than standard pages without that classification. It could be a nice feature compared with other MW search engines.

Thanks, James


 * I am in the process of re-engineering some parts of the extension. I am making it use more of the standard MW search code (which improved significantly since this extension was first developed,) but I am also looking for more ways to make it better than the default. I will look into Semantic Wiki and how to integrate with it when available. Svemir Brkic 20:32, 17 April 2010 (UTC)

Compatible with $wgEnableMWSuggest?
Is this extension compatible with the $wgEnableMWSuggest = true; setting? I can't get that to work. If not, is it a planned enhancement? Thanks! Gomeztogo 23:06, 30 April 2010 (UTC)


 * It is not compatible right now, but it will be soon - probably by the end of May. Svemir Brkic 01:35, 1 May 2010 (UTC)

Search results and vector skin
Still not as important as the functional interface redesign, but do you plan to adapt the display of search results to incorporate the new layout from the Vector skin?

We are using 1.16 beta on our test server, and the Sphinx search display is a bit out of sync with the Vector skin (option boxes at the bottom, an un-collapsible option box, navigation to other pages at the bottom, the search bar at the bottom, and options such as content pages and image pages).

Sphinx works well with our research wiki, so not only the results themselves matter but also the display of search results, to ease navigation. It would be great if you could have a look at it. Thanks. --MWJames 15:34, 3 May 2010 (UTC)


 * Next release of Sphinx will use much more of the existing MW classes, including display and pagination. Instead of implementing it as a special page and rewriting everything, most of it is now implemented as an extension of the existing search classes. The new search interface, image thumbnails are there by default now. I am just missing a few pieces before being able to release a candidate. The biggest missing piece is the category filter. I will try to commit a testable version to svn by the end of the week, with or without categories. Svemir Brkic 17:06, 3 May 2010 (UTC)
 * Do you have any news on this request ? --Toma007 14:09, 6 September 2010 (UTC)

Could not instantiate Sphinx client
I finally got Sphinx and the extension working (Ubuntu 9.10, MediaWiki 1.15.3, SphinxSearch-trunk-r66651). But whenever I browse to the "Special Page: Search wiki using Sphinx" I get: "Could not instantiate Sphinx client." The thing is: Sphinx IS working. So why does this message appear?? ;-) And by the way: I'm really looking forward to the new version Svemir is talking about. Keep up the great work! --SmartK 10:48, 18 May 2010 (UTC)


 * +1 here. I am looking to see if I can hack in a fix.  I'll post here if I find something. Gomeztogo 00:01, 22 June 2010 (UTC)


 * It appears that there is no code that handles a null $term passed to the special page. So when you load the page it runs all of the search code with a null value and returns the instantiation error. For now I'm not going to put in the exception handling, as a new major version of the extension is expected soon and this is really just a cosmetic issue.


 * If you want to get rid of this in a slightly hacky way, look in SphinxSearch_body.php near line 228 and modify the code to match:

    }
} elseif ( $cl == false and trim( $term ) == "" ) {
} else {
    $wgOut->addWikiText( wfMsg( 'sphinxClientFailed' ) . "\n" );
}

 * 1) No error on a blank search term

Remote host doesn't work
You have a line in SphinxSearch.php, $wgSphinxSearch_host = 'localhost';, with no check whether it overrides a user-configured variable; i.e., it needs if ( !isset( $wgSphinxSearch_host ) ) around it. I had to remove this line to get it to work. DimeCadmium 22:04, 19 June 2010 (UTC)


 * Please use the latest version of the extension and set these variables after you include SphinxSearch.php Svemir Brkic 17:18, 13 November 2010 (UTC)

Problems with message handling in MediaWiki:Noexactmatch and MediaWiki:Searchmenu-new
Do you plan to use MediaWiki:Searchmenu-new to show search messages once a search has failed? We actually need this, because we create new pages not just from a red link but through so-called Semantic Forms, which allow higher data integrity and a predefined set of categorized data. MediaWiki:Searchmenu-new would give us the possibility to customize the message and point to those predefined forms. --MWJames 00:26, 26 June 2010 (UTC)

After further testing, here is the result: it seems the message is still coming from MediaWiki:Noexactmatch, but the problem is that when we render a link through a form statement, the standard search interface allows a URL to be rendered into the message.

But when we switch back to SphinxSearch we don't get a rendered URL, only a message such as Chapter Form (in the standard search there would be a link instead of this message). Even though we are using something like the standard interface, the SphinxSearch interface interprets it differently, so we cannot create a dynamic URL. There seems to be a problem with handling some HTML input, as indicated by <a href="/a/index.php... --MWJames 22:14, 26 June 2010 (UTC)

sphinxsearch in Debian and Ubuntu -- api missing
Debian and Ubuntu now package sphinxsearch; 0.9.9 ships with Sid and Maverick. This of course makes things simpler: no need to download the source and build, and the daemon is already configured in SysV init.

We still need to copy the sphinx.conf file from this project -- it should be placed in /etc/mediawiki -- along with the rest of the extension. I had to comment out the ports (and add prefixes); then it worked fine.

Upgrades of sphinxsearch within Debian/Ubuntu should work without changing the configuration.

The only problem is the API file, sphinxapi.php. It is not in the binary package, and if you download the latest from upstream, you get the dreaded version-mismatch error. I haven't tested this in detail, but found that I could use a sphinxapi.php from a sphinxsearch installation from last fall with the 0.9.9 version in Ubuntu Maverick. I don't really see an elegant solution; maybe we should ask Debian to ship it in the binary deb.

Liontooth 20:14, 5 July 2010 (UTC)

Category filter
I just installed Sphinx search on MW 1.17a and it works fine, but there is no category filter. Is that an option one can enable somewhere? Is it currently unimplemented? Or am I just experiencing a bug?

Feature request / Using the search function in templates
Maybe a bit far-fetched, but could you provide a search function such as {#ssearch } that can be called from a template and returns the results as a ul or ol list, with the number of results limited by a parameter passed to the function? --MWJames 05:33, 24 July 2010 (UTC)

modify "SphinxSearch_body.php" to work with user rights restriction extension
There are many user-rights restriction extensions that use the "userCan" hook to make certain pages inaccessible, such as Extension:Restrict_access_by_category_and_group. How can I modify SphinxSearch_body.php to add code like "if ( !$t->userCanRead() ) {}" so that restricted pages are not listed on the search results page? 121.8.153.6 08:02, 26 July 2010 (UTC)

solved, but not perfectly
No matter which user-rights extension you are using, most of them rely on the "userCan" hook. For example, on the page of Extension:Restrict access by category and group, expand the hooks list and you can see that this extension uses the "userCan" hook. If an extension uses the "userCan" hook to enforce its restrictions, we can make SphinxSearch hide information the user is not permitted to see. I modified the code in SphinxSearch_body.php: find "function wfSphinxDisplayResults( $term, $res, $cl ) {",

 + if ( !$title_obj->userCanRead() ) {
 +     $wgOut->addWikiText( "" );
 + } else {
       $wgOut->addWikiText( "* $wiki_title " );
       # uncomment this line to see the weights etc. as HTML comments in the source of the page
       # $wgOut->addHTML("");
       ...
       strip_tags( $entry, ' ' ) );
       $wgOut->addHTML( " $entry \n" );
 +     }
     }
    }
   }
  }

These two modifications keep SphinxSearch from showing information the user is not permitted to see, but the results page still displays the original result count and total page number. With more work, we could show how many docs were found and in what time, how many docs were hidden due to permissions, and the exact number of pages. 121.8.153.6 02:07, 27 July 2010 (UTC)
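A slightly cleaner variant of the patch above (just a sketch, not shipped code: $title_obj, $wiki_title and the surrounding loop are assumed to exist in wfSphinxDisplayResults(), and userCanRead() is the Title method of that MW era) would skip restricted titles entirely instead of emitting empty output for them:

```php
// inside the result loop of wfSphinxDisplayResults():
// skip any title the current user may not read, so restricted
// pages never reach the rendered result list at all
if ( !$title_obj->userCanRead() ) {
    continue; // optionally count these to correct the totals shown
}
$wgOut->addWikiText( "* $wiki_title " );
```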

Where defines these hooks?
In SphinxSearch_body.php I found this code:

 wfRunHooks( 'SphinxSearchFilterSearchableNamespaces', array( &$namespaces ) );
 wfRunHooks( 'SphinxSearchGetSearchableCategories', array( &$categories ) );
 wfRunHooks( 'SphinxSearchGetNearMatch', array( &$term, &$t ) );
 wfRunHooks( 'SphinxSearchBeforeResults', array(
 wfRunHooks( 'SphinxSearchBeforeQuery', array( &$this->search_term, &$cl ) );
 wfRunHooks( 'SphinxSearchAfterResults', array( $term, $this->page ) );

I looked through every file of SphinxSearch but still could not find functions such as "SphinxSearchFilterSearchableNamespaces" or "SphinxSearchBeforeQuery". I want to know how they work, but I cannot find the function definitions.

Can anyone help me? Many thanks! 121.8.153.6 09:39, 30 July 2010 (UTC)


 * These are all optional hooks, so they are not defined by default. Some of them will be deprecated soon, as standard MW search-related hooks will be available in this extension as well. Once that is cleaned up, I will document any sphinx-specific hooks that remain. For now, just see where they are called and what arguments they receive. That should give you an idea about what you may be able to do with them. Svemir Brkic 15:38, 19 September 2010 (UTC)
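To experiment with one of these optional hooks, it can be registered from LocalSettings.php in the usual MediaWiki way. The handler name and the namespace filtering below are hypothetical examples, not part of the extension:

```php
// LocalSettings.php - hypothetical handler for an optional SphinxSearch hook
$wgHooks['SphinxSearchFilterSearchableNamespaces'][] = 'mySphinxNamespaceFilter';

function mySphinxNamespaceFilter( &$namespaces ) {
    // example: limit Sphinx searches to the main namespace (NS_MAIN = 0)
    $namespaces = array( 0 );
    return true; // returning true lets other handlers run
}
```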

Can't get negation to work, always finds all words
I'm struggling to get the negation search option to work, i.e. searching for "word1 -word2" still shows hits for both words, instead of only pages that contain word1 without word2. The search CLI tool works fine, though.

I have already tried the different search modes ($wgSphinxSearch_mode: SPH_MATCH_EXTENDED, SPH_MATCH_EXTENDED2, SPH_MATCH_BOOLEAN). SphinxSearch extension 0.7.0 and Sphinx 1.10-beta (r2420) (but same behavior with 0.9.9-release (r2117)), MediaWiki 1.12.0

Any help would be greatly appreciated! --Mmaddin 08:18, 5 August 2010 (UTC)


 * There is definitely something broken now. I also can't use quotes for a phrase match ("word1 word2") - it always finds an OR match, or, if I choose "match all words" via the radio button, an AND match. I need to make the AND match the default, but there is no option for that, except setting $wgSphinxSearch_mode to SPH_MATCH_ALL - but then the radio button disappears (for the rare cases where an OR match is useful). Not much is missing for this extension to be a really great alternative to the built-in search - but right now there are a number of small but annoying (maybe cosmetic, but still important) bugs that really need to be fixed. --Patrick Nagel 02:03, 31 August 2010 (UTC)
 * Shortly after posting this, I found $wgSphinxMatchAll in the code (can't find documentation anywhere), which you can set to true in LocalSettings.php. Then the radio buttons are there, but "match all words" is selected by default. --Patrick Nagel 02:06, 31 August 2010 (UTC)

Access denied
I'm getting this error message when trying to search in MW: Query failed: connection to 127.0.0.1:9312 failed (errno=13, msg=Permission denied). The daemon is running and responds to searches from the terminal.

System:
 MediaWiki 1.16.0
 PHP 5.3.3 (apache2handler)
 MySQL 5.1.49
 SphinxSearch extension 0.7.0
 Sphinx engine 0.9.9 (r2117; Dec 02, 2009)
 CentOS 5

From the startup log:
 [Thu Aug  5 14:47:37.533 2010] [ 8444] using config file '/etc/sphinx/sphinx.conf'...
 [Thu Aug  5 14:47:37.533 2010] [ 8444] listening on 127.0.0.1:9312

Solved: turn off SELinux. I'm leaving this post here in case anyone else runs into this problem.

Project still alive?
I hope this project is still active. Many of the threads above mention a new build expected in May of 2010, but the only build still available is from February.

This extension is great, but some missing features are holding me back from deployment. -  Gomeztogo 01:06, 18 August 2010 (UTC)


 * Apologies to everybody who has been waiting for the new release. I am about to start working on the extension again. --Svemir Brkic 02:06, 11 September 2010 (UTC)

successfully controls the search results according to user rights
I have an extension named rarc, which builds a category with the same name as a user group, so that only users who belong to the xxx user group can see the page. At first SphinxSearch did not respect these privilege rules and still displayed restricted pages in the search results. On the recommendation of the SphinxSearch author, I built 5 indexes to implement this:

 wiki_main -- includes everything except today's new articles
 wiki_incremental -- indexed automatically every 2 minutes, just today's new articles, merged into wiki_main at night
 wiki_small_main -- like wiki_main, but excluding the private articles
 wiki_small_incremental -- like wiki_incremental, but excluding the private articles
 wiki_private -- all private articles of the wiki site (due to the small number of articles, no need for the incremental+merge mechanism)

I modified the code: if a user belongs to the xxx user group, the search list is (wiki_small_main wiki_small_incremental wiki_private); for a normal user it is (wiki_small_main wiki_small_incremental).

wiki_main is used to display the brief content of the article according to the page_id.

Can anyone tell me where these hooks are defined?

 wfRunHooks( 'SphinxSearchFilterSearchableNamespaces', array( &$namespaces ) );
 wfRunHooks( 'SphinxSearchGetSearchableCategories', array( &$categories ) );
 wfRunHooks( 'SphinxSearchGetNearMatch', array( &$term, &$t ) );
 wfRunHooks( 'SphinxSearchBeforeResults', array(
 wfRunHooks( 'SphinxSearchBeforeQuery', array( &$this->search_term, &$cl ) );
 wfRunHooks( 'SphinxSearchAfterResults', array( $term, $this->page ) );

I used PowerGREP to search the /extension/sphinxsearch and /sphinx directories, but neither of them contains a definition of "SphinxSearchFilterSearchableNamespaces" or "SphinxSearchGetSearchableCategories". 121.8.153.6 08:23, 26 August 2010 (UTC)
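The group-based index selection described in this post could be sketched roughly like this (an illustration only: it assumes the MW 1.15-era $wgUser->getGroups() API and a SphinxClient instance $cl; where exactly the index list is assembled depends on your SphinxSearch version):

```php
// pick the Sphinx index list according to the user's groups
global $wgUser;
if ( in_array( 'xxx', $wgUser->getGroups() ) ) {
    // privileged users also search the private index
    $indexes = 'wiki_small_main wiki_small_incremental wiki_private';
} else {
    $indexes = 'wiki_small_main wiki_small_incremental';
}
$res = $cl->Query( $term, $indexes );
```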

undefined function wfLoadExtensionMessages with install on MediaWiki 1.10.1
I am attempting to install the SphinxSearch extension 0.7 on MediaWiki 1.10.1, but I get the error: undefined function wfLoadExtensionMessages.

How can I resolve this without upgrading mediawiki?


 * For older versions of MW you need to use older versions of the extension. They are available at SourceForge --Svemir Brkic 02:03, 11 September 2010 (UTC)


 * One thing you could try, if you downloaded 0.7 from SourceForge, is to download the latest build instead. Svemir Brkic 00:23, 13 September 2010 (UTC)

runaway strikethrough in search results
Sometimes in the middle of my search results, in the middle of an excerpt, the text changes to strikethrough, and this continues to the end of the page. It was caused by embedded hyphens in the excerpt. I fixed it by stripping hyphens out of the excerpt before it gets formatted, in SphinxSearch_body.php line 424, with this change (adding "\-" to the character class):

 $entry = preg_replace( '/([\[\]\{\}\*\#\|\!\-]+|==+)/', ' ', strip_tags( $entry, '  ' ) );

I don't understand exactly what happened here, but most likely this is why. My page names are not standard mediawiki usage, but rather lower case words separated by hyphens, like .../wiki/my-page and .../wiki/another-page. I'm also using the semantic mediawiki extensions so there is additional metadata embedded in the wikitext.

The generated HTML where the strikethrough began looked something like this - the <span> tag gets split somehow and we unfortunately end up with an <s> tag (after ...issues-in-)

modernities-issues-in- pan style='color:red'&gt;culture-and 10   The

which if I put here as wikitext yields

modernities-issues-in- pan style='color:red'&gt;culture-and 10   The

and oh no!! that continues until I close it with an </s> tag.

Thanks for a great extension! The results are SO much better than the default search! Initialcapsarestupid 18:18, 13 September 2010 (UTC)

Version 0.7.1 Released
This minor update contains various fixes and tweaks made in recent months. It has been uploaded to the SourceForge repository - to avoid complaints about bugs that have already been fixed in the MW repository. This is not the big update I was talking about, but I am working on that too and it is looking good... --Svemir Brkic 02:12, 15 September 2010 (UTC)

Curious about Searching Syntax
Do boolean operators, ordering, and phrase proximity work with this extension? We've installed the extension, but they do not appear to work.--Alanna.macnevin 14:30, 20 September 2010 (UTC)


 * Please download the latest snapshot (trunk, version 0.7.2) from MW (not from SourceForge) and set $wgSearchType to 'SphinxMWSearch' in your LocalSettings file. Also, make sure to update your sphinx.conf (page_is_redirect is being indexed now) and reindex the wiki. This search type uses the new approach that lets MW handle the interface. It still misses some features from the old version, but they will be added soon, and maybe you do not care about those anyway. Svemir Brkic 19:11, 20 September 2010 (UTC)

Did you mean? pigtailed pigtailed pigtailed Aggregate ....No.
For some reason, when I try to search part of a title, the search will not locate the article. Instead it performs a full-text search omitting article titles and asks if I meant "pigtailed Keyword Searched". The same thing happens no matter what parameters I set (i.e. match titles only etc.).--Siadsuit 18:37, 28 September 2010 (UTC)

possible to install without compiler?
Hi, in my hosting plan the compiler is disabled. I'm wondering if I can install Sphinx without using a compiler. Is it possible for me to compile it locally, and just upload and run it? --Wmama 05:06, 29 September 2010 (UTC)

Won't return some obvious results
Hello. I've got Sphinx 0.9.9 working in MW 1.16.0 with the latest extension (I think). I just noticed today that some obvious should-be hits aren't making the results (e.g. a search for 'floor' gets many instances but misses it in a title and an internal link that I know are there!). The closest similar errors I have seen look like one where link hits were not returned and another where sphinx.conf was interpreting underscores wrong. Any other ideas as to what might cause this? I rebuilt the index twice! - Lucas Billett 21:27, 6 October 2010 (UTC)


 * Ok. Weird. I tried setting wgSearchType = 'SphinxMWSearch' as mentioned above and the problem went away. Not sure why that would happen? I like the idea of the search output being packaged in MW. Nice work. However I get an 'undefined index:matches' error on line 189 of SphinxMWSearch.php when it doesn't find an exact match anywhere. I tried surrounding the culprit statement with an if(isset) and it went away... Not sure what I might have broken though. - Lucas Billett 12:33, 7 October 2010 (UTC)


 * Thanks for isset catch Lucas - it has been committed in revision 76659, along with some other tweaks - will post about that in a minute. Svemir Brkic 15:24, 14 November 2010 (UTC)

"Table 'mywiki.documents' doesn't exist" issue
Hi,

I've installed Sphinx in Windows 2003, to index a Mediawiki page, under XAMPP server.

When I configure the Sphinx.conf.in file, and run the indexer.exe, I've this:

 C:\sphinx\bin>indexer.exe --config c:\sphinx\sphinx.conf.in --all
 Sphinx 0.9.9-release (r2117)
 Copyright (c) 2001-2009, Andrew Aksyonoff

 using config file 'c:\sphinx\sphinx.conf.in'...
 indexing index 'test1'...
 ERROR: index 'test1': sql_query: Table 'mywiki.documents' doesn't exist (DSN=mysql://root:***@127.0.0.1:3306/mywiki).
 total 0 docs, 0 bytes
 total 0.008 sec, 0 bytes/sec, 0.00 docs/sec
 indexing index 'test1stemmed'...
 ERROR: index 'test1stemmed': sql_query: Table 'mywiki.documents' doesn't exist (DSN=mysql://root:***@127.0.0.1:3306/mywiki).
 total 0 docs, 0 bytes
 total 0.001 sec, 0 bytes/sec, 0.00 docs/sec
 distributed index 'dist1' can not be directly indexed; skipping.
 total 0 reads, 0.000 sec, 0.0 kb/call avg, 0.0 msec/call avg
 total 0 writes, 0.000 sec, 0.0 kb/call avg, 0.0 msec/call avg

In phpMyAdmin I can see that the "documents" table doesn't exist.

What table do I have to configure?

Thanks in advance.


 * Isn't sphinx.conf.in a sample configuration file that comes with sphinx? You sure you updated and are using the sphinx.conf file that came with the extension? - Lucas Billett 11:53, 12 October 2010 (UTC)

Templates / Transclusion - returns only template not users
If I have a template called Cereals with the contents "Rice, Wheat, Rye", and then I use that template in an article, and then search for "Wheat" I only get the template, not the page using the template. I guess this makes sense since MW doesn't store the rendered content, it expands the templates on the fly when the page is rendered.

I was thinking to have sphinx index all the rendered pages instead, any ideas as to how? --72.148.136.13 23:01, 12 October 2010 (UTC)


 * ARGH! Yes, this is going to be the only way. I'm going to have to use xmlpipe and write a shell script (or php?) to get every page, wrap it inside of xmlpipe tags and dump it into the indexer. Argh, argh, argh. The good news is I get to spend the next few days at work hacking php :) Or switch to Lucene, but everyone says Lucene doesn't work on Windows, and my box is a WAMP.
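For anyone trying the xmlpipe route, Sphinx's xmlpipe2 source expects the script to emit a stream shaped roughly like this (the field names and sample values are assumptions; the script would fetch each page's expanded text, e.g. through MediaWiki's api.php with action=parse, before wrapping it):

```xml
<?xml version="1.0" encoding="utf-8"?>
<sphinx:docset>
  <sphinx:schema>
    <sphinx:field name="page_title"/>
    <sphinx:field name="page_text"/>
  </sphinx:schema>
  <sphinx:document id="1">
    <page_title>Breakfast</page_title>
    <page_text>Rice, Wheat, Rye (template Cereals expanded inline)</page_text>
  </sphinx:document>
</sphinx:docset>
```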

Query failed: no enabled local indexes to search
Hi,

Please help! Sphinx Search problem - Query failed: no enabled local indexes to search. I am able to run the indexer in the console and do a search in the console; I can see the results.

I have started the daemon and it is running. When I use Special:SphinxSearch, it says "Could not instantiate Sphinx client." And then when I search for a word, it says "Query failed: no enabled local indexes to search".

If anyone has an idea what mistake I am making, please let me know.

Some details :

Mediawiki 1.15.3

XAMPP 1.7.2

Windows Xp SP2

Sphinx 0.9.9 (win32)

SphinxSearch 0.7.1

Thanks in Advance.

--Ramesh

I have the same issue, any help out there? Using sphinx-0.9.9, Mediawiki 1.16 --193.16.163.244 14:14, 26 October 2010 (UTC)

Proximity search not working
I have set $wgSphinxSearch_mode = SPH_MATCH_EXTENDED2;

in LocalSettings.php, but proximity searches do not seem to be working:

Query failed: index wiki_incremental,wiki_main: syntax error, unexpected '~', expecting $end near '~2'

I tried putting a backslash before the ~, e.g.,

"search terms"\~2

instead, I just get results similar to an "or" query, which is not what I want. Any suggestions?

This has been an issue since SphinxSearch 0.7 - now using 0.7.1 with Sphinx 1.10-beta. (same issue with 0.7 and sphinx 0.99)

Update: SphinxSearch 0.6.1 works perfectly with sphinx 1.10-beta, MW 1.15.5 - all SPH_MATCH_EXTENDED2 syntax works.

--Fungiblename 03:50, 12 November 2010 (UTC)


 * Would you please try the latest build instead of 0.7.1? Set $wgSearchType to 'SphinxMWSearch' in your LocalSettings file. Also, make sure to update your sphinx.conf (page_is_redirect is being indexed now) and reindex the wiki. Svemir Brkic 15:48, 13 November 2010 (UTC)
 * User error: Thanks Svemir; I tried all of that, but unfortunately, this returns no results at all for any query and takes me to the default MediaWiki search page ("Search results" at top of screen, rather than "Search wiki using Sphinx" - although it looks like you're trying to do away with a dedicated special page in SphinxMWSearch.php).
 * I tried changing $wgSearchType back to SphinxSearch, but SPH_MATCH_EXTENDED2 still throws the same error. There is a small change from 0.7.1: If I escape the "~", the number at the end gets treated as an "AND" part of the query, and the terms between quotes get treated as an AND, so the query only returns pages containing "search" AND "terms" AND "2", which is not expected/desired behavior. 0.6.1 still works quite well, but I will keep testing new SphinxSearch versions as you release them. I hope that some of this helps you pinpoint the regression between 0.6.1 and 0.7+ for handling SPH_MATCH_EXTENDED2 syntax. Unfortunately, I really do not know my way around PHP that well, so I am sorry that I cannot be too helpful in debugging. I am sorry that I forgot to mention this, but I am running SMWHalo 1.5.1, if this is relevant at all. --Fungiblename 01:55, 14 November 2010 (UTC)
 * Proximity search works fine for me with the latest build and SphinxMWSearch. It is supposed to say "Search results" at top of screen in the new mode. Please post or send me all the lines from your LocalSettings file that deal with SphinxSearch.
 * As for the old version working, that is only if your quotes are matching. If your search term contains only one double quote, you will get an error. Fixing that for 0.7 broke proximity searching. In 0.7.2 (latest build) this is addressed in a better way, but only for SphinxMWSearch. 173.72.163.240 03:25, 14 November 2010 (UTC)
 * User error: I made a small mistake in my configuration and forgot to copy over the "sql_attr_uint = page_is_redirect" statement from the attribute columns portion of the sphinx.conf template into my conf file (I did pull in the new wiki_main and wiki_incremental queries). The proximity search now returns valid results, thanks! The results still highlight the proximity number as a search term, e.g. for a search

"search terms"~2
 * The results look like: search something terms some other text ... 2
 * Now, the number is treated like an OR, rather than an AND, so valid pages get returned, but the proximity parameter highlighting might be distracting for some users. One other note: If there is a wildcard inside a proximity search query, the wildcarded term will not get highlighted in the search results. Example:

"sear* term"~2
 * returns: text of some length search term some more text 2.
 * Using wildcards on both terms completely disables term highlighting in search excerpts (except for the proximity parameter), but the search results return valid pages. FYI, here are my LocalSettings parameters for Ubuntu 10.04 (I have some custom file locations and port settings):

 $wgSearchType = 'SphinxMWSearch';
 require_once( "$IP/extensions/SphinxSearch-r75726/SphinxSearch.php" );
 $wgSphinxSearchPersonalDictionary = dirname( __FILE__ ) . "/path/to/wordlist/wordlist.en.pws";
 $wgSphinxSearchExtPath = '/web/accessible/path/to/SphinxSearch-r75726';
 $wgSphinxSearch_port = 10101;
 $wgSphinxSearch_mode = SPH_MATCH_EXTENDED2;
 $wgSphinxSuggestMode = true;
 $wgSphinxSearchPspellDictionaryDir = "/usr/lib/aspell";
 $wgSphinxSearchAspellPath = "/usr/bin/aspell";


 * Thanks for building this incredible tool - it has made my installation so much more useful! --Fungiblename 14:44, 14 November 2010 (UTC)

Search suggestions
Latest SVN build has improved search suggestions for SphinxMWSearch - using enchant library and a script that creates a "dictionary" based on the frequency of words in your wiki (so it will never suggest a word that is not actually in the wiki.) I decided to use enchant because direct support for aspell/pspell is being dropped from PHP (at least on Windows) and I could not even install them on my box with php 5.3.

To try it, install enchant extension, run SphinxSearch_setup.php to create a dictionary, and set $wgSphinxSuggestMode to 'enchant'. Without enchant, you can set $wgSphinxSuggestMode to true and it will use mysql's SOUNDEX to find wiki articles that sound similar to the search query. Svemir Brkic 15:35, 14 November 2010 (UTC)