I've installed MediaWiki 1.27.1 and managed to get a basic single-wiki setup working. Now I'm trying to convert it into a wiki family, following the instructions at Manual:Wiki_family. Here's what I'm aiming for:
wiki.mysite.com/ #public wiki
wiki.mysite.com/priv1 #private wiki 1
wiki.mysite.com/priv2 #private wiki 2
The result I got:
wiki.mysite.com shows the public wiki, as expected.
wiki.mysite.com/priv1 returns 404
wiki.mysite.com/priv2 returns 404
[edit: added message text] The text of the 404 message:
Not Found
The requested document was not found on this server.
Web Server at mysite.com
What I did:
I followed the steps outlined in the manual, generated the 3 copies of LocalSettings.php, and renamed them. I then modified the main LocalSettings.php to the following:
<?php

if ( !defined( 'MEDIAWIKI' ) ) {
    exit;
}

## Database settings - cut-and-pasted out from the 3 sub-wikis' LocalSettings.php
$wgDBtype     = "mysql";
$wgDBserver   = "localhost";
$wgDBname     = "db_wiki";
$wgDBuser     = "db_wiki_user";
$wgDBpassword = "password";

$callingurl = strtolower( $_SERVER['REQUEST_URI'] ); // get the calling URL

if ( strpos( $callingurl, '/priv1' ) === 0 ) {
    require_once 'LocalSettings_priv1.php';
} elseif ( strpos( $callingurl, '/priv2' ) === 0 ) {
    require_once 'LocalSettings_priv2.php';
} elseif ( strpos( $callingurl, '/' ) === 0 ) {
    require_once 'LocalSettings_public.php';
} else {
    header( 'HTTP/1.1 404 Not Found' );
    echo "This wiki (\"" . htmlspecialchars( $callingurl ) . "\") is not available. Check configuration.";
    exit( 0 );
}
As a test, I tried changing the file name in the require_once 'LocalSettings_public.php' line to each sub-wiki's settings file in turn; when I open wiki.mysite.com, the corresponding sub-wiki is shown correctly. The URLs with a subdirectory path continue to return 404, however.
Any idea what's wrong with my setup?
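For reference, the same dispatch can also be written as a prefix-to-settings map (a sketch only; it assumes the same three LocalSettings_*.php file names as above):

$settingsByPrefix = array(
    '/priv1' => 'LocalSettings_priv1.php',
    '/priv2' => 'LocalSettings_priv2.php',
    '/'      => 'LocalSettings_public.php', // catch-all, must stay last
);

foreach ( $settingsByPrefix as $prefix => $file ) {
    if ( strpos( $callingurl, $prefix ) === 0 ) {
        require_once $file;
        return; // valid here because LocalSettings.php is itself an included file
    }
}

header( 'HTTP/1.1 404 Not Found' );
echo 'This wiki is not available. Check configuration.';
exit( 0 );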
I am new to HTML programming with Perl CGI, and I want to import an Excel file through a web page. I am using Perl's Spreadsheet::ParseExcel module, and the code is:
if ($fileName) {
    my $parser   = Spreadsheet::ParseExcel->new();
    my $workbook = $parser->parse($fileName);

    if ( !defined $workbook ) {
        die $parser->error(), ".\n";
    }

    for my $worksheet ( $workbook->worksheets() ) {
        my ( $row_min, $row_max ) = $worksheet->row_range();
        my ( $col_min, $col_max ) = $worksheet->col_range();

        for my $row ( $row_min .. $row_max ) {
            for my $col ( $col_min .. $col_max ) {
                my $cell = $worksheet->get_cell( $row, $col );
                next unless $cell;

                # print "Row, Col = ($row, $col)\n";
                print $cell->value(), "|";
                # print "Unformatted = ", $cell->unformatted(), "\n";
            }
            print "\n";
        }
    }
}
Here the $fileName variable is the name of the Excel file submitted from the HTML form, whose code is:
<form id='form1' method='GET' action='#'>
<input id='fileSelect' name='file' type='file' accept='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet, application/vnd.ms-excel'>
<input type=submit value='submit'>
</form>
I have written this server-side code in a directory, say /a/b/c, on Linux, and I have copied the file xyz.xls into that directory. Whenever I load the file xyz.xls through the web, I obtain the results; but if I load another file, say pqr.xls, which is not located in /a/b/c, I get the error "File not found". I want to be able to import any file from a user of this web page.
I am stuck here; please suggest something.
Thanks in advance.
Your Perl code looks OK, but your form should use
method="post" enctype="multipart/form-data"
unless you build a FormData object with JavaScript on the client side and send it via Ajax.
It is also not clear how you handle the sent file on the server side: with a plain GET form, only the file's name is submitted, which is why your script can only find files that already sit next to it in /a/b/c.
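Concretely, the corrected form would look like this (a sketch; the action URL is a placeholder for your CGI script):

<form id='form1' method='POST' action='/cgi-bin/import.cgi' enctype='multipart/form-data'>
<input id='fileSelect' name='file' type='file' accept='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet, application/vnd.ms-excel'>
<input type='submit' value='submit'>
</form>

On the server side, the upload then has to be read from the request body (for example via CGI.pm's file-upload support) rather than treated as a path to a local file.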
I want to set up a cron job and do scheduled imports from a particular .csv file that I will upload/update via FTP.
Is there an easy way to set up a product import for X-Cart 5 using a Linux console command?
There is no default way to do an import via the Linux console, but you can create a simple console script and run it via cron.
Example code (concept only, not a ready-made solution for your case):
#!/usr/bin/env php
<?php

if ('cli' != PHP_SAPI) {
    exit(1);
}

require_once __DIR__ . DIRECTORY_SEPARATOR . 'top.inc.php';

XLite::getInstance()->run(true);

// Initialize the importer.
// See all possible options in classes/XLite/Logic/Import/Importer.php __construct()
$importer = new \XLite\Logic\Import\Importer(
    array(
        'warningsAccepted'   => true,
        'delimiter'          => ',',
        'ignoreFileChecking' => true,
        'files'              => array(
            '/full/path/to/xcart/var/import/products.csv',
            '/full/path/to/xcart/var/import/categories.csv',
        ),
    )
);

// Verification step
while ($importer->getStep()->valid()) {
    $importer->getStep()->current()->process();
    $importer->getStep()->next();
}

// Check warnings & errors after verification and save them to a log file
if ($importer->hasWarnings()) {
    $warnings = \XLite\Core\Database::getRepo('XLite\Model\ImportLog')
        ->findBy(array('type' => \XLite\Model\ImportLog::TYPE_WARNING));
    \XLite\Logger::logCustom('import_warnings', var_export($warnings, true));

    // Clear warning messages
    \XLite\Core\Database::getRepo('XLite\Model\ImportLog')
        ->deleteByType(\XLite\Model\ImportLog::TYPE_WARNING);
}

if ($importer->hasErrors()) {
    $errors = \XLite\Core\Database::getRepo('XLite\Model\ImportLog')
        ->findBy(array('type' => \XLite\Model\ImportLog::TYPE_ERROR));
    \XLite\Logger::logCustom('import_errors', var_export($errors, true));
}

// Import/process quick data for products and resize images.
// This loop won't be executed if ($importer->hasWarnings() && !warningsAccepted)
// or if $importer->hasErrors() is true.
while ($importer->isNextStepAllowed()) {
    $importer->getOptions()->step     = $importer->getOptions()->step + 1;
    $importer->getOptions()->position = 0;

    while ($importer->getStep()->valid()) {
        $importer->getStep()->current()->process();
        $importer->getStep()->next();
    }
}
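To run it from cron, place the script in the X-Cart root directory (it includes top.inc.php from its own directory) and add a crontab entry; for example, for a nightly import (the script path is a placeholder): 0 3 * * * php /full/path/to/xcart/import_products.php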
You can also use a scheduled task in X-Cart 5. To do that, create your own module with a class that extends the abstract class classes/XLite/Core/Task/Base/Periodic.php.
You can find example code in classes/XLite/Module/CDev/XMLSitemap/Core/Task/GenerateSitemap.php.
To run the tasks registered in X-Cart 5: php console.php --target=cron
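A skeleton for such a task might look roughly like the following. This is a sketch only: the namespace is a hypothetical module, and the method names are assumptions modeled on the GenerateSitemap example, so check Periodic.php for the actual abstract methods to implement.

<?php
// Hypothetical module namespace ("MyDev"/"CsvImport" are placeholders).
namespace XLite\Module\MyDev\CsvImport\Core\Task;

class ImportProducts extends \XLite\Core\Task\Base\Periodic
{
    // ASSUMPTION: the method names below are guesses at the abstract
    // Periodic contract; verify against classes/XLite/Core/Task/Base/Periodic.php.
    public function getTitle()
    {
        return 'Scheduled CSV product import';
    }

    protected function runStep()
    {
        // Run the \XLite\Logic\Import\Importer loop from the
        // console-script example above.
    }

    protected function getPeriod()
    {
        return static::INT_1_DAY; // assumed period constant
    }
}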
After updating my WordPress install to 3.9, I keep getting these errors:
Warning: mysql_query(): Access denied for user 'www-data'@'localhost' (using password: NO) in /home/sites/wordpress/site/wp-content/plugins/crm/main.php on line 20
Warning: mysql_query(): A link to the server could not be established in /home/sites/wordpress/site/wp-content/plugins/crm/main.php on line 20
Warning: mysql_fetch_row() expects parameter 1 to be resource, boolean given in /home/sites/wordpress/site/wp-content/plugins/crm/main.php on line 21
I can't quite figure out what's wrong. Here's the code that worked pre-3.9:
<?php
session_start();
/**
 * Plugin Name: CRM
 * Description:
 * Version:
 * Author:
 *
 */
add_action( 'admin_menu', 'menu' );

function menu() {
    add_menu_page( 'CRM', 'CRM', 3, 'form', 'form' );
}

function form() {
    global $wpdb, $current_user, $user_ID;
    echo "<h3>CRM</h3>";
    $count = mysql_query("SELECT COUNT(id) FROM user_form_data");
    $nume2 = mysql_fetch_row($count);
    $nume  = $nume2[0];
I've snipped the rest, as it does not seem relevant for the error :)
SOLUTION:
Found it.
The cause was a change in the 3.9 upgrade.
http://make.wordpress.org/core/2014/04/07/mysql-in-wordpress-3-9/
"In WordPress 3.9, we added an extra layer to WPDB, causing it to switch to using the mysqli PHP library, when using PHP 5.5 or higher.
For plugin developers, this means that you absolutely shouldn’t be using PHP’s mysql_*() functions any more – you can use the equivalent WPDB functions instead."
You should read this post http://make.wordpress.org/core/2014/04/07/mysql-in-wordpress-3-9/
In WordPress 3.9, we added an extra layer to WPDB, causing it to switch to using the mysqli PHP library, when using PHP 5.5 or higher.
For plugin developers, this means that you absolutely shouldn’t be using PHP’s mysql_*() functions any more – you can use the equivalent WPDB functions instead.
Change this
$count = mysql_query("SELECT COUNT(id) FROM user_form_data");
$nume2 = mysql_fetch_row($count);
to the WPDB equivalent
$nume = $wpdb->get_var("SELECT COUNT(id) FROM user_form_data"); // get_var returns the scalar count directly, the same value mysql_fetch_row gave you in $nume2[0]
Try this; hope it helps:
<?php
/**
 * Plugin Name: CRM
 * Description: any desc
 * Author: ABS
 *
 */
add_action( 'admin_menu', 'user_data_menu' );

function user_data_menu() {
    add_menu_page( 'CRM', 'CRM', 3, 'user_data_form', 'user_data_form' );
}

function user_data_form() {
    #session_start();
    global $wpdb, $current_user, $user_ID;
    echo "<h3>CRM</h3>";

    // WPDB replaces the removed mysql_* calls
    $nume  = $wpdb->get_var("SELECT COUNT(id) FROM user_form_data");
    $limit = 20; // define before the check below uses it

    if ( $limit < $nume && empty($_POST['searching']) && empty($_POST['filter_flag']) && empty($_POST['rowsselect']) ) {
        $start = $_GET['start'];
        $eu    = ($start - 0);
        $this4 = $eu + $limit;
        $back  = $eu - $limit;
        $next  = $eu + $limit;
    }
}
?>
It looks like the update changed the MySQL username and password, so the problem isn't the code.
Check whether those settings in the wp-config.php file have changed and are now incorrect.
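For reference, these are the wp-config.php settings to verify (the values shown are placeholders):

define( 'DB_NAME', 'database_name_here' );
define( 'DB_USER', 'username_here' );
define( 'DB_PASSWORD', 'password_here' );
define( 'DB_HOST', 'localhost' );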
I have a problem with memcached.
I have the following code:
/**
 * Load the char object
 * @param  char_id  id char
 * @return $char object
 */
function get_info( $char_id )
{
    $cache    = Cache::instance();
    $cachetag = Kohana::config( 'medeur.environment' ) . '-charinfo_' . $char_id . '_obj';

    Kohana::log( 'debug', "-> Getting $cachetag from CACHE..." );
    $char = $cache->get( $cachetag );

    if ( is_null( $char ) )
    {
        Kohana::log( 'debug', "-> Getting $cachetag from DB." );
        $char = ORM::factory( 'character', $char_id );
        if ( !$char->loaded )
            $char = null;
        $cache->set( $cachetag, $char, 3600 );
    }

    return $char;
}
I see in the logfile that the object $char is taken from the cache:
2012-12-08 18:24:07 +01:00 --- debug: -> Getting test-global_adminmessage from CACHE...
2012-12-08 18:24:07 +01:00 --- debug: -> Getting test-charinfo_1_obj from CACHE...
However, I keep seeing in the profiler table that the query is still hitting the database:
SELECT `characters`.* FROM (`characters`) WHERE `characters`.`id` = 1 ORDER BY `characters`.`id` ASC LIMIT 0, 1
Why? If that's the case, memcached would be useless...
Your "Getting nnnn from CACHE..." logging statement will always show up, regardless of whether or not you actually retrieve anything from the cache. Consider moving it into an else statement after the large if block.
if ( is_null( $char ) ) {
    ....
}
else {
    Kohana::log( 'debug', "-> Got $cachetag from CACHE..." );
}
I checked with the guys at Kohana: the Kohana 2.x ORM class is not cacheable. It is cacheable in framework version 3.x.
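Given that, one workaround on 2.x is to cache the plain row data instead of the ORM object. A sketch (it assumes the Kohana 2.x ORM model exposes as_array() and that the raw columns are all you need from the cache):

$char = $cache->get( $cachetag );
if ( is_null( $char ) )
{
    $obj = ORM::factory( 'character', $char_id );
    // Cache a plain array; ORM objects don't survive serialization in Kohana 2.x
    $char = $obj->loaded ? $obj->as_array() : null;
    $cache->set( $cachetag, $char, 3600 );
}
return $char;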
I'm writing code to read XML from a website.
$jtype = $job_type == 1 ? 'fulltime' : ($job_type == 2 ? 'parttime' : ($job_type == 3 ? 'contract' : '')); // parentheses are needed: PHP's ternary operator is left-associative
$xml ="http://api.indeed.com/ads/apisearch?";
$xml .="publisher=9628233567417007";
$xml .="&q=".$q; //Query. By default terms are ANDed. To see what is possible, use our advanced search page to perform a search and then check the url for the q value.
$xml .="&l=".$location; //Location. Use a postal code or a "city, state/province/region" combination.
$xml .="&sort="; //Sort by relevance or date. Default is relevance.
$xml .="&radius=30"; //Distance from search location ("as the crow flies"). Default is 25.
$xml .="&st=employer"; //Site type. To show only jobs from job boards use 'jobsite'. For jobs from direct employer websites use 'employer'.
$xml .="&jt=".$jtype ; //Job type. Allowed values: "fulltime", "parttime", "contract", "internship", "temporary".
$xml .="&start=0"; //Start results at this result number, beginning with 0. Default is 0.
$xml .="&limit=10"; //Maximum number of results returned per query. Default is 10
$xml .="&fromage=".$within; //Number of days back to search.
$xml .="&filter=1"; //Filter duplicate results. 0 turns off duplicate job filtering. Default is 1.
$xml .="&latlong=1"; //If latlong=1, returns latitude and longitude information for each job result. Default is 0.
$xml .="&co=GB";//arch within country specified. Default is us. See below for a complete list of supported countries.
$xml .="&v=2";
$xmlData = new SimpleXMLElement( $xml, null, true );
$xmls    = $xmlData->xpath('results/result');

$jIndeed = array();
$iIndeed = 1;

if ( !empty($xmls) )
{
    foreach ( $xmls as $xml )
    {
        $created_at = strftime( dateFormat, strtotime( (string) $xml->date ) );

        $jIndeed[$iIndeed]['job_id']         = (string) $xml->jobkey;
        $jIndeed[$iIndeed]['jobTitle']       = cleanText( (string) $xml->jobtitle );
        $jIndeed[$iIndeed]['var_name']       = seoUrl( (string) $xml->jobtitle );
        $jIndeed[$iIndeed]['jobDescription'] = (string) $xml->snippet;
        $jIndeed[$iIndeed]['created_at']     = $created_at;
        $jIndeed[$iIndeed]['job_type']       = (string) $xml->typeName;
        $jIndeed[$iIndeed]['companyName']    = (string) $xml->company;
        $jIndeed[$iIndeed]['location']       = (string) $xml->formattedLocation;

        $iIndeed++;
    }
    $smarty->assign( 'searchIndeed', $jIndeed );
}
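Incidentally, the same query string can be built with http_build_query, which also URL-encodes the parameter values (a sketch equivalent to the concatenation above):

$params = array(
    'publisher' => '9628233567417007',
    'q'         => $q,
    'l'         => $location,
    'sort'      => '',
    'radius'    => 30,
    'st'        => 'employer',
    'jt'        => $jtype,
    'start'     => 0,
    'limit'     => 10,
    'fromage'   => $within,
    'filter'    => 1,
    'latlong'   => 1,
    'co'        => 'GB',
    'v'         => 2,
);
$xml = 'http://api.indeed.com/ads/apisearch?' . http_build_query( $params );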
When I run this on my local machine it works fine, but when I upload it to my site I get error 500 ("Page cannot be displayed").
I have changed the memory limit to 20 MB and the post limit to 1000, but it still fails. I think my host has a limit; setting the values in PHP makes no difference.
Does anyone have an XML class I can use to process this website's XML?
UPDATE:
After adding ini_set('display_errors', E_ALL); the following warnings appear:
Warning: SimpleXMLElement::__construct(): http:// wrapper is disabled in the server configuration by allow_url_fopen=0 in /.../indeedXMLSearch.php on line 44
Warning: SimpleXMLElement::__construct(http://api.indeed.com/ads/apisearch?publisher=9628233567417007&q=desktop&l=&sort=&radius=30&st=employer&jt=&start=0&limit=10&fromage=&filter=1&latlong=1&co=GB&v=2): failed to open stream: no suitable wrapper could be found in /.../indeedXMLSearch.php on line 44
Warning: SimpleXMLElement::__construct(): I/O warning : failed to load external entity "http://api.indeed.com/ads/apisearch?publisher=9628233567417007&q=desktop&l=&sort=&radius=30&st=employer&jt=&start=0&limit=10&fromage=&filter=1&latlong=1&co=GB&v=2" in /.../indeedXMLSearch.php on line 44
Fatal error: Uncaught exception 'Exception' with message 'String could not be parsed as XML' in /.../indeedXMLSearch.php:44 ...
For security reasons, PHP disables fopen of URLs in its default configuration (allow_url_fopen=0). You'd better fetch the XML content with PHP's cURL library and save it to a local file,
then use new SimpleXMLElement($localxml, null, true).
Example Code:
$xml = "http://....";
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $xml);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
// grab URL and pass it to the browser
$xmlcontent = curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
file_put_contents('/tmp/xmlfile', $xmlcontent);
$xmlelement = new SimpleXMLElement( $xml, null, true);
.....
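You can also skip the temporary file and parse the fetched string directly (assuming the response body is valid XML):

$xmlelement = new SimpleXMLElement( $xmlcontent );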
Try this (note that file_get_contents() on a URL is subject to the same allow_url_fopen restriction, so it only works if your host allows it):
$xmlData = simplexml_load_string( file_get_contents($xml) );
print_r($xmlData);
instead of
$xmlData = new SimpleXMLElement( $xml, null, true);