I had an old script (called Photopost Pro) installed on my server which was working fine. Today I had to move it to another folder on the same server, and I also changed its configuration to reflect the new location.
However I am now getting millions of these error messages:
PHP Warning: readdir() expects parameter 1 to be resource, boolean given in /***/adm-inc.php on line 690
This error log keeps growing into the gigabytes and choking my server's disk space. I am deleting it every few minutes, and it keeps coming back.
This is line 690 of that file:
while (($file = readdir($dh)) !== false)
And these are lines 684-710:
function dirsize($dir)
{
    $dh = opendir($dir);
    $size = 0;
    while (($file = readdir($dh)) !== false)
    {
        if ($file != "." and $file != "..")
        {
            $path = $dir."/".$file;
            if (is_dir($path))
            {
                $size += dirsize($path);
            }
            elseif (is_file($path))
            {
                $size += filesize($path);
            }
        }
    }
    closedir($dh);
    return( $size );
}
It seems to me there must be some funny folder not being accessed or something similar? But why is this creating millions of errors, and how can I at least stop them? I don't care about the script actually working at this point. I even DELETED the adm-inc.php file and the error_log still keeps filling with the same error, even though that file is no longer there (!)
After commenting that function out (or even removing the file), I thought it was strange that it kept creating errors, so I restarted the httpd service, and it finally seems to have stopped. I don't know what caused it, but at least I've stopped the error_log problem.
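For reference: opendir() returns false when a directory cannot be opened (a stale path from before the move, missing permissions, a dangling symlink), and that false then gets handed to readdir(), logging a warning every time the function runs. A guarded version of dirsize() avoids the flood; this is a sketch, assuming an unreadable directory should simply count as 0 bytes:

function dirsize($dir)
{
    $dh = opendir($dir);
    if ($dh === false)
    {
        // Directory missing or unreadable: skip it rather than
        // handing false to readdir() and flooding the error log.
        return 0;
    }
    $size = 0;
    while (($file = readdir($dh)) !== false)
    {
        if ($file != "." and $file != "..")
        {
            $path = $dir."/".$file;
            if (is_dir($path))
            {
                $size += dirsize($path);
            }
            elseif (is_file($path))
            {
                $size += filesize($path);
            }
        }
    }
    closedir($dh);
    return( $size );
}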
I have what I assumed was a pretty standard service worker at http://www.espruino.com/ide/serviceworker.js for the page http://www.espruino.com/ide
However recently, when I have "Update on reload" set in the Chrome dev console for Service Workers the website stays with its loading indicator on, and the status shows a new service worker is "Trying to Install".
Eventually I see a red 'x' by the new service worker and a '1' with a link, but that doesn't do anything or provide any tooltip. Clicking on serviceworker.js brings me to the source file with the first line highlighted in yellow, but there are no errors highlighted.
I've done the usual and checked that all files referenced by the service worker exist and they do, and I have no idea what to look at next.
Does anyone have any clues how to debug this further?
thanks!
I'm on Chrome Beta.
I updated to the newest release and magically everything works. So I guess it was a bug in Chrome or the DevTools, not my code.
For those running into this issue with the latest version of Chrome, I was able to fix it by caching each resource in its own cache. Just call caches.open for every file you want to store. You can do this because caches.match will automatically find the file in your sea of caches.
As a messy example:
self.addEventListener('install', event => {
  // resources_array (the list of URLs to cache) and version are defined elsewhere
  var swpromise = new Promise(function (resolve, reject) {
    var havedone = 0;
    function cacheDone() {
      havedone++;
      if (havedone == resources_array.length) {
        resolve();
      }
    }
    function addToCache(index, url) {
      caches.open(version + "-" + index)
        .then(cache => cache.addAll([url]))
        .then(function () { cacheDone(); });
    }
    for (var i = 0; i < resources_array.length; i++) {
      addToCache(i, resources_array[i]);
    }
  });
  // waitUntil must receive the promise itself, so the promise is created first
  event.waitUntil(swpromise);
});
I used the version number and the index of the file as the cache key for each file. I then delete all of the old caches on activation of the service worker with something similar to the following code:
addEventListener('activate', e => {
  e.waitUntil(caches.keys().then(keys => {
    return Promise.all(keys.map(key => {
      if (key.indexOf(version + "-") == -1) return caches.delete(key);
    }));
  }));
});
Hopefully this works for you; it did in my case.
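For completeness, the fetch handler this relies on can stay the usual cache-first one, since caches.match searches across all of the per-file caches; a minimal sketch (not part of the original answer):

self.addEventListener('fetch', event => {
  // caches.match looks through every cache, so the per-file caches are transparent here
  event.respondWith(
    caches.match(event.request).then(response => response || fetch(event.request))
  );
});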
I have a large JSON file (large in its number of elements, not its size). It has 30,000 JSON elements, and I am trying to produce entities from it as it is read.
So far I have it reading the file with Guzzle, and it ends up producing about 1500 entities before it crashes. I feel that I must be doing this the wrong way.
Here is my code:
public function generateEntities(Request $request, $number)
{
    $client = new \GuzzleHttp\Client();
    $request = new \GuzzleHttp\Psr7\Request('GET', 'http://www.example.com/file.json');

    $promise = $client->sendAsync($request)->then(function ($response) {
        $batchSize = 20;
        $i = 0;
        foreach (json_decode($response->getBody()) as $entityItem) {
            $entity = new Entity();
            $entity->setEntityItem($entityItem->string);
            $em = $this->getDoctrine()->getManager();
            $em->persist($entity);
            if (($i % $batchSize) === 0) {
                $em->flush(); // Executes all updates
            }
            $i++;
        }
        $em->flush();
    });
    $promise->wait();

    return $this->redirect($this->generateUrl('show_entities'));
}
I worked out from research that I should be clearing the Entity Manager frequently, so I added batch sizing etc. to flush it every 20 entities created. This did help, but it is not enough to load the full 30,000 elements.
Maybe I am completely wrong and should be handling it a different way?
Is it possible for someone to please point me in the right direction, I am happy to work it out on my own I am just a little unsure where to proceed from here.
Thank you!
You could improve your process in two ways:

1) Increase the time limit for the execution of the controller action with the set_time_limit function, so put this as the first line of the controller:
public function generateEntities(Request $request, $number)
{
    set_time_limit(0); // set to zero, no time limit is imposed
2) Free as much memory as possible: on each iteration, flush the data to the database, then detach the entity and free its memory, as follows:
$em->persist($entity);
$em->flush();         // write this entity to the database
$em->detach($entity); // stop tracking it in the unit of work
unset($entity);       // free the PHP reference
Hope this helps.
The script is running out of memory because every one of the 30,000 entities is managed in memory. You need to detach the entities from the manager periodically to make sure they are "garbage collected". Use $em->clear(); in your batch-flushing block to ensure memory isn't exhausted. See the Doctrine page on batch operations for more information.
Keep in mind though, that $em->clear() will detach all entities from the manager, not just those you are using in this loop.
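A minimal sketch of that batch pattern applied to the loop from the question (the entity and field names are the asker's; the batch size of 20 is kept):

$em = $this->getDoctrine()->getManager();
$batchSize = 20;
$i = 0;
foreach (json_decode($response->getBody()) as $entityItem) {
    $entity = new Entity();
    $entity->setEntityItem($entityItem->string);
    $em->persist($entity);
    if (($i % $batchSize) === 0) {
        $em->flush(); // execute the pending INSERTs
        $em->clear(); // detach all entities so PHP can reclaim the memory
    }
    $i++;
}
$em->flush(); // flush the final partial batch
$em->clear();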
The requirement is to execute an SSIS package when a file arrives in a folder; I do not want to start the package manually.
The timing of the file's arrival is not known, and files can arrive multiple times. Whenever a file arrives, it has to be loaded into a table. I think a solution like the File Watcher Task still requires the package to be started.
The way I have done this in the past is with an infinite-loop package called from SQL Server Agent. This is my infinite-loop package:
Set these variables:
IsFileExists - Boolean - 0
FolderLocation - String - C:\Where the file is to be put in\
For the For Loop container, use the IsFileExists variable as above.
Then set up a C# Script Task with ReadOnlyVariables set to User::FolderLocation and the following code:
public void Main()
{
    int fileCount = 0;
    string[] FilesToProcess;
    while (fileCount == 0)
    {
        System.Threading.Thread.Sleep(10000);
        FilesToProcess = System.IO.Directory.GetFiles(Dts.Variables["FolderLocation"].Value.ToString(), "*.txt");
        fileCount = FilesToProcess.Length;
        if (fileCount != 0)
        {
            for (int i = 0; i < fileCount; i++)
            {
                try
                {
                    // Try to open the file; if it is still being written to,
                    // the IOException below resets the count so we keep waiting.
                    System.IO.FileStream fs = new System.IO.FileStream(FilesToProcess[i], System.IO.FileMode.Open);
                    fs.Close();
                }
                catch (System.IO.IOException)
                {
                    fileCount = 0;
                    continue;
                }
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
What this will do is essentially keep an eye on the folder location for a .txt file. If no file is there, it will sleep for 10 seconds (you can increase this if you want). If a file does exist, the task will complete and the package will then execute the load package. However, it will continue to run, so the next time a file is dropped in it will execute the load package again.
Make sure to run this forever-loop package as a SQL Server Agent job so it will run all the time; we have a similar package running and it has never caused any problems.
Also, make sure your input package moves/archives the file away from the drop folder location.
As others have already suggested, a WMI task or an infinite loop are two options to achieve this, but IMO SSIS is resource-intensive. If you let a package run constantly in the background, it can eat up a lot of memory and CPU and cause performance issues for other packages, depending on how many you have running. So another option you may want to consider is scheduling an Agent job every 5 or 10 minutes and calling your package from the job. Configure the package to continue only when a file is there, and to quit otherwise; a sketch of that check follows.
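A minimal sketch of that check as a Script Task, reusing the User::FolderLocation and User::IsFileExists variables from the answer above (IsFileExists would go in ReadWriteVariables), with downstream tasks gated by a precedence constraint on @[User::IsFileExists]:

public void Main()
{
    // Check once and exit; the Agent job provides the scheduling,
    // so there is no loop here.
    string folder = Dts.Variables["FolderLocation"].Value.ToString();
    string[] files = System.IO.Directory.GetFiles(folder, "*.txt");

    // Downstream tasks run only when this is true.
    Dts.Variables["IsFileExists"].Value = (files.Length > 0);
    Dts.TaskResult = (int)ScriptResults.Success;
}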
You can create a Windows service that uses WMI to detect file arrival and launch packages. Details on how to are located here: http://msbimentalist.wordpress.com/2012/04/27/trigger-ssis-package-when-files-available-in-a-folder-part2/?relatedposts_exclude=330
What about the SSIS File Watcher Task?
I am new to LDA and Mallet, and I have the following query.
I tried running Mallet LDA from the command line, and by setting --random-seed to a fixed value I was able to get consistent results across multiple runs of the algorithm.
However, when I tried the Mallet Java API, I get different output every time I run the program.
I googled around and found out that the random seed needs to be fixed, and I have fixed it in my Java code. I still get different results.
Could anyone let me know what other parameters I need to consider for consistent results (when run multiple times)?
I might add that train-topics, when run multiple times from the command line, yields the same result. However, when I rerun import-dir and then run train-topics, the results do not match the previous ones (probably as expected).
I am OK with running import-dir just once and then experimenting with different numbers of topics and iterations by running train-topics.
Similarly, what needs to be changed or kept constant if I want to replicate this when I use the Java API?
I was able to solve this.
I will respond in detail here:
There are two ways in which Mallet could be run.
a. Command mode
b. Using Java API
To get consistent results across different runs, we need to fix the 'random seed', and on the command line we have an option for setting it. No surprises there.
However, while using the API, though we have an option of setting the 'random seed', we need to know that it has to be done at the proper point, else it does not work (see the code below).
I have pasted code here which creates a model (read: InstanceList) file from the data; we can then reuse that model file, set the random seed, and verify that we get consistent (read: the same) results on every run.
Creating and saving the model for later use:
Note: follow this link to see the format of the input file:
http://mallet.cs.umass.edu/ap.txt
public void getModelReady(String inputFile) throws IOException {
    if (inputFile != null && (!inputFile.isEmpty())) {
        List<Pipe> pipeList = new ArrayList<Pipe>();
        pipeList.add(new Target2Label());
        pipeList.add(new Input2CharSequence("UTF-8"));
        pipeList.add(new CharSequence2TokenSequence());
        pipeList.add(new TokenSequenceLowercase());
        pipeList.add(new TokenSequenceRemoveStopwords());
        pipeList.add(new TokenSequence2FeatureSequence());

        Reader fileReader = new InputStreamReader(new FileInputStream(new File(inputFile)), "UTF-8");
        CsvIterator ci = new CsvIterator(fileReader, Pattern.compile("^(\\S*)[\\s,]*(\\S*)[\\s,]*(.*)$"),
                3, 2, 1); // data, label, name fields

        InstanceList instances = new InstanceList(new SerialPipes(pipeList));
        instances.addThruPipe(ci);

        ObjectOutputStream oos;
        oos = new ObjectOutputStream(new FileOutputStream("Resources\\Input\\Model\\Model.vectors"));
        oos.writeObject(instances);
        oos.close();
    }
}
Once the model file is saved, the following uses it to generate topics:
public void applyLDA(ParallelTopicModel model) throws IOException {
    InstanceList training = InstanceList.load(new File("Resources\\Input\\Model\\Model.vectors"));
    logger.debug("InstanceList Data loaded.");

    if (training.size() > 0 && training.get(0) != null) {
        Object data = training.get(0).getData();
        if (!(data instanceof FeatureSequence)) {
            logger.error("Topic modeling currently only supports feature sequences.");
            System.exit(1);
        }
    }

    // IT HAS TO BE SET HERE, BEFORE CALLING ADDINSTANCE METHOD.
    model.setRandomSeed(5);

    model.addInstances(training);
    model.estimate();
    model.printTopWords(new File("Resources\\Output\\OutputFile\\topic_keys_java.txt"), 25, false);
    model.printDocumentTopics(new File("Resources\\Output\\OutputFile\\document_topicssplit_java.txt"));
}
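A hypothetical driver for the two methods above, just to make the order of operations explicit (the wrapper class name, input path, and topic/iteration counts are placeholders, not part of the original code):

// Build the instance file once, then train with a fixed seed.
MalletExample example = new MalletExample();       // hypothetical class holding the two methods
example.getModelReady("Resources\\Input\\ap.txt"); // run once per corpus

ParallelTopicModel model = new ParallelTopicModel(20); // 20 topics (placeholder)
model.setNumIterations(1000);                          // placeholder iteration count
example.applyLDA(model); // fixes the seed before addInstances, so repeated runs match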
I need to bootstrap Drupal. I have a PHP file with all my functions, and in one of those functions I need to use a Drupal function that is located in the bootstrap.inc file.
Structure of the server:
drupal/
    includes/
        bootstrap.inc
scripts/
    functions/
        functions.php
So, in a self-written function, I need to call the variable_set function, which is located in bootstrap.inc.
A little piece of the function my colleague wrote:
function readxml()
{
    echo "<br/>READING...<br/>";
    $file = './config.xml';
    $xml = simplexml_load_file($file);
    if ($xml !== false)
    {
        foreach ($xml->config->children() as $item) {
            $name = $item->getName(); // GETS CHILDREN UNDER 'CONFIG'
            switch ($name)
            {
                case 'website':
                    foreach ($xml->config->website->children() as $kid) {
                        $childname = $kid->getName();
                        switch ($childname)
                        {
                            case 'theme':
                                if (inserttheme($kid) or die('failed to insert theme<br/>')) {
                                    echo 'theme is installed.<br/>';
                                }
                                break;
                            case 'slogan':
                                if (insertslogan($kid) or die('failed to insert slogan<br/>')) {
                                    echo 'slogan is installed.<br/>';
                                }
                                break;
                            case 'sitename':
                                if (insertname($kid) or die('failed to insert name<br/>')) {
                                    echo 'website name is installed.<br/>';
                                }
                                break;
                        }
                    }
                    break;
                    // ... (rest of the function omitted)
So, somewhere in the theme/slogan/sitename section, I have to call the variable_set function, which is located in the bootstrap.inc file.
Somewhere I found this:
$drupal_directory = "/home/httpdocs/drupal"; // wherever Drupal is
$current_directory = getcwd();
chdir($drupal_directory);
require_once './includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);
chdir($current_directory);
return;
I included it both in my functions.php and in my final PHP file (where all the functions are called), but no result... What am I doing wrong?
That code you found looks about right. What exactly is "no result": are you getting errors, or nothing, or...? Also, where did you put it exactly? (If it is not in a function, you need to remove the last line, the return.)
Also, the proper way to fix this would be to integrate your custom code as a Drupal module; then you don't have to worry about stuff like this: http://drupal.org/developing/modules
Or if it is a CLI script, expose it as a drush command: http://drupal.org/project/drush
Answer to the question: just include the file inside the Drupal folder!
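For reference, a minimal sketch of the whole sequence, combining the bootstrap snippet above with a variable_set() call (the path, variable name, and value are placeholders; variable_set($name, $value) is the Drupal 6/7 signature):

$drupal_directory = "/home/httpdocs/drupal"; // wherever Drupal is
$current_directory = getcwd();

chdir($drupal_directory);
require_once './includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

// Every function from bootstrap.inc is now available, e.g.:
variable_set('site_slogan', 'My new slogan'); // placeholder variable name and value

chdir($current_directory);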