I have a problem with my Sphinx searchd configuration. The indexes and the indexer are working, because I can run queries against the indexes using the search tool. But when I start searchd --config, I get these errors:
WARNING: index: lock: failed to lock index.spl: No such file or directory; NOT SERVING
FATAL: no valid indexes to serve
I have granted permissions on the data folder and set the service to log on with my admin account, since I suspect it is a permissions problem.
I have been trying to solve this for two days now with no success, and so far no one has suggested anything. I would really appreciate any help.
Does anyone have any ideas?
Thanks a lot.
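Edit: in case it matters, this is roughly how the ownership can be compared with the user the daemon runs as (the paths and the sphinx user below are placeholders, not my actual config; use whatever "path" your sphinx.conf points to):
ps -o user= -C searchd                        # which user the searchd daemon actually runs as
ls -l /var/lib/sphinx/                        # owner and permissions of the index/.spl files
sudo chown -R sphinx:sphinx /var/lib/sphinx   # hand the data directory to that user if they differ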
My issue: I have a Laravel web application and I log slow SQL queries into the storage folder, but the log files do not appear. In laravel.log I can see the following error:
[2019-02-22 07:23:21] dev.ERROR: file_put_contents(/var/www/html/com.mywebsite/storage/logs/sql/2019-02-22-slow-log.sql): failed to open stream: Permission denied {"userId":40,"email":"a.user@somewhere.com","exception":"[object] (ErrorException(code: 0): file_put_contents(/var/www/html/com.mywebsite/storage/logs/sql/2019-02-22-slow-log.sql): failed to open stream: Permission denied at /var/www/html/com.mywebsite/vendor/mnabialek/laravel-sql-logger/src/SqlLogger.php:179)
I'm running this on CentOS 7 Linux with Nginx and PHP-FPM. Both of them run as a user called web, and the ownership of all files and folders is web:web. However, I saw one slow-query log file with the ownership root:web. What gives?
Oh yeah! I have found the issue. One of my former system operators wrote the cron job and I didn't pay attention: he set it up for the sudo user rather than the web user. It was a job that runs certain PHP commands. That is why the log file was created with root ownership, and later on the Laravel application was unable to write to it. Thanks for the help. :)
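For anyone with the same symptom, a rough sketch of the fix (the crontab entry and schedule are illustrative assumptions, not the exact job my operator wrote):
# move the job from root's crontab into the web user's crontab
sudo crontab -u web -e
# example entry inside web's crontab (command and schedule are placeholders):
# * * * * * php /var/www/html/com.mywebsite/artisan schedule:run >> /dev/null 2>&1
# fix the ownership of the log file that was already created as root
sudo chown web:web /var/www/html/com.mywebsite/storage/logs/sql/2019-02-22-slow-log.sql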
I am very frequently getting this error in MySQL:
OS errno 24 - Too many open files
What's the cause and what are the solutions?
I was getting errno: 24 - Too many open files too often when I was using many databases at the same time.
Solution
Ensure that the connections to the DB server are closed properly.
Edit /etc/systemd/system.conf. Uncomment and set:
DefaultLimitNOFILE=infinity
DefaultLimitMEMLOCK=infinity
Then run systemctl daemon-reload and service mysql restart.
You can check the result with the query SHOW GLOBAL VARIABLES LIKE 'open_files_limit'; you may notice that the value has changed. You should not get any errno 24 now.
Please note that the solution may differ for other OS versions; you can try to locate the equivalent variables first. Tested with Ubuntu 16.04.3 and MySQL 5.7.19.
In my case it was useless to set the open_files_limit variable in the MySQL configuration files, as the variable is flagged as read-only.
I hope it helped!
You probably have a connection leak in your application; that is why open connections are not closed once the function completes its execution.
I would look into the application code, see where the Connection/PreparedStatement objects (if it's Java) are not closed, and fix it.
A quick workaround is to increase the ulimit of the server (explained here), which would increase the number of open file descriptors (i.e. connections). However, if you have a connection leak, you will encounter this error again at a later stage.
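Before raising any limits, it can help to see how close mysqld actually is to its current limit. A rough sketch on Linux (assumes pgrep is available and a single mysqld process):
cat /proc/$(pgrep -x mysqld)/limits | grep "open files"   # the limit the running process actually has
sudo ls /proc/$(pgrep -x mysqld)/fd | wc -l               # how many file descriptors it holds right now
If the second number keeps climbing towards the first while your application runs, that points to the connection leak described above rather than a limit that is simply too low.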
I faced the same problem and found a solution in another Stack Overflow question.
By running the following snippet with Bash:
ulimit -n 30000
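Note that ulimit -n only affects the current shell session. One way to make a higher limit persistent for login sessions is /etc/security/limits.conf (a sketch; the mysql user name is an assumption, and services started by systemd ignore this file, so for those use the DefaultLimitNOFILE approach from the other answer):
# /etc/security/limits.conf
mysql  soft  nofile  30000
mysql  hard  nofile  30000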
I'm trying to run sonar-runner.bat. When the analysis has almost finished, it reports that a packet is larger than max_allowed_packet, and so it fails.
From searching around, everyone says that I should configure the my.ini file inside the MySQL folder. But:
I don't have MySQL installed.
Log:
ERROR: Unable to execute Sonar
ERROR: Caused by: Unable to save file sources
ERROR: Caused by:
Error updating database. Cause: com.mysql.jdbc.PacketTooBigException: Packet for query is too large (3215747 > 1048576). You can change this ..bla..bla
The error may involve org.sonar.core.source.db.FileSourceMapper.insert-Inline
The error occurred while setting parameters
How can I change it?
Help!
As @Fabrice from the SonarQube team suggested, you are running the SonarQube server on top of MySQL. If you want to check, you can look in the sonar.properties file.
To get rid of this issue you have to modify my.cnf (Linux) or my.ini (Windows):
[mysqld]
max_allowed_packet=256M
If you want to set the same value globally, log in to MySQL and run the following command:
SET GLOBAL max_allowed_packet=1073741824;
Once you have made these settings, please restart the MySQL server.
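After the restart you can verify that the new value is in effect (a quick check; the value is reported in bytes):
SHOW VARIABLES LIKE 'max_allowed_packet';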
I found the answer myself.
It looks like I didn't realize how the database works in SonarQube.
By default, SonarQube uses H2. That one is fine, and I believe my problem would not have happened with it.
It turned out someone from my company was actually using his own MySQL server. So I found the MySQL folder, changed the .ini/.cnf file, and set the max_allowed_packet value to a bigger number.
VOILA!
Thanks for your help!
I need help
First, it threw the exception
cannot load class 'com.mysql.xxx.JDBC'
so I copied
sonarqube-4.0/extensions/jdbc-driver/mysql/mysql-connector-java-5.1.26.jar
to
/usr/lib/jvm/java-7-sun/jre/lib/ext/
Then I ran sonar-runner again, and it threw this exception:
Unknown database status: FRESH_INSTALL
My heart is broken, please help me.
I saw this error when the database was empty (no tables created). Usually the tables are created during the first start of Sonar.
Can you reach Sonar through the web interface?
Make sure that the settings in the conf/sonar.properties file in the Sonar directory are correct; these lines should be uncommented:
sonar.jdbc.url: jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true
sonar.jdbc.username: mysqlusername
sonar.jdbc.password: mysqlpassword
Then check that the conf/sonar-runner.properties file in the sonar-runner directory has the same settings:
#----- MySQL
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8
#----- Global database settings
sonar.jdbc.username=mysqlusername
sonar.jdbc.password=mysqlpassword
Also make sure that the line for the default embedded database is commented out:
#sonar.jdbc.url: jdbc:h2:tcp://localhost:9092/sonar
Sonar stopped supporting the property "sonar.jdbc.schema" as of version 4.2; please refer to http://jira.codehaus.org/browse/SONAR-5000. To fix the issue on a PostgreSQL-based Sonar deployment, execute the following SQL command on the database:
ALTER USER $dbUser$ SET search_path to $sonarQubeSchema$
I just encountered the same problem, and I solved it by restarting the Sonar server after creating the proper user and database in MySQL.
In fact, the Sonar server fills the empty MySQL database with many tables after restarting, and that removed the "FRESH_INSTALL" problem.
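For completeness, a sketch of creating that user and database before the restart (the names and password are placeholders; they must match the sonar.jdbc.* settings in sonar.properties):
CREATE DATABASE sonar CHARACTER SET utf8 COLLATE utf8_general_ci;
CREATE USER 'sonar'@'localhost' IDENTIFIED BY 'sonarpassword';
GRANT ALL PRIVILEGES ON sonar.* TO 'sonar'@'localhost';
FLUSH PRIVILEGES;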
I get a problem (#2006 - MySQL server has gone away) with MySQL when connecting and performing some operations through the web browser.
The operations are listed below:
Executing a big procedure
Importing a database dump
Accessing some particular tables, which immediately throws "Server gone away"
Refer to this question for the scenarios: Record Not Inserted - #2006 MySQL server gone away
Note: the above operations work fine when I perform them through the terminal.
I tried some configuration changes that googling suggested, that is, setting wait_timeout and max_allowed_packet. I checked for the bin_log, but it is not available.
But the issue was not rectified.
What is the problem, and how can I figure out and fix the issue?
What is the difference between accessing the phpMyAdmin/MySQL server from a web browser and from the terminal?
Where can I find the MySQL server log file?
Note: if you know about any one of the above questions, please post here. It would be helpful for tracing this.
Please help me to figure this out.
Thanks in advance...
Basically nothing, except that phpMyAdmin is limited by PHP's timeout and resource limits (limits to keep a runaway script from bogging down your entire machine for all eternity; see the docs for details of those values). In some cases, you might be authenticating as a different user account (for instance, root@localhost and root@127.0.0.1 aren't the same user), but as long as you're using a user with the same permissions the differences are minimal.
You can read more about logs in the MySQL manual; note that "By default, no logs are enabled (except the error log on Windows)".
Below are the answers to the questions.
1. From my research, the problem is that the browser has a limit after which it drops the connection, i.e. a connection timeout. That is why the above problem arose.
To resolve this problem:
Go to /opt/lampp/phpmyadmin and open config.inc.php
Add the line $cfg['ExecTimeLimit'] = 0;
Restart the XAMPP server. Now you can perform any operations.
2. The web client differs from the terminal because the terminal client does not hit a timeout: it maintains the connection until the operation completes. I recommend using the command prompt to import/export/run processes, as the safer way.
3. Basically phpMyAdmin does not have a log file of its own. If you want to see warnings and errors, you should configure the MySQL log file.
Configuration steps:
Go to /opt/lampp/etc/my.cnf
Add log_bin = /opt/lampp/var/mysql/filename.log
Restart the XAMPP server. You can then see the log information.
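For reference, the relevant part of my.cnf would look roughly like this (the error-log line is an extra suggestion rather than part of the steps above, and the paths assume the XAMPP layout mentioned earlier):
[mysqld]
log_bin   = /opt/lampp/var/mysql/filename.log
log_error = /opt/lampp/var/mysql/error.log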