I have a simple batch file to spin up a dev site in XAMPP. However, if I have forgotten to start MySQL, it fails to handle the parts of the process that set up a database for the site.
In a Windows batch file, is there a way to check if MySQL is running, and if not, start it?
I don't have any examples of things I have tried, because I simply haven't been able to find any.
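For what it's worth, a minimal sketch of one way to do this from a batch file is below. It assumes MySQL is installed as a Windows service named "mysql" (the name is an assumption; a XAMPP install may use a different service name, or none at all until you register one from the control panel) and that the script runs from an elevated prompt so that net start is allowed.

    @echo off
    rem Check whether the MySQL service is running and start it if not.
    rem Assumes the service is named "mysql"; adjust the name (e.g. MySQL80)
    rem to match your installation. net start requires an elevated prompt.
    sc query mysql | find /i "RUNNING" >nul
    if errorlevel 1 (
        echo MySQL is not running - starting it...
        net start mysql
        if errorlevel 1 (
            echo Could not start MySQL, aborting.
            exit /b 1
        )
    )
    echo MySQL is running.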
I had been running a MySQL server for weeks now. Everything had been working perfectly until yesterday, when the MySQL server simply stopped. I realized the service had stopped, and whenever I try to start it I get an error message that says "The MySQL80 service on Local Computer started and then stopped. Some services stop automatically if they are not in use by other programs or services."
So far, I have done some research online and found solutions that involve modifying the my.ini file. After looking for it for quite a while, I realized it wasn't stored in Program Files but in the hidden ProgramData folder. I opened the my.ini file and it was empty (I am not sure whether this file is regenerated every time the service starts or whether there should already be something in it).
Other solutions involve running the mysqld --install and mysqld --initialize commands, but my Command Prompt won't recognize these commands. I am pretty much lost as to what to try next, and also confused, since the server had been running for weeks without any issues whatsoever.
What are some more troubleshooting steps for this?
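In case it helps anyone with the same symptoms: mysqld is normally not on the PATH, so those commands have to be run from the MySQL bin directory. A rough sketch follows; the install path and service name are assumptions, so adjust them to your setup. Running the server in a console window usually prints the actual startup error that the Services applet hides.

    rem Run from an elevated Command Prompt. The install path and service name
    rem below are assumptions - adjust them to your installation.
    cd /d "C:\Program Files\MySQL\MySQL Server 8.0\bin"

    rem Show how the MySQL80 service is configured, including which my.ini it uses.
    sc qc MySQL80

    rem Start the server in the foreground so the real startup error is printed.
    mysqld --console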
I have been using MySQL workbench for years. Recently the previous version 8.0.nnn stopped launching. I decided to try to repair the installation by upgrading. It worked for a few days and now it hangs again.
Uninstalling and reinstalling is not helping.
There are no notifications in Event Viewer. There is a task (process) in Task Manager, but the process never progresses to showing a window.
I can connect to, and query, the server on the command line or via the API (not that it is required for launching Workbench).
Any ideas about the origin of this rather perplexing problem?
I discovered that the user data (User\AppData\Roaming\MySQL\Workbench) contained corrupted files.
The error message popup (window) remains hidden, but it can be seen when using Alt-Tab to rotate between applications. One cannot bring focus to the error window, but one can see the message among the choices in the Alt-Tab interface. I could see something about a corrupt file and that it was in the profile's Roaming folder.
Deleting the single file that was indicated (wb_state.xml) was insufficient to get Workbench to display. Deleting all files in the Workbench directory restored Workbench.
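If someone wants to try the same fix, a minimal sketch is below. It assumes the standard settings location and renames the folder instead of deleting it, so the old files are kept as a backup; Workbench recreates the folder on the next launch, but saved connections will need to be re-created or copied back.

    rem Close Workbench first. %APPDATA% expands to C:\Users\<you>\AppData\Roaming.
    rem Renaming keeps the old settings around in case you need anything from them.
    ren "%APPDATA%\MySQL\Workbench" Workbench.broken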
Hope this is of some use to others.
I am running MySQL on Windows 7. I use a scheduled task to insert a record into a table after an action has occurred. However, when the scheduled task runs, nothing is inserted. I have redirected output from the "mysql" line into a log file, but the log is always empty. Running the batch file manually does cause the record to be inserted successfully. The scheduled task runs under the same user account and privileges as when I run it manually.
Has anyone seen this behavior before?
Never mind. Apparently despite being run as my account, "taskeng" doesn't know where "mysql" is. Writing the full path to the mysql executable solved it.
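In other words, the line in the batch file ended up looking roughly like this; the install path, credentials, database, and file names are placeholders rather than my real ones:

    rem Use the full path to mysql.exe so the Task Scheduler environment can find it.
    rem The install path, user, password, database, and file names are placeholders.
    "C:\Program Files\MySQL\MySQL Server 5.6\bin\mysql.exe" -u taskuser -pSecretPwd mydb < C:\scripts\insert_record.sql >> C:\scripts\mysql_task.log 2>&1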
An SSIS package loops through input files. For each file, a flat file parse adds records to a DB table, then the file is renamed/moved for archiving. After all files are processed, the package calls a sproc to delete all year-old records.
The package runs OK from Visual Studio. Put it in the SSIS package store and run it from there: no problem.
Create a SQL Agent job to run the package. The job does something for about five minutes, announces it was successful, but there are no new records in the DB and no renaming of input files.
The package uses a dedicated login for its SQL Server privileges. The job runs as HOSTNAME-SVC, which has read/write privileges on the input directory and the archive directory.
Have you set up logging for the package? You could add a script task to the For-Each Loop Container that runs a Dts.Events.FireInformation call during each loop. This could help you track the file name it finds, the number of loops it does, how long each loop takes, etc. You could also add a logging step at the end so that you know it is at least exiting the For-Each Loop Container successfully.
If you find that the package is running successfully but not looping through any files at all, then you may want to test with a simpler package that reads one file only and loads it into a staging table. If that works, then go to the next step of looping over all the files in the directory while only importing the one file over and over again. If that works, then go to the next step of changing the file connection to match the file that the For-Each Loop Container file enumerator finds.
If the package isn't looping over any files and you can't get it to see even the one file you tested loading from the job, then try creating a proxy account with your credentials and running the job as the proxy account. If that works, then you probably have a permissions issue with your service account.
If the package doesn't import anything even with the proxy account, then you may want to log into the server as the service account and try to run the SSIS package in BIDS. If that works, then you may want to deploy it to the server and run the package from the server (which will really use your machine, but at least it uses the SSIS definition from the server). If this works, then try running the package from the Agent.
I'm not sure I fully understand. The package has already been thoroughly tested under several Windows accounts, and it does find all the files and rename all the files.
Under the Agent, it does absolutely nothing visible, but takes five minutes to do it. NO permissions errors or any other errors. I didn't mention that an earlier attempt DID get permissions errors because we had failed to give the service account access to the input and output directories.
I cannot log in as the service account to try that because I do not have a password for it. But sa is the job owner, so it should be able to switch to the service account; the access errors we got ten days ago show that it can. The package itself has not changed in those ten days. We just deleted the job in order to do a complete "dress rehearsal" of the deployment procedure.
So what has changed, I presume, is some detail in the deployment procedure, which unfortunately was not in source control at the time it succeeded.
It seems to be something different about the permissions. We made the problem go away by allowing "everyone" to read the directory on the production server. For some unknown reason, we did not have to do that on the test server.
When the job tried to fetch the file list, instead of getting an error (which would be logged) it got an empty list. Why looping through an empty list took five minutes is still a mystery, as is the lack of permissions. But at least what happened has been identified.
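For reference, if opening the directory to "everyone" is broader than you want, an icacls command can grant just the service account the rights the package needs. The paths and the account name below are placeholders:

    rem Grant the service account modify rights, inherited by files and subfolders,
    rem on the input and archive directories. Paths and account are placeholders.
    icacls "D:\ssis\input"   /grant "DOMAIN\HOSTNAME-SVC:(OI)(CI)M"
    icacls "D:\ssis\archive" /grant "DOMAIN\HOSTNAME-SVC:(OI)(CI)M"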
I had a similar problem. I was able to figure out what was happening by setting the logging option of the SQL Server Agent job.
Edit the step in the job that runs the package, go to the Logging tab, and pick "SSIS log provider for SQL Server"; for the configuration string, I picked (using the drop-down) the OLE DB connection that was in the package, which happens to connect to the SQL Server in question.
I was then able to view more details in the history of that job, and confirmed that it was not finding the files. By changing the permissions on the directory to match the SQL Server Agent account, I got the package to execute properly.
Hope this helps.
You may want to turn logging off after you resolve your issue, depending on how often your package will run and how much information logging provides in your case.
Regards,
Bertin