In Beaker Notebook, if something crashes or I get disconnected, can I recover my work? - beaker-notebook

Does Beaker automatically save my work? What will happen if I get disconnected in the middle of a project?

Yes, Beaker automatically saves your notebooks every 60 seconds as you edit them. Look in ~/.beaker/v1/var/sessionBackups to find your work.
Also, if the client loses its connection to the server, it tries to reconnect for 5 seconds; if it cannot, Beaker gives you a chance to save a copy of your notebook locally.
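If you need to dig the most recent autosave out of that folder, a minimal Python sketch like the following will list the newest backups first (the path comes from the answer above; everything else is just illustration):

from pathlib import Path

# Beaker's autosave location, per the answer above
backup_dir = Path.home() / ".beaker" / "v1" / "var" / "sessionBackups"

# Sort newest-first so the most recent autosave is at the top
backups = sorted(backup_dir.iterdir(), key=lambda p: p.stat().st_mtime, reverse=True)
for path in backups[:5]:
    print(path.name)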

Related

SSIS - File system task, Create directory error

I got an error after running an SSIS package that has worked for a long time.
The error was thrown in a task used to create a directory (like this http://blogs.lessthandot.com/wp-content/uploads/blogs/DataMgmt/ssis_image_05.gif) and says "Cannot create because a file or directory with the same name already exists", but I am sure that no directory or file with the same name existed.
Before throwing the error, the task created a file with no extension, with the same name as the expected directory. The file has a modified date more than 8 hours prior to its created date, which is weird.
I checked the date on the server and it is correct. I also tried running the package again and it worked.
What happened?
It sounds like some other process or person made a mistake in that directory and created a file that then blocked your SSIS package's directory create command, not a problem within your package.
Did you look at the security settings of the created file? It might have shown an owner that wasn't the credentials your SSIS package runs under. That won't help if you have many packages or processes that all run under the same credentials, but it might provide useful information.
What was in the file? The contents might provide a clue how it got there.
Did any other packages/processes have errors or warnings within a half day of your package's error? Maybe it was the result of another error that you could locate through the logs of the other process.
Did your process fail to clean up after itself on the last run?
Does that directory get deleted at the start of your package run, at the end of your package run, or at the end of the run of the downstream consumer of the directory contents? If your package deletes it at the beginning, then anything that slows the delete could present a race condition that normally resolves satisfactorily (the delete finishes before the create starts) but once in a while goes the wrong way; see the sketch after this answer.
Were you (or anyone) making a copy or scan of the directory in question? Sometimes copy programs (e.g. FTP) or scanning programs (antivirus, PII scans) make a temporary copy of a large item being processed (e.g. that directory), and maybe one got interrupted and left the temp copy behind.
If it's not repeatable, then finding out for sure what happened is tough, but if it happens again, try exploring the above. Also, if you can afford to, you might want to increase logging. It takes more CPU and disk space and makes reviewing logs slower, but temporarily increasing log detail can help isolate a problem like this.
Good luck!
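Here is the delete/create race described above, sketched with a retry-based workaround in Python; this is not SSIS itself, and the names and timings are purely illustrative:

import os
import shutil
import time

def recreate_dir(path, retries=5, delay=1.0):
    # Remove the old directory if it is present
    if os.path.isdir(path):
        shutil.rmtree(path)
    # Retry the create briefly: a slow or still-pending delete can make the
    # first attempt fail with "already exists", exactly the symptom above
    for _ in range(retries):
        try:
            os.mkdir(path)
            return
        except FileExistsError:
            time.sleep(delay)
    raise RuntimeError("could not create directory: " + path)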

Websites running very slow

We have a vServer problem that started all of a sudden yesterday.
If you go to this Website:
http://www.rightsfreeradio.de
You will notice that it takes ages to load.
This happens to all websites we have running on the vServer.
I asked the provider if there is any problem with their connections, but they say they don't have any problems.
If I log in via FTP, it runs as fast as usual;
only the web-based applications and websites are running very slowly.
Running "top" shows that mysql takes 70%+ of the CPU, but I am not sure if that's normal or not.
Do you have any ideas what could be wrong with the server?
What programming stack are you using? I tried to open the link, but it did not open.
Either there may be an issue with the server, or another cause is one of the following:
Check whether any JS or CSS files are taking a long time to load.
Put non-essential imported files at the end of the body tag.
On load, are you calling any function which may be prone to deadlocking or getting blocked?
Make sure to use an HTML validator to correct your HTML, etc.
Also make sure all scripts are working fine; to debug, take off all the imported script files and go from there.
Link doesn't open at all.
First, I suggest you restart all services on your server, and then:
check the MySQL error log, as you mention above:
tail -f /var/log/mysql.log
and then check your databases:
mysqlcheck -Aor
You can also follow the link below:
Show top five CPU consuming processes with ps
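If you would rather script that check than run ps by hand, here is a rough equivalent using the third-party psutil package (the package choice is my assumption, not something from the linked answer):

import time
import psutil

# Prime the per-process CPU counters (the first reading is always 0.0)
procs = list(psutil.process_iter(["pid", "name"]))
for p in procs:
    try:
        p.cpu_percent(None)
    except psutil.NoSuchProcess:
        pass

time.sleep(1.0)  # sampling window

# Collect usage over the window and print the top five consumers
usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(None), p.info["pid"], p.info["name"]))
    except psutil.NoSuchProcess:
        pass
for cpu, pid, name in sorted(usage, reverse=True)[:5]:
    print(f"{cpu:5.1f}%  {pid:6d}  {name}")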

With an SSISDeploymentManifest file, is there a way to pre-select the Installation Folder?

Short Version:
I have 7 SSISDeploymentManifest files I need to run. Is there a way to alter the SSISDeploymentManifest file to pre-populate the Installation Folder value?
Rant Version:
At first running 7 deployments did not seem like much of a problem. But the part of the process where you "Select Installation Folder" for package dependencies is horribly designed.
First, you have to enter a network path here if you are not running from the server you will install to. This is because the dialog box makes sure the path you enter is a valid path... on the local machine you run the manifest from. But when the package is run, it will also need to work for the server. (Dumb, huh?)
The next problem with this screen is that the field is read-only, so I cannot just specify the path directly.
On top of that, the dialog box to "browse" won't let me enter a path.
So... I have to browse my entire network (from home, over a VPN). That takes a long time.
Is there a way to alter the SSISDeploymentManifest file to pre-populate this value?
No, dtsinstall doesn't accept any command-line arguments, pity. My first approach to this was to write a heavy command-line application that made repeated calls to dtutil to get things done. I never finished it, but if you want to peek, it's on CodePlex.
What I like and prefer is the PowerShell script that handles my SSIS deployments now. Even if PowerShell isn't your cup of tea, the concepts should apply to whatever .NET language you might want to use to handle it.
Attractive features are that it will create the folders in SQL Server for you and correctly deploy the packages into said folders. The other nice thing that might be of assistance to you is that if all 7 deploys are in a common folder structure, the script walks the folder structure looking for manifests and deploys all the packages per manifest, so you could conceivably deploy everything with a single mouse click.
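To give a rough idea of the walk-and-deploy part, here is a hedged Python sketch. It assumes the manifest lists its packages in <Package> elements and that dtutil's /FILE, /COPY, and /DestServer switches match your version; treat every name, path, and switch below as something to verify rather than gospel:

import subprocess
import xml.etree.ElementTree as ET
from pathlib import Path

ROOT = Path(r"C:\deploys")   # hypothetical folder containing the 7 deployments
SERVER = "TARGETSERVER"      # hypothetical destination SQL Server

for manifest in ROOT.rglob("*.SSISDeploymentManifest"):
    tree = ET.parse(manifest)
    # Assumption: each package appears as a <Package>name.dtsx</Package> element
    for pkg in tree.getroot().iter("Package"):
        dtsx = manifest.parent / pkg.text
        subprocess.run(
            ["dtutil", "/FILE", str(dtsx),
             "/COPY", "SQL;" + dtsx.stem,
             "/DestServer", SERVER],
            check=True,
        )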

How to automatically update MS-Access 2007 application

I have a front-end Access 2007 application which talks to a MySQL server.
I want to have a feature where the application on the user's computer can detect that there is a new version on the network (which is not difficult), download the latest version to the local drive, and launch it.
Does anybody have any knowledge or experience of how this can be done?
Thanks
Do you actually need to find out if there is a newer version?
We have a similar setup as well, and we just copy the frontend and all related files every time someone starts the application.
Our users don't start Access or the frontend itself. They actually start a batch file which looks something like this:
@echo off
rem Overwrite the local copy with the current frontend from the network share
xcopy x:\soft\frontend.mde c:\app\ /Y
rem Launch the freshly copied frontend
c:\app\frontend.mde
When we started writing our app, we thought about auto-updating as well and decided that just copying everything every time is enough.
We have enough bandwidth, so the copying doesn't create any performance problems (with about 200 users).
Plus, it makes some things easier for me as a developer when I can be sure that each time the application is started, the frontend is overwritten anyway.
I don't have to worry about auto-compacting the frontend when it's closed (and users complaining that closing the app takes too long...), and I don't have to deal with corrupted frontends after crashes.
@Lumis - concerning the custom icon:
Ok, maybe I should have made this more clear. There is only one batch file, and it's in the same network folder as the frontend.
The users just have links on their desktops which all point to the same batch file in the network folder.
This means that:
future changes to the batch file are easy, because it's only one single file in one central place
we can change the icon, because what the user sees is a normal Windows link
(By the way, we did not change the icon. Our app is for internal use only, and I'm working in a manufacturing company, which means that all but very few users are absolutely non-technical and couldn't care less about the icon, as long as it's the same on all machines and they know what it looks like so they can find it quickly on their desktop...)
Tony Toews has one: Access Auto FE Updater
It appears to be free, but I'm not 100% sure.
Lumis's option is solid; however, if you want to check the version and only copy the database when there is a new version, have a 'Version' field in a back-end table and a 'Version' constant in a front-end module. Keep these in sync with each new production release. Compare the table version against the version in the module when the main form of the front-end database opens.
If they don't match, have the database close, but have it call a batch file as the last bit of code to run as it's closing. The database should finish closing before the batch file begins its copy process. If needed, place a minor delay in the batch file code just to be sure there are no file-locking issues. A sketch of this flow follows.
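In the real application this would be VBA in the front end plus the batch file, but the control flow is simple enough to sketch in Python with a stand-in SQLite back end (the updater script name and the back-end table are hypothetical):

import sqlite3
import subprocess
import sys

FRONTEND_VERSION = 42  # the 'Version' constant baked into the front end

def backend_version():
    # Stand-in for reading the 'Version' field from a back-end table
    with sqlite3.connect("backend.db") as conn:
        (version,) = conn.execute("SELECT version FROM app_version").fetchone()
    return version

if backend_version() != FRONTEND_VERSION:
    # Launch the copy script, then exit so the frontend file can be overwritten
    subprocess.Popen([r"x:\soft\update_frontend.bat"])
    sys.exit(0)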

How does one properly cache/update data-driven iPhone apps that use remote databases?

My app is highly data-driven and needs to be frequently updated. Currently the MySQL database is dumped to an XML file via PHP, and when the app loads it downloads this file. Then it loads all the values into NSMutableArrays inside a data-manager class which can be accessed anywhere in the app.
Here is the issue: the XML file produced is about 400 KB, and this apparently takes several minutes to download on the EDGE network, and even for some people on 3G. So basically I'm looking for options on how to correctly cache or optimize my app's download process.
My current thought is something along the lines of caching the entire XML file on the iPhone's disk and then serving that data up as the user navigates the app, while loading the new XML file in the background. The problem with this is that the user will now always see the data from the previous run; also, it seems wasteful to download the entire XML file every time if only one field has changed.
TLDR: My iPhone app's download of data is slow, how would one properly minimize this effect?
I've had to deal with something like this in an app I developed over the summer.
What I did to solve it was an initial download of all the data from the server, which I placed in a database on the client along with a revision number.
Then, each time the user connects again, the client sends its revision number to the server. If that revision number is smaller than the server's revision number, the server sends across the new data (and only the new data); if it's the same, it does nothing.
It's fairly simple and it seems to work pretty well for me.
This method does have the drawback that your server has to do a little more processing than normal, but it's practically nothing and is much better than wasted bandwidth. A minimal client-side sketch of the scheme follows.
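Here is that client-side sketch in Python; the endpoint, JSON shape, and table layout are all assumptions made up for the example:

import json
import sqlite3
from urllib.request import urlopen

BASE_URL = "https://example.com/api/changes"  # hypothetical delta endpoint

def sync(db_path="cache.db"):
    conn = sqlite3.connect(db_path)
    # The client's current revision, stored locally alongside the data
    (rev,) = conn.execute("SELECT rev FROM meta").fetchone()
    # Ask the server for everything newer than our revision
    with urlopen(f"{BASE_URL}?since={rev}") as resp:
        delta = json.load(resp)  # assumed shape: {"rev": N, "rows": [...]}
    # Apply only the changed rows, then record the new revision
    for row in delta["rows"]:
        conn.execute(
            "INSERT OR REPLACE INTO items (id, value) VALUES (?, ?)",
            (row["id"], row["value"]),
        )
    conn.execute("UPDATE meta SET rev = ?", (delta["rev"],))
    conn.commit()
    conn.close()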
My suggestion would be to cache the data to a SQLite database on the iPhone. When the application starts, you sync the SQLite database with your remote database...while letting the user know that you are loading incremental data in the background.
By doing that, you get the following:
Users can use the app immediately with stale data.
You're letting the user know new data is coming.
You're storing the data in a more appropriate format.
And once the most recent data is loaded...the user gets to see it.