I have a small problem with my logback configuration and I wonder if what I want to do is possible.
I use logback with RollingFileAppender rolling my files on a daily basis.
It works very well, but all the files are recreated every day, even when they are empty.
Is there a way to tell logback to create a file only when something actually has to be written to it?
Thanks a lot,
Seb
We have been updating configuration files for standalone deployed applications inside a number of directories like this:
[wildfly]\modules\system\layers\base\com\\configurations\main\
But when we do so we create a copy of the previous version with a time and date stamp, for example:
appname\env-setup.properties
appname\env-setup_201912201230.properties
We have been advised that this is not a good idea, because the date-and-time-stamped files will also be read and result in values being set incorrectly.
Is this correct? Where can we learn more regarding how these settings are read?
The git repo for my Django app includes several .tsv files which contain the initial entries to populate my app's database. During app setup, these items are imported into the app's SQLite database. The SQLite database is not stored in the app's git repo.
During normal app usage, I plan to add more items to the database by using the admin panel. However I also want to get these entries saved as fixtures in the app repo. I was thinking that a JSON file might be ideal for this purpose, since it is text-based and so will work with the git version control. These files would then become more fixtures for the app, which would be imported upon initial configuration.
How can I configure my app so that any time I add new entries to the Admin panel, a copy of that entry is saved in a JSON file as well?
I know that you can use the manage.py dumpdata command to dump the entire database to JSON, but I do not want the entire database, I just want JSON for new entries of specific database tables/models.
I was thinking that I could hack the save method on the model to write a JSON representation of the item to a file, but I am not sure this is ideal.
Is there a better way to do this?
Overriding the save method for something that can fail, or that can take longer than it should, is not recommended. You usually override save only for simple, essential changes.
You could use signals, but in your case that may be more work than it's worth. You could instead write a function that does this for you, though it wouldn't run exactly when the data is saved to the database. Doing it immediately is possible, but it adds a lot of processing unless it's really that important for the file to stay up to date.
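If the signal route does turn out to be worth it, a minimal sketch could look like the following, assuming a hypothetical Entry model in an app called catalog (all names here are placeholders):

# signals.py -- append a JSON fixture record for each newly created Entry.
import json
from pathlib import Path

from django.core import serializers
from django.db.models.signals import post_save
from django.dispatch import receiver

from catalog.models import Entry  # placeholder app and model

FIXTURE_PATH = Path("fixtures/new_entries.json")

@receiver(post_save, sender=Entry)
def export_entry(sender, instance, created, **kwargs):
    # Only export newly created rows (e.g. added via the admin), not updates.
    if not created:
        return
    # serializers.serialize returns a JSON array string for the given objects.
    record = json.loads(serializers.serialize("json", [instance]))[0]
    fixtures = json.loads(FIXTURE_PATH.read_text()) if FIXTURE_PATH.exists() else []
    fixtures.append(record)
    FIXTURE_PATH.write_text(json.dumps(fixtures, indent=2))

The signals module has to be imported (typically from the app's AppConfig.ready()) for the receiver to be registered, and the resulting file loads like any fixture via manage.py loaddata. Note also that dumpdata can target specific models (python manage.py dumpdata catalog.Entry), though it re-dumps every row of the model rather than only new entries.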
I recommend using something like Celery to run a function in the background, separate from all of your Django request handling. You can call it on every data update, or on a schedule (each hour, for example), and have it update your backup file. You can even create a table to monitor the update process.
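For the background route, here is a rough sketch of a Celery task that rewrites the fixture file, which celery beat could run on a schedule (this assumes Celery is already configured for the project, and reuses the placeholder catalog.Entry model from above):

# tasks.py -- fixture refresh, run by celery beat or called ad hoc.
from celery import shared_task
from django.core import serializers

@shared_task
def refresh_entry_fixture():
    # Imported inside the task so the module can load before the app registry is ready.
    from catalog.models import Entry
    # Rewrite the whole fixture file from the current table contents.
    with open("fixtures/entries.json", "w") as f:
        serializers.serialize("json", Entry.objects.all(), stream=f)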
Which solution is best depends heavily on you and on how important the data is. Keep in mind that editing a file can be a heavy process too, so creating a backup on a fixed schedule, say daily, might be a better idea anyway.
I have an issue writing files to a Samba share. We don't seem to receive any error while writing the files, but a second later, when we check from a different process, no files have been written. This problem happens sporadically for about 5 or 10 minutes at a time, and then goes away.
The only clue we have is from Samba's logging: there are STATUS_OBJECT_NAME_COLLISION errors present. My understanding is that this means our software is trying to write a new file over a file that already exists. But what I don't understand is why, then, I see no files in that location at all after the process concludes. Could this error mean something else? Could it be caused by the configuration of the file share somehow?
Thank you.
The code STATUS_OBJECT_NAME_COLLISION may indicate an attempt to create a file which already exists while the overwrite option was not specified.
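As a rough local-filesystem analogy (not SMB-specific), the collision is the difference between exclusive-create and overwrite semantics. In Python terms:

# Exclusive create ("x") fails if the file already exists, roughly what an SMB
# create with the FILE_CREATE disposition does, and what surfaces as
# STATUS_OBJECT_NAME_COLLISION on the server. "report.csv" is a placeholder name.
try:
    with open("report.csv", "x") as f:
        f.write("data\n")
except FileExistsError:
    # Overwrite semantics ("w" mode, like FILE_OVERWRITE_IF) replace the file instead.
    with open("report.csv", "w") as f:
        f.write("data\n")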
Is your software renaming any files on the destination?
I need to make a backup system for my Rails app, but it has to be a little special: rather than backing up all the database info and files into a single file or folder, it has to back up the database info and attachment files per user. That is, each of these backups should be able to regenerate all the information and files for one single user.
My questions are:
Is this possible? What's the best way to do it? And if it's impossible, or a bad idea altogether, why?
Note: The database is a MySQL one.
Note 2: I use Paperclip for the users' uploads.
I'm guessing you have an app that backs up data when a user clicks on something, right? I'm thinking you could get all the info connected to the user (this depends on how you built your user model, so maybe you should have a get_all_info method), then write it out in SQL format to a file which you save as .sql (using either File.new or Logger.new).
I would dump the entire user object and its related objects into a single XML file. As you go through the creation of the XML, grab out all the attachment files and write the XML plus all the files into one directory, then compress it.
I think there are definitely use cases to have a feature like this, but be sure to have it run in a background process and only when needed in order to not bog down the web server. Take a look at http://github.com/tobi/delayed_job or http://github.com/defunkt/resque.
Several times now, Eclipse has deleted files for me, seemingly at random; they then appear under the 'Local History' option.
What is going on? I'm definitely not just deleting things by mistake.
Most recently it deleted my template files under html-template, which are quite important!
I have an AIR project and a web project that references the src directory inside the AIR project. Usually I close one project while working on the other.
FYI: currently my backup solution is Windows Home Server, which means I have to go home to find a file if it's lost in some other fashion and isn't in Local History. Yes, I do plan to rectify that!
Under Local History you can find the previous versions of your files after you modify them.
Did you perhaps set this folder as the output folder for compilation? Eclipse could then clear the files during a build.
I suspect it is an external application that is deleting or moving your files. Eclipse's Local History simply keeps a copy of your files for quick reverting later.
I suggest trying a different IDE for a while, like NetBeans, and seeing if the files are still being deleted. Eclipse probably isn't the culprit, as those files would be in Local History even if they had not been deleted.
I am trying to fix an issue like this myself. I find that when I open the files that have been deleted in another text editor, like GEdit, they look like they have been corrupted. I hadn't previously noticed that Eclipse kept them in Local History, thank you for that. I had been using GitHub for backups before and restoring from that.
If I find that switching to another IDE fixes it, or learn anything else, I will update this post.