Log audit file when updating MySQL from Perl - mysql

I want a Perl script that will write to a data file every time the MySQL database is updated. I don't mind the growth of the file, since every audited item will be stored separately.
Thank you, I will appreciate your help.

The Log::Log4perl module provides many different ways to log events to many types of output, including files. It also lets you set logging levels so you can turn this off if you need to.
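For illustration, here is a minimal sketch of that idea, assuming DBI with a local MySQL database; the audit.log file name, the connection details, and the items table are placeholders rather than anything from the question.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Log::Log4perl qw(:easy);

# Append one line to audit.log for every update we run.
# The connection details and table/column names are made up.
Log::Log4perl->easy_init({ level => $INFO, file => ">>audit.log" });
my $logger = Log::Log4perl->get_logger();

my $dbh = DBI->connect("DBI:mysql:database=mydb;host=localhost",
                       "user", "password", { RaiseError => 1 });

my $sth  = $dbh->prepare("UPDATE items SET price = ? WHERE id = ?");
my $rows = $sth->execute(9.99, 42);

# One log entry per audited update; the file simply keeps growing,
# which matches what the question asks for.
$logger->info("UPDATE items id=42 price=9.99 rows_affected=$rows");

$dbh->disconnect;
```

Raising the level above INFO in the easy_init call would silence the audit lines without touching the rest of the script.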

Related

Export query from MySQL to Redshift

I need to load the result of a specific query into Redshift daily. I've already created a table on Redshift that will hold the results of this query, but now I'm a little stuck since I can't find a good way to solve this.
So far I've tried using Python, but I'm getting lots of headaches regarding line terminators in fields that basically store a description, and character encodings.
I know lots of programs that let you connect to a DB and run queries also have an export-to-CSV option, but since I need to do this automatically every day, I don't think any of those would work for me.
Now I would like to know if there are better-suited options so I can start looking into them. I'm not asking for a step-by-step how-to, just for tools/programs/etc. that I should start looking into.
You should look into MySQL's stored procedures and events -- using just MySQL, you can have it generate a file every day.
You can't dynamically rename the file, or overwrite it, though, so you'd need a second job which deletes the file -- this can be done with Python.
Whether you're running Windows or Linux, you should be able to schedule a batch file or python script to execute once a day, and that would be an alternate way to do that.
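If you go the scheduled-script route, a sketch like the following would cover the MySQL side; the query, connection details, and file naming are invented, and loading the resulting CSV into Redshift (for example with a COPY from S3) would still be a separate step not shown here.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV;
use POSIX qw(strftime);

# Sketch of the scheduled-script alternative: run the query once,
# write the result to a date-stamped CSV, and hand it off to a
# separate Redshift load step. Query and connection details are invented.
my $dbh = DBI->connect("DBI:mysql:database=mydb;host=localhost",
                       "user", "password", { RaiseError => 1 });

my $csv  = Text::CSV->new({ binary => 1, eol => "\n" });
my $file = strftime("export-%Y-%m-%d.csv", localtime);
open my $fh, ">", $file or die "Cannot open $file: $!";

my $sth = $dbh->prepare(
    "SELECT id, description, amount FROM sales WHERE sale_date = CURDATE()"
);
$sth->execute;

# Text::CSV quotes embedded newlines and commas, which avoids the
# line-terminator problems mentioned in the question.
$csv->print($fh, $sth->{NAME});            # header row
while (my $row = $sth->fetchrow_arrayref) {
    $csv->print($fh, $row);
}

close $fh;
$dbh->disconnect;
```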
Does this address your question?

Insert CSV file into MySQL with user ID

I'm working on a membership site where users are able to upload a CSV file containing sales data. The file will then be read and parsed, and the data will be used to dynamically generate charts.
My question is how to handle this CSV upload. Should it be uploaded to a folder and stored for later, or should it be inserted directly into a MySQL table?
It depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data's parsed, then do the upload-now-and-deal-with-it-later option.
It depends on how you think the users will use this site.
What do you estimate the size of the files for these users to be?
How often would they (if ever) upload a file twice? Can they download the charts?
If the files are small and more for one-off use, you could upload and process them on the fly; if they require repeated access and analysis, you will save your users time by importing the data into the database.
The LOAD DATA INFILE command in MySQL handles uploads like that really nicely. If you create the table you want to load into and then use that command, it works great and is super quick; I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
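As a rough illustration of that approach through Perl's DBI, assuming the uploaded file has already been saved to disk; the table name, column list, and file path are placeholders, and mysql_local_infile has to be enabled on both the client and the server for LOCAL to work.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Sketch of loading an uploaded CSV with LOAD DATA LOCAL INFILE via DBI.
# Table name, column list and file path are placeholders for your schema.
my $dbh = DBI->connect(
    "DBI:mysql:database=mydb;host=localhost;mysql_local_infile=1",
    "user", "password", { RaiseError => 1 }
);

my $file = "/tmp/upload_12345.csv";    # hypothetical uploaded file

$dbh->do(qq{
    LOAD DATA LOCAL INFILE '$file'
    INTO TABLE sales_data
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (user_id, sale_date, amount)
});

$dbh->disconnect;
```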

Importing multiple XML files

I'm after a bit of advice here if possible.
I basically have 12 XML files which I need to use in my MySQL DB. These XML files all have completely different structures, and the data changes constantly.
With this in mind, what would be the best approach for bringing the data in, using it, and updating it? I had thought of using a cron job to execute a PHP file that writes each of these into their own table, but bearing in mind there are 12 files with around 60 lines in each file and the cron job will need to run every 15 minutes, I think this will end up killing the server.
Any ideas on a solution would be gratefully appreciated.
Thanks
Richard
You can use the LOAD XML command in MySQL to do this:
http://dev.mysql.com/doc/refman/5.5/en/load-xml.html
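A rough sketch of the cron-driven variant, using Perl and DBI rather than PHP just to keep the examples in one language; the file paths, table names, and the ROWS IDENTIFIED BY tag are placeholders for your real structures.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Sketch of the cron-driven import: truncate each staging table and
# re-load its XML file with LOAD XML. All names here are invented.
my $dbh = DBI->connect(
    "DBI:mysql:database=mydb;host=localhost;mysql_local_infile=1",
    "user", "password", { RaiseError => 1 }
);

my %feeds = (
    "/data/feed01.xml" => "feed01",
    "/data/feed02.xml" => "feed02",
    # ... one entry per XML file, up to feed12
);

while (my ($file, $table) = each %feeds) {
    $dbh->do("TRUNCATE TABLE $table");
    $dbh->do(qq{
        LOAD XML LOCAL INFILE '$file'
        INTO TABLE $table
        ROWS IDENTIFIED BY '<row>'
    });
}

$dbh->disconnect;
```

With only around 60 rows per file, a full truncate-and-reload every 15 minutes should be a very light load.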
If the data changes constantly in the XML files, how often does it really need to be up to date in the database? If the database is just a backing record for the XML, then maybe you can update once or twice an hour.

How to periodically extract data from a CSV file?

I'm currently working on some QA projects. I am running tests (which can vary from a couple of minutes to 2-3 days) in an application that generates some CSV files and updates them periodically, adding a new row with each update (once every couple of seconds or so).
Each CSV file is structured like this:
Header1,Header2,Header3,.................,HeaderN
numerical_value11,numerical_value12,numerical_value13,......,numerical_value1N,
numerical_value21,numerical_value22,numerical_value23,......,numerical_value2N,
etc
The number of columns may vary from csv file to csv file.
I am running in a Windows environment. I also have Cygwin (http://www.cygwin.com/) installed.
Is there a way I can write a script that runs periodically (once per hour or so), extracts data (a single value or multiple values from a row, or the average of the values from specific rows added to the CSV between interrogations) and sends some email alerts if, for example, the data from one column is out of range?
Thx
This can be done in several ways. Basically, you need to
1) Write a script, in Perl or Python for example, that does one iteration of what you want it to do (a minimal Perl sketch follows below).
2) Use the Windows Task Scheduler to run this script at the frequency that you want. The scheduler is very easy to set up from the Control Panel.
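Here is one possible shape for step 1, assuming Perl with the CPAN modules Text::CSV and MIME::Lite; the file path, the watched column, the allowed range, the addresses and the SMTP host are all placeholders.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Text::CSV;
use MIME::Lite;

# One iteration of the check, meant to be run hourly by the Windows
# Task Scheduler (or cron under Cygwin). Path, column index, range
# and addresses are placeholders.
my $file      = 'C:/results/test_run.csv';
my $col       = 2;          # hypothetical column to watch (0-based)
my ($lo, $hi) = (10, 100);  # hypothetical allowed range

my $csv = Text::CSV->new({ binary => 1 });
open my $fh, "<", $file or die "Cannot open $file: $!";
my $header = $csv->getline($fh);

my @bad;
while (my $row = $csv->getline($fh)) {
    my $val = $row->[$col];
    next unless defined $val && $val =~ /^-?\d+(\.\d+)?$/;
    push @bad, $val if $val < $lo || $val > $hi;
}
close $fh;

if (@bad) {
    my $msg = MIME::Lite->new(
        From    => 'monitor@example.com',
        To      => 'qa-team@example.com',
        Subject => "Out-of-range values in $header->[$col]",
        Data    => "Values outside [$lo, $hi]: @bad\n",
    );
    # Assumes an SMTP relay is reachable; adjust for your mail setup.
    $msg->send('smtp', 'smtp.example.com');
}
```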
Using Windows' scheduler, you can very easily get the interval part down; for the parsing and alerting, however, you have a few options. I myself would use C# to write the program. If you want an actual script, VBA is a viable choice and could very easily parse a basic CSV file and send an email. If you already have Office installed, this should give you some more detail. Hope that helps.

Data sync solution?

Due to some security issues, I'm in an environment where third-party apps can't access my DB. For this reason I need some service/tool/script (I don't know what yet... I'm open to the best option, and still reading to see what I'm going to do...)
which lets me generate, on a regular basis (daily, weekly, monthly), a CSV file with all new/modified records for a certain application.
I should be able to automate this process and also export a new file at any time.
So it should keep track, for each application, of which records it still needs.
Each application will need the data in some particular format (CSV/XLS/SQL), and some fields will be needed by some applications but not by others... It should be fairly flexible...
What is the best option for me? Creating some custom tables for each application and extracting the modified data based on those?
I think your best bet here, assuming you have access to the server to set this up, is to write a small command-line program that can do the relatively simple task you need. Languages like Perl are good for this sort of thing, I believe.
Once you have that 'tool' made, you can schedule it through the server's OS to run every set amount of time: either a scheduled task on a Windows server or a cron job on a Linux server.
You can also (without having to set up the scheduled task, if you don't want to or can't) make this small command-line application callable via CGI, which is a way of letting applications on the server be executed on demand by a web user. If you do enable this, though, I suggest you add some sort of locking so that it can only be run every so often and can't be run five times at once.
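A rough sketch of such a tool in Perl, assuming DBI and Text::CSV from CPAN; it keeps the time of its last run in a state file, exports only rows modified since then, and takes a flock-based lock so overlapping runs (cron, CGI, or manual) cannot step on each other. Every table, column, and path here is invented for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
use DBI;
use Text::CSV;

# Refuse to run if another copy already holds the lock.
open my $lock, ">", "/tmp/export.lock" or die "Cannot open lock file: $!";
flock($lock, LOCK_EX | LOCK_NB) or die "Another export is already running\n";

# Read the timestamp of the previous run (defaults to the epoch).
my $state_file = "/var/lib/export/last_run.txt";
my $last_run   = "1970-01-01 00:00:00";
if (open my $in, "<", $state_file) {
    chomp($last_run = <$in>);
    close $in;
}

my $dbh = DBI->connect("DBI:mysql:database=mydb;host=localhost",
                       "user", "password", { RaiseError => 1 });

my $sth = $dbh->prepare(
    "SELECT id, name, updated_at FROM records WHERE updated_at > ?"
);
$sth->execute($last_run);

# Write only the rows changed since the last run.
my $csv = Text::CSV->new({ binary => 1, eol => "\n" });
open my $out, ">", "export.csv" or die "Cannot open export.csv: $!";
$csv->print($out, $sth->{NAME});
while (my $row = $sth->fetchrow_arrayref) {
    $csv->print($out, $row);
}
close $out;

# Record the time of this run for the next invocation.
my ($now) = $dbh->selectrow_array("SELECT NOW()");
open my $st, ">", $state_file or die "Cannot write $state_file: $!";
print $st "$now\n";
close $st;

$dbh->disconnect;
```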
EDIT
You might also want to look into database replication or adding read-only users; that saves a whole lot of messing around. Try to find a solution that does not split or duplicate data. You can set up users that are only able to access certain parts of the database system in certain ways, such as SELECT-only access to the data.
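For completeness, the read-only-user idea boils down to something like the following, run from an administrative connection; the account name, host, and database are placeholders.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Create an account that can only SELECT from the one database the
# third-party app needs. All names and credentials here are made up.
my $dbh = DBI->connect("DBI:mysql:database=mysql;host=localhost",
                       "root", "rootpassword", { RaiseError => 1 });

$dbh->do(q{CREATE USER 'thirdparty'@'app.example.com' IDENTIFIED BY 'secret'});
$dbh->do(q{GRANT SELECT ON mydb.* TO 'thirdparty'@'app.example.com'});

$dbh->disconnect;
```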