Basically, I have a web app which contains items, and people can comment on them and say what they think about them, just like YouTube videos or any other similar website.
Question
Where is the best place to save these comments? Do I create a table in the MySQL database and save them there, or do I save them in a .txt file and store the location of the file in the database? I would really appreciate it if someone could tell me which is better from a performance perspective, or whether there is a better alternative.
Much appreciated.
Save the comments directly in the database.
Adding a comment to a file and saving the path in the database is not good; you are adding extra work.
Reading from the database is easier and more professional, and it is what is used everywhere, YouTube included.
There is also the chance that someone deletes your file and you lose your data.
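For example, here is a minimal sketch of the database approach, assuming a PHP + MySQL stack (the question does not say which server-side language is in use); the table and column names are only illustrative:

<?php
// Illustrative schema, created once:
//   CREATE TABLE comments (
//     id         INT AUTO_INCREMENT PRIMARY KEY,
//     item_id    INT NOT NULL,
//     author     VARCHAR(100) NOT NULL,
//     body       TEXT NOT NULL,
//     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
//     INDEX (item_id)
//   );

$pdo = new PDO('mysql:host=localhost;dbname=myapp;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Example values; in the real app these come from the request/session.
$itemId = 42;
$author = 'alice';
$body   = 'Nice item!';

// Save a new comment.
$stmt = $pdo->prepare('INSERT INTO comments (item_id, author, body) VALUES (?, ?, ?)');
$stmt->execute(array($itemId, $author, $body));

// Read back the latest comments for the item, newest first.
$stmt = $pdo->prepare('SELECT author, body, created_at FROM comments WHERE item_id = ? ORDER BY created_at DESC LIMIT 20');
$stmt->execute(array($itemId));
$comments = $stmt->fetchAll(PDO::FETCH_ASSOC);

With an index on item_id, fetching the comments for one item stays fast even with many rows, which is exactly the kind of query that flat files make painful.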
Clearly the database. Databases were built to replace the unpleasant storage of data in flat files, not least for performance reasons.
You need to save the comments in the database. It is easier to query and sort the comments later on, and compared with saving them as .txt files, it is also safer.
Related
I am building a website with Ruby on Rails 3. I need to provide a page where each of my clients can edit their pricing info for the application. I am quite confused about how to do this. The pricing page needs to be displayed as an HTML table with different columns containing the pricing info.
I am thinking of different ways to do this.
1) Allow the client to create and upload an HTML page, save it as a file in my public directory, and render it as and when the client clicks on the pricing link.
2) The clients may not have even basic technical knowledge, so let them upload some other format like Word or Excel, then parse it and store it as an HTML file in the public directory.
3) Provide the client with some real-time editing tool in which they can edit in a fixed format, then save the file and render it later.
Also, I would rather not store this info in my database. There will be quite a large number of clients, so managing all of this data in my database would become cumbersome. Storing it all as plain HTML files and rendering them later would be the most practical thing for me.
There might be other, better ways of doing this as well. Could you please suggest which might be better, or any other option that could suit my needs? Basically, I want my clients to have a mechanism where they can provide their pricing details, edit them later, and display them back as an HTML table, all without using a database backend. Any suggestions would be much appreciated.
A good way is Excel (CSV format).
You can work with CSV/Excel data from PHP. I think this is the best solution for your requirement.
Try this.
http://php.net/manual/en/function.fgetcsv.php
If you are giving the user the authority to edit the content and you have to use CSV or Excel, please see these links:
Importing CSV and Excel
Exporting CSV and Excel
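For what it's worth, here is a rough sketch of what the fgetcsv() approach could look like (the file name and columns are made up, and since your app is Rails you would more likely reach for Ruby's CSV library, but the idea is the same):

<?php
// Hypothetical pricing.csv uploaded by the client; the first row is the header:
//   Plan,SMS Pack,Price
//   Basic,10000,99
//   Premium,25000,199
$handle = fopen('pricing.csv', 'r');
if ($handle === false) {
    die('Could not open the pricing file');
}

echo "<table>\n";
$isHeader = true;
while (($row = fgetcsv($handle)) !== false) {
    $tag = $isHeader ? 'th' : 'td';   // render the first row as column headings
    echo '<tr>';
    foreach ($row as $cell) {
        echo "<$tag>" . htmlspecialchars($cell) . "</$tag>";
    }
    echo "</tr>\n";
    $isHeader = false;
}
echo "</table>\n";
fclose($handle);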
If you really don't want to use a database, then you can use YAML as structured storage.
e.g. (you could most probably come up with a better structure)
SMS_Pack:
  Sl_No:
    1: 10000
    2: 25000
    3: 50000
You can read those .yml files and parse them as hashes. It should be fairly easy to represent such a hash as an HTML table.
For creating them, I'm sure you can come up with some dynamic form input, or just let the client upload this kind of file (which might not be the best solution).
But it just might be easier to manage all of this information within a database.
I'm new to this subject, so this might be a silly question for most of you. I have a simple server which several users will access. If any of them changes a CSS property of an element, the others should be able to see the change in real time.
Should I use something like Node.js for this? How do I save the changes the users make?
The page would look something like this: http://stom89.dyndns.org/
Thanks!
I guess what you want to change in your CSS/HTML are states, like whether a lamp is on or off. Then you need to save each state in a MySQL DB and just grab the data for each user. If you want it to look like real time for online users, then use JS (Ajax) to sync the data regularly.
An alternative way without a DB would be to use files.
If you don't want to use MySQL for this, you can use files. I suggest using INI files. For more on how to read/write INI files, you can visit this question. It's super simple, and you'll be able to have each variable in a nifty array.
What you need: a bit of PHP, a little bit of jQuery (or plain JS), and an understanding of GET variables.
I suggest you create three files:
index.php: Your main page, which is the client. It pulls info using GET variables; you can use jQuery.get() for this.
getstate.php: The file which reads the INI file and returns the states for each device. Read them with jQuery.get() from index.php.
savestate.php: The file you send the new states to from index.php. Example request: http://address.goes.here/savestate.php?bedroomlight=1&garagelight=0
What's even more interesting is that INI files can be written and read easily by many programming languages, so you can also manipulate the data from your Raspberry Pi (say someone turns off a light, a script polling the state could update the INI file).
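A rough sketch of what the two PHP endpoints could look like (the key names and the states.ini path are just placeholders, and a real version should validate the input and restrict who may call savestate.php):

<?php
// getstate.php - returns the current states as JSON for index.php to poll.
// states.ini might look like:
//   bedroomlight = 1
//   garagelight = 0
header('Content-Type: application/json');
$states = parse_ini_file('states.ini');   // associative array, or false on failure
echo json_encode($states ? $states : array());

<?php
// savestate.php - merges the GET parameters into the INI file,
// e.g. savestate.php?bedroomlight=1&garagelight=0
$file   = 'states.ini';
$states = parse_ini_file($file);
if ($states === false) {
    $states = array();
}

foreach ($_GET as $key => $value) {
    $states[$key] = (int) $value;          // keep the values as simple 0/1 flags
}

$lines = array();
foreach ($states as $key => $value) {
    $lines[] = "$key = $value";
}
file_put_contents($file, implode("\n", $lines) . "\n");
echo 'OK';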
I think you would need to use an SQL database and have JavaScript detect changes and update through Ajax. That's my best idea.
If I understand your question correctly, I have been messing with this subject for some time. I would suggest looking at Python, Ruby, or Node.js, though I could not say which is the easiest for you to learn. I would lean towards Python and a Comet server (APE, for example), and simply have the server push the updates to the users who are already on the site.
Edit:
Suggestions for polling: jQuery
http://api.jquery.com/jQuery.get/ for standard data retrieval, which is about all you will need.
I need to create a daily process which pulls market data from a website http://www.apxendex.com/index.php?id=137.
I had originally created an Excel+VBA sheet which added the data to an XML file, and this worked fine. However, the machine I will be putting the code on doesn't have Excel, so all my work was pointless (stupid, I know).
I'm not looking for someone to write anything for me, but some general tips on where to start, what language to go for etc. would be helpful.
Thanks
How did you get your information from the website?
There is a DataCapture API for that site. You can find the manual on the site (you have to log in to see it). I guess the easiest way would be to go the recommended route and use that API.
I need to make a backup system for my Rails app, but it has to be a little special: it shouldn't back up all the database info and files into a single file or folder; instead, it has to back up the database info and attachment files per user. I mean, each of these backups should be able to regenerate all the information and files for one single user.
My questions are:
Is this possible? What's the best way to do it? And if it's impossible, or a bad idea altogether, why?
Note: The database is a MySQL one.
Note 2: I use Paperclip for the users' uploads.
I'm guessing you have an app that backs up data when a user clicks on something, right? I'm thinking: get all the info connected to the user (it depends on how you built your user model, so maybe you should have a get_all_info method), then write it out in SQL format to a file which you save as .sql (using either File.new or Logger.new).
I would dump the entire user object and related objects into a single XML file. As you go through creating the XML, grab all the attachment files and write the XML plus all the files into one directory, then compress it.
I think there are definitely use cases for a feature like this, but be sure to run it in a background process, and only when needed, so it doesn't bog down the web server. Take a look at http://github.com/tobi/delayed_job or http://github.com/defunkt/resque.
What is the best way to create an archive of image documents in the database?
Given that we have about 2-10 million records, and each record includes 2-4 images and about 20 text fields, what is the best way to create this archive so that we get good speed and high security for the data?
Also, what database is good for this project?
Definitely use the file system as Minor suggested.
One option is SQL Server FILESTREAM. See http://msdn.microsoft.com/en-us/library/cc949109.aspx.
Use file system storage for the archive images, and save a link to each image file in the DB. If you serve the content over HTTP, you can use a caching proxy server such as Squid, Nginx, etc.
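A rough sketch of that approach, assuming PHP and a MySQL metadata table purely for illustration (the table, column, and path names are made up):

<?php
// Illustrative metadata table:
//   CREATE TABLE archive_images (
//     id         BIGINT AUTO_INCREMENT PRIMARY KEY,
//     record_id  BIGINT NOT NULL,
//     file_path  VARCHAR(255) NOT NULL,
//     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
//     INDEX (record_id)
//   );

function store_image(PDO $pdo, $recordId, $uploadedTmpPath)
{
    // Shard by content hash so no single directory ends up with millions of files.
    $hash = sha1_file($uploadedTmpPath);
    $dir  = '/srv/archive/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
    if (!is_dir($dir)) {
        mkdir($dir, 0750, true);
    }

    $path = $dir . '/' . $hash . '.jpg';
    rename($uploadedTmpPath, $path);       // move the file into the archive tree

    // Only the path (the "link") goes into the database; the text fields
    // stay in the main record table.
    $stmt = $pdo->prepare('INSERT INTO archive_images (record_id, file_path) VALUES (?, ?)');
    $stmt->execute(array($recordId, $path));

    return $path;
}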
More questions for you:
How dynamic is the data? Do you store it once and never change it, or does it change frequently?
Do you need versioning for the documents, or does the latest version simply overwrite the previous one?
Are the documents always edited using one application, or can they be changed outside it (e.g. using Word)?
Are the documents related to other "non-document" data (database rows), or are they the only thing you need to store?
File system won't offer any real security, so I would discount that straight off.
In Oracle there is built-in image support through the ORDImage type.
Check out Marcel's blog; he and the Piction company do a lot of work in this area, and he has lots of useful material to download.
You can use controlled downloads. Look at http://kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/lang/en/
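Roughly, the idea from that article looks like this (a sketch assuming Nginx in front of PHP; the /protected/ location and paths are placeholders):

<?php
// download.php - controlled download via Nginx's X-Accel-Redirect header.
// Nginx needs an internal location mapped to the real storage directory, e.g.:
//   location /protected/ { internal; alias /srv/archive/; }

// ... check here that the logged-in user is allowed to download this file,
//     e.g. by looking the path up in the database ...

$fileName = isset($_GET['file']) ? basename($_GET['file']) : '';  // basename() strips directory tricks

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $fileName . '"');
// Nginx intercepts this header and serves the file itself,
// so PHP never has to stream the bytes.
header('X-Accel-Redirect: /protected/' . $fileName);

That way your application code still decides who may see which image, but the heavy lifting of sending the bytes is done by the web server.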