I have a table in MySQL where I have fields like name, location, description and picture.
What I want to do is store multiple picture links in the picture column.
Is there a way of doing that without creating a separate table for picture?
Thank you
Well, you need to perform some sort of serialization to do that. I used to do this before I moved to document-oriented databases. Quite possibly your best option is to store everything in JSON format, as it is pretty universal and I can't think of any language that can't parse it back into an object, array, dictionary or whatever the language requires. Assuming you need to save file names such as somefile.png, you could store ["image1.png","image2.png","image3.png"] and so on. If you want to store a blob, however, it's a bit more complicated: you either have to create a second table, or read the contents of each image, convert it to base64, load all the base64 strings into an object and then serialize that into JSON. I wouldn't recommend the latter, as each operation would cost a lot of system resources.
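For illustration, here is a rough PHP sketch of the file-name approach; the places table, picture column, $pdo connection and $placeId variable are placeholders, not taken from your schema:

// Store the list of file names as a JSON array in the picture column.
$pictures = ["image1.png", "image2.png", "image3.png"];
$stmt = $pdo->prepare("UPDATE places SET picture = ? WHERE id = ?");
$stmt->execute([json_encode($pictures), $placeId]);

// Read it back and turn it into a PHP array again.
$stmt = $pdo->prepare("SELECT picture FROM places WHERE id = ?");
$stmt->execute([$placeId]);
$pictures = json_decode($stmt->fetchColumn(), true); // ["image1.png", ...]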
Related
I am trying to understand why JSON is widely used for data transfer between client and server. I understand that it offers a simple design which is easy to understand. However, consider the following:
A JSON string includes repeated data; e.g., in the case of a table, the column names (keys) are repeated in each object. Would it not be wiser to send the columns as the first object and the rest of the objects as just the data (without column/key information) from the table?
Once we have a JSON object, searching based on keys is expensive (in time) compared to indexes. Imagine a table with 20-30 columns; doing this search for each key in each object would cost a lot more time compared to using indexes directly.
There may be many more drawbacks and advantages; add them here if you know one.
I think if you want data transfer then you want a table-based format. The JSON format is not a table-based format like standard databases or Excel. This can complicate analyzing data if there is a problem, because someone will usually use Excel for that (sorting, filtering, formulas). Building test files will also be more difficult, because you can't simply use Excel to export to JSON.
But if you wanted to use JSON for data transfer, you could basically build a JSON version of a CSV file. You would only use arrays.
Columns: ["First_Name", "Last_Name"]
Rows: [
["Joe", "Master"],
["Alice", "Gooberg"]
.... etc
]
Seems messy to me though.
If you wanted to use objects, then you would have to embed column names with every bit of data, which in my opinion indicates a wrong approach.
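If it helps, a minimal PHP sketch of encoding and decoding that columns/rows layout (all the data here is invented):

// Columns once, rows as plain arrays - a JSON version of a CSV file.
$columns = ["First_Name", "Last_Name"];
$rows = [
    ["Joe", "Master"],
    ["Alice", "Gooberg"],
];
$payload = json_encode(["columns" => $columns, "rows" => $rows]);

// On the receiving side, rebuild keyed records only if they are needed.
$data = json_decode($payload, true);
$records = [];
foreach ($data["rows"] as $row) {
    $records[] = array_combine($data["columns"], $row);
}
// $records is now [["First_Name" => "Joe", "Last_Name" => "Master"], ...]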
I need your help with a DB question on MySQL. I have a table containing spots, with several properties for each spot, and a user table containing properties for each user. How do I store in the database the spots a user has subscribed to?
Currently I have a field containing a JSON array, which I decode and re-encode each time I need to access and modify it. But a plain MySQL query can't work with it, so the complexity grows very quickly...
How would you structure your DB in such a situation?
Put your fields in different columns in a table instead of just one column holding the JSON array.
You can then use a join operation on both tables. That way you won't need to decode and re-encode your JSON array.
The downside is that it will take up more storage space.
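As a rough sketch of the join approach (the table and column names below are assumptions, not taken from your schema), a linking table with one row per subscription lets you fetch a user's spots in a single query:

// Assumed schema: user(id, ...), spot(id, ...), and a linking table
// user_spot(user_id, spot_id) with one row per subscription.
$sql = "SELECT s.*
        FROM spot AS s
        JOIN user_spot AS us ON us.spot_id = s.id
        WHERE us.user_id = ?";
$stmt = $pdo->prepare($sql);
$stmt->execute([$userId]);
$subscribedSpots = $stmt->fetchAll(PDO::FETCH_ASSOC);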
Hope that helps. :)
Let's assume I need to store an unknown amount of data within a database table. I don't want to create extra tables, because this will take more time to get the data. The amount of data can vary.
My initial thought was to store the data in a key1=value1;key2=value2;key3=value3 format, but the problem is that a value can contain ; in its body. What is the best separator in this case? What other methods can I use to store various data in a single row?
An example row would be data=2012-05-14 20:07:45;text=This is a comment, but what if I contain a semicolon?;last_id=123456, from which I can then build, in PHP, an array with the corresponding keys and values after correctly exploding the row text with a separator.
First of all: you never, ever store more than one piece of information in a single field if you need to access the pieces separately or search by one of them. This has been discussed here quite a few times.
Assuming you always want to access the complete collection of information at once, I recommend using the native serialization format of your development environment: e.g. if it is PHP, use serialize().
If it is cross-platform, JSON might be the way to go: good JSON encoding/decoding libraries exist for pretty much every environment out there. The same is true for XML, but in this context the textual overhead of XML is going to bite a bit.
On a side note: are you sure that storing the data in additional tables is slower? You might want to benchmark that before finally deciding.
Edit:
After reading that you use PHP: if you don't want to put it in a table, stick with serialize() / unserialize() and a MEDIUMTEXT field. This works perfectly; I do it all the time.
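Something along these lines (a minimal sketch; the records table, payload column and $pdo connection are invented for the example):

$data = ["key1" => "value1", "key2" => "value2"];

// Write: serialize the whole collection into the single MEDIUMTEXT column.
$stmt = $pdo->prepare("UPDATE records SET payload = ? WHERE id = ?");
$stmt->execute([serialize($data), $id]);

// Read: unserialize it back into a PHP array.
$stmt = $pdo->prepare("SELECT payload FROM records WHERE id = ?");
$stmt->execute([$id]);
$data = unserialize($stmt->fetchColumn());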
EAV (cringe) is probably the best way to store arbitrary values like you want, but it sounds like you're firmly against additional tables for whatever reason. In light of that, you could just save the result of json_encode in the table. When you read it back, just json_decode it to get it back into an array.
Keep in mind that if you ever need to search for anything in this field, you're going to have to use a SQL LIKE. If you never need to search this field or join it to anything, I suppose it's OK, but if you do, you've totally thrown performance out the window.
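To make the caveat concrete, a search inside such a field ends up as pattern matching over text (the table and column names here are made up):

// This cannot use a normal index, so it scans the whole table.
$stmt = $pdo->prepare("SELECT id FROM records WHERE payload LIKE ?");
$stmt->execute(['%"last_id":123456%']);
$ids = $stmt->fetchAll(PDO::FETCH_COLUMN);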
You could use quotes to separate them:
key1='value1';key2='value2';key3='value3'
If that doesn't suit you, give us your SQL example and we can see how to do it.
I have an application that modifies a table dynamically (think spreadsheet). Upon saving the form (which the table is part of), I store that changed table (with the user's modifications) in a database column named html_Spreadhseet, along with the rest of the form data. Right now I'm just storing the HTML as plain text with basic escaping of characters...
I'm aware that this could be stored as a separate file; the source table (html_workseeet) already is. But from a data-handling perspective it's easier to save the changed HTML table to and from a column, so as to avoid having to come up with a file management strategy (which folder will this live in, the folder must now be included in backups, security policies now need to apply to files, how to sync DB security with the file system, etc.). To minimize these issues I'm only storing the ... part in the database column.
My question is: should I gzip the HTML, maybe use JSON, or some other format to easily store and retrieve the HTML from the database column? What is the best practice for storing HTML content in a database? Or should I just store it as I currently do, as an escaped text column?
If what you are trying to do is save the HTML for redisplay, what's wrong with saving it as is, then just retrieving it via a stored proc and re-displaying it when needed?
Say you have an HTML page which can select some kind of ID from a list, either on a ThickBox page or from a select option.
Normally for this kind of situation you would query the DB via Ajax, possibly returning JSON, or not.
The result sent back to the Ajax call will be your resultant data.
Then you replace the div which holds your spreadsheet with the spreadsheet from the DB.
So, in answer to your original question, you could store the spreadsheet with some sort of ID, storing it as the HTML of the div.
When you retrieve it, you merely replace the div's HTML with what you have stored.
It depends on the size of the HTML. You could store it as a binary BLOB after zipping it. Most of the time it is just best to store the data directly after escaping the SQL characters that may cause problems.
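If you go the zip route, the PHP side could look roughly like this (the forms table, html_spreadsheet column and $pdo handle are placeholders; gzcompress needs the zlib extension):

// Compress the HTML and store it in a BLOB/MEDIUMBLOB column.
$stmt = $pdo->prepare("UPDATE forms SET html_spreadsheet = ? WHERE id = ?");
$stmt->bindValue(1, gzcompress($html), PDO::PARAM_LOB);
$stmt->bindValue(2, $formId, PDO::PARAM_INT);
$stmt->execute();

// Read it back and decompress before redisplaying.
$stmt = $pdo->prepare("SELECT html_spreadsheet FROM forms WHERE id = ?");
$stmt->execute([$formId]);
$html = gzuncompress($stmt->fetchColumn());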
As has been asked many times: why are you storing the view instead of the model?
You should probably bite the bullet and parse the table (using an HTML parser, perhaps), as otherwise you risk the user storing damaging JavaScript in the table data. (This means the cell contents should be parsed as well.) The data can still be stored as a blob (maybe compressed CSV or JSON), but you need to make certain it is not damaging.
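A very rough sanitizing sketch with PHP's DOMDocument is below; it only illustrates the idea (strip script elements and on* attributes) and is not a complete XSS defence; a maintained HTML purifier library would be safer:

$doc = new DOMDocument();
libxml_use_internal_errors(true);            // tolerate sloppy markup
$doc->loadHTML('<div>' . $userHtml . '</div>');
libxml_clear_errors();

// Remove <script> elements outright.
foreach (iterator_to_array($doc->getElementsByTagName('script')) as $node) {
    $node->parentNode->removeChild($node);
}
// Remove inline event handlers such as onclick, onmouseover, ...
foreach ($doc->getElementsByTagName('*') as $el) {
    foreach (iterator_to_array($el->attributes) as $attr) {
        if (stripos($attr->name, 'on') === 0) {
            $el->removeAttribute($attr->name);
        }
    }
}
$cleanHtml = $doc->saveHTML($doc->getElementsByTagName('div')->item(0));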
I am working on an embed section of my site where users can embed media from various services: YouTube, MySpace Music, Vimeo, etc.
I am trying to work out the best way to store it. Users do not have to embed all of the options and can only embed one of each type (one video, for example).
Initially I thought I would just have a table with a row per embedded item, like so:
embedid (auto increment primary key),
userid,
embedded_item_id (e.g. a youtube id)
but then I realised that some embeddable items require multiple arguments, such as MySpace Music, so I thought I'd make a table where each user has one row:
userid, youtubeid, vimeoid,
myspaceid1, myspaceid2
but it seems a bit clumsy, especially considering there will always be empty columns, since users can never have all of them. Does anyone have a better solution?
An EmbededItem table has the columns common to all items.
YouTube, Vimeo and MySpace tables have only the columns specific to each one.
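A sketch of how a fetch could look with that layout (the column names are assumptions):

// Assumed: EmbededItem(id, userid, type), YouTube(item_id, video_id),
// Vimeo(item_id, video_id), MySpace(item_id, profile_id, song_id).
$sql = "SELECT e.id, e.type, y.video_id
        FROM EmbededItem AS e
        JOIN YouTube AS y ON y.item_id = e.id
        WHERE e.userid = ?";
$stmt = $pdo->prepare($sql);
$stmt->execute([$userId]);
$youtubeEmbeds = $stmt->fetchAll(PDO::FETCH_ASSOC);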
So, here's what I'd do in such a situation:
Set up your table with columns for your primary key and userid fields, and anything else you may need to identify the user or application (maybe a 'mediatype' field). Put the rest into a VARCHAR field, and make it large enough to hold lots of data. I'm not sure how much space you would need, but I'm going to venture a guess that you will need between 1K and 4K+ of space.
The reason for a VARCHAR field: you never know what other new fields you will need in the future. Let's say next year YouTube adds another parameter, or a new media format comes along. If you model your database to represent every field individually, you will create an application that does not scale to future or other media formats. Such modeling is great when you're describing a system on paper, but not so good when you implement code.
So, now that you have a varchar field to store all your data in, you have several options for how to store the data:
1. You can store the data as an XML document and parse it on input/output (but you will most likely need more than 4K of space), and you will incur the cost of parsing XML.
2. You can store the data in whatever format your application needs (a serialized object for Java, JSON for JavaScript, etc.). If you're serializing an object, you may also need more than 4K of space, and a VARBINARY field, not VARCHAR.
3. A comma-delimited string, although this fails if your strings contain commas. I probably would not recommend this.
4. Null-delimited key/value pair strings, with a double null at the end. You will need a VARBINARY data field for this one.
Number 4 is my favorite, and the one I would recommend. I've used this pattern in an existing web project, where my strings are stored in the format:
'uid=userid\0var1=value1\0val2=value2\0url=urltosite\0\0'
Works like a charm. I use the data to build dynamic web pages for my users. (My application is written in C, though, so it deals well with parsing a character array.)
Your application could use the data from your first columns (like 'mediatype') to execute specific parsing routines if required, and use the VARCHAR/VARBINARY fields as input. Scaling to new types of embeddable media would be as simple as writing a new parser (or extending an existing one) and defining a new 'mediatype' value.
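Since the rest of this thread is PHP, here is a small PHP sketch of packing and unpacking option 4 (the keys are just the ones from the example above):

// Pack: key=value pairs separated by NUL bytes, double NUL terminator.
$fields = ["uid" => "userid", "var1" => "value1", "val2" => "value2", "url" => "urltosite"];
$packed = '';
foreach ($fields as $key => $value) {
    $packed .= $key . '=' . $value . "\0";
}
$packed .= "\0"; // store this in a VARBINARY column

// Unpack back into an associative array.
$restored = [];
foreach (explode("\0", rtrim($packed, "\0")) as $pair) {
    list($key, $value) = explode('=', $pair, 2);
    $restored[$key] = $value;
}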
Hope this helps.