I have something like this in a mysql row value:
{15;16;}
And I want a MySQL query to update it, so it becomes something like "{15;16;17}". That means I need to delete the last "}", add my text, and close with "}".
To delete the last "}":
SUBSTRING(server_players, 1, LENGTH(server_players)-1)
To add my text:
CONCAT(server_players, '17;}')
but I don't know how to combine them into one query :(
This is the query:
UPDATE members SET "Here must be built code" WHERE server_id=1
Generally this sort of manipulation is best done in the application, but additionally, it's not a great idea to store data in that format. Relational databases prefer things to be broken out as individual records.
That said, your solution is to nest the two things:
UPDATE members
SET server_players=CONCAT(SUBSTRING(server_players, 1, LENGTH(server_players)-1), '17;}')
WHERE server_id=1
The query code for removing an arbitrary player from your list will be even more complicated. I strongly suggest not using this schema.
As you can see this is a very, very messy way to do what should be done as:
INSERT INTO server_players (server_id, player_id) VALUES (1, 17)
Where you have a table specifically for the players on a server.
Remember proper relational tables have the advantage of being quick to query because they're indexed, and you can do useful things with that data like apply a JOIN to pre-load other information.
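For example, a minimal sketch of that table (the column types and keys here are assumptions, adjust to your data):

CREATE TABLE server_players (
    server_id INT NOT NULL,
    player_id INT NOT NULL,
    PRIMARY KEY (server_id, player_id)
);

-- removing an arbitrary player then becomes just as simple as adding one:
DELETE FROM server_players WHERE server_id = 1 AND player_id = 17;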
Example tables (not actual database):
In this example, I would have the SecurityCode(Unique), and Time. My current solution involves attempting to add a new Person using the security code, then querying the ID, then adding to the Times table. This is 3 separate statements and could likely be a lot faster. Any advice on how to optimise this?
Thanks.
Edit: I previously forgot to mention that this is normally done in a batch of 30-40 records.
I am also considering using SecurityCode as the foreign key in Times.
I think there are many ways to achieve this; the easiest:
Try using IF. You only need it for the first step of your statement; the last two are independent of the result of this evaluation.
Plus, save your security code in a variable; then you will save one table scan (you already have it).
**Please note it's just pseudo-code:**
IF (exists select * from person where securityCode = #securityCode) then
Step 1
End
Step 2
Step 3
Can you try it?
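As a rough sketch in plain SQL (the Person/Times column names are my guesses), the same idea without a stored procedure could look like:

-- keep the security code in a user variable so it is only written once
SET @code = 'ABC123';

-- Step 1: add the person only if the security code is not there yet
INSERT INTO Person (SecurityCode)
SELECT @code FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM Person WHERE SecurityCode = @code);

-- Steps 2 and 3: look up the ID and record the time
INSERT INTO Times (PersonID, Time)
SELECT ID, NOW() FROM Person WHERE SecurityCode = @code;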
The fastest way seemed to be to batch INSERT IGNORE all of the security codes, then batch insert all of the Times rows with a subquery to select the correct ID from Person.
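A hedged sketch of that batch approach for a couple of records (table and column names assumed):

-- 1) add any security codes that are not in Person yet
INSERT IGNORE INTO Person (SecurityCode)
VALUES ('ABC123'), ('DEF456');

-- 2) add all the Times rows, looking up each ID by security code
INSERT INTO Times (PersonID, Time)
SELECT p.ID, batch.Time
FROM (
    SELECT 'ABC123' AS SecurityCode, NOW() AS Time
    UNION ALL
    SELECT 'DEF456', NOW()
) AS batch
JOIN Person p ON p.SecurityCode = batch.SecurityCode;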
I'd like to select * from 2 tables, but have each table's column names be prefixed with a string, to avoid duplicate column name collisions.
For example, I'd like to have a view like so:
CREATE VIEW view_user_info as (
SELECT
u.*,
ux.*
FROM
user u,
user_ex ux
);
where the results all had each column prefixed with the name of the table:
e.g.
user_ID
user_EMAIL
user_ex_ID
user_ex_TITLE
user_ex_SIN
etc.
I've put a sql fiddle here that has the concept, but not the correct syntax of course (if it's even possible).
I'm using MySql, but would welcome generic solutions if they exist!
EDIT: I am aware that I could alias each of the fields, as mentioned in one of the comments. That's what I'm currently doing, but I find at the start of a project I keep having to sync up my tables and views as they change. I like the views to have everything in them from each table, and then I manually select out what I need. Kind of a lazy approach, but this would allow me to iterate quicker, and only optimize when it's needed.
I find at the start of a project I keep having to sync up my tables and views as they change.
Since the thing you're trying to do is not really supported by standard SQL, and you keep modifying database structures in development, I wonder if your best approach would be to write a little script that recreates that SELECT statement for you. Maybe wrap it in a method call in the development language of your choice?
Essentially you'd need to query INFORMATION_SCHEMA for the tables and columns of interest, probably via a join, and write the results out in SQL style.
Then just run the script every time you make database structural changes that are important to you, and watch your code magically keep up.
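A rough sketch of that generator query ('mydb' is a placeholder for your schema name); each row it returns is one aliased item for the view's SELECT list:

SELECT CONCAT(TABLE_NAME, '.', COLUMN_NAME,
              ' AS `', TABLE_NAME, '_', COLUMN_NAME, '`') AS select_item
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'mydb'
  AND TABLE_NAME IN ('user', 'user_ex')
ORDER BY TABLE_NAME, ORDINAL_POSITION;

Your script would then join those rows with commas and paste them into the CREATE VIEW statement.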
I am a bit rusty with MySQL and trying to jump in again, so sorry if this is too easy a question.
I basically created a data model that has a table called "Master" with required fields of a name and an IDcode, and then a "Details" table with a foreign key of IDcode.
Now here's where it's getting tricky. I am entering:
INSERT INTO Details (Name, UpdateDate) Values (name, updateDate)
I get an error saying IDcode on Details doesn't have a default value, so I add one; then it complains that field 'Master_IDcode' doesn't have a default value.
It all makes sense, but I'm wondering if there's an easy way to do what I am trying to do. I want to add data into Details and, if no IDcode exists, add an entry into the Master table. The problem is I have to first add the name to the fund Master, wait for a unique ID to be generated (for IDcode), then figure that out and add it to my query when I enter the master data. As you can imagine, the queries are going to get quite long since I have many tables.
Is there an easier way, where every time I add something it searches by name to see if a foreign key exists and, if not, adds it to all the tables it's linked to? Is there a standard way people do this? I can't imagine, with all the complex databases out there, that people have not figured out an easier way.
Sorry if this question doesn't make sense. I can add more information if needed.
P.S. This may be a different question, but I have heard of Django for Python and that it helps create queries. Would it help my situation?
Thanks so much in advance :-)
(decided to expand on the comments above and put it into an answer)
I suggest creating a set of staging tables in your database (one for each data set/file).
Then use LOAD DATA INFILE (or insert the rows in batches) into those staging tables.
Make sure you drop indexes before the load, and re-create what you need after the data is loaded.
You can then make a single pass over the staging table to create the missing master records. For example, let's say that one of your staging tables contains a country code that should be used as a masterID. You could add the master record by doing something along the lines of:
insert
into master_table(country_code)
select distinct s.country_code
from staging_table s
left join master_table m on(s.country_code = m.country_code)
where m.country_code is null;
Then you can proceed and insert the rows into the "real" tables, knowing that all detail rows reference a valid master record.
If you need to get reference information along with the data (such as translating some code) you can do this with a simple join. Also, if you want to filter rows by some other table this is now also very easy.
insert
into real_table_x(
`key`
,colA
,colB
,colC
,computed_column_not_present_in_staging_table
,understandableCode
)
select x.key
,x.colA
,x.colB
,x.colC
,(x.colA + x.colB) / x.colC
,c.understandableCode
from staging_table_x x
join code_translation c on(x.strange_code = c.strange_code);
This approach is a very efficient one and it scales very nicely. Variations of the above are commonly used in the ETL part of data warehouses to load massive amounts of data.
One caveat with MySQL is that it doesn't support hash joins, a join mechanism very well suited to fully joining two tables. MySQL uses nested loops instead, which means that you need to index the join columns very carefully.
InnoDB tables with their clustering feature on the primary key can help to make this a bit more efficient.
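For the country-code example above, that could mean something along these lines:

-- index the join column on the staging side after the load
-- (master_table.country_code is assumed to already have a unique index)
ALTER TABLE staging_table ADD INDEX idx_country_code (country_code);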
One last point: when you have the staging data inside the database, it is easy to add some analysis of the data and put aside "bad" rows in a separate table. You can then inspect the data using SQL instead of wading through CSV files in your editor.
I don't think there's a one-step way to do this.
What I do is issue a
INSERT IGNORE INTO master (..) VALUES (..)
to the master table, which will either create the row if it doesn't exist, or do nothing, and then issue a
SELECT id FROM master where someUniqueAttribute = ..
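Put together for your Master/Details example, that could look something like this (the column names are guesses, and it relies on Master.Name having a UNIQUE index, otherwise IGNORE won't stop duplicates):

INSERT IGNORE INTO Master (Name) VALUES ('Some name');

-- the master row is now guaranteed to exist, so its ID can be looked up inline
INSERT INTO Details (Master_IDcode, Name, UpdateDate)
SELECT IDcode, 'detail name', NOW()
FROM Master
WHERE Name = 'Some name';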
The other option would be stored procedures/triggers, but they are still pretty new in MySQL and I doubt whether this would help performance.
I am pretty new to this so sorry for my lack of knowledge.
I set up a few tables which I have successfully written to and accessed via a Perl script using the CGI and DBI modules, thanks to advice here.
This is a member list for a local band newsletter. Yeah I know, tons of apps out there but, I desire to learn this.
1- I wanted to avoid updating or inserting a row if a piece of my input matches column data in one particular column/field.
When creating the table in phpMyAdmin, I clicked the "U" (unique) on that column's name in structure view.
That seemed to work and no dupes are inserted, but I desire a hard-coded Perl solution so I understand the mechanics of this.
I read up on "insert ignore" / "update ignore" and searched all over, but everything I found seems to not just skip a dupe.
The column is not a key or autoinc just a plain old field with an email address. (mistake?)
2- When I write to the database, I want to do NOTHING if the incoming email address matches one in that field.
I desire the fastest method so I can loop through their existing list's export data (they cannot figure out the software) with no racing/locking issues or whatever other conditions of which I am in obvious ignorance.
Since I am creating this from scratch, 1 and 2 may be in fact partially moot. If so, what would be the best approach?
I would still like an auto-increment ID so I can access rows via the ID number or loop through with some kind of count++ foreach.
My stone-knife approach may be laughable to the gurus here, but I need to start somewhere.
Thanks in advance for your assistance.
With the email address column declared UNIQUE, INSERT IGNORE is exactly what you want for insertion. Sounds like you already know how to do the right thing!
(You could perform the "don't insert if it already exists" functionality in perl, but it's difficult to get right, because you have to wrap the test and update in a transaction. One of the big advantages of a relational database is that it will perform constraint checks like this for you, ensuring data integrity even if your application is buggy.)
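For example, with the members table used below (the column names are just illustrative):

-- the duplicate email is silently skipped instead of raising an error
INSERT IGNORE INTO members (firstname, email) VALUES ('Sue', 'sue@example.com');
INSERT IGNORE INTO members (firstname, email) VALUES ('Susan', 'sue@example.com');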
For updating, I'm not sure what an "update ignore" would look like. What is in the WHERE clause that is limiting your UPDATE to only affect the 1 desired row? Perhaps that auto_increment primary key you mentioned? If you are wanting to write, for example,
UPDATE members SET firstname='Sue' WHERE member_id = 5;
then I think this "update ignore" functionality you want might just be something like
UPDATE members SET firstname='Sue' WHERE member_id = 5
AND email != 'sue@example.com';
which is an odd thing to do, but that's my best guess for what you might mean :)
Just do the insert; if the data would make the unique column not be unique, you'll get an SQL error. You should be able to trap this and do whatever is appropriate (e.g. ignore it, log it, alert the user, ...).
I have a database table full of some really ugly and messy data. In a separate table, I have a cleaner version of the data and they are linked by an id, but I need to keep the messy dataset and can't overwrite it as I use it to check against data differences.
I'm trying to merge the data into a new table, OR use a single query across both tables and give the clean table results priority in the result.
So if id=3, uglydata='x7z', and cleandata='xyz', then I'd get the clean data; if cleandata were null, I'd get the ugly data.
I tried selecting cleandata AS uglydata, hoping that MySQL would just overwrite the other field, but that doesn't work (and yes, it's weird and I figured that wouldn't work).
Is there a nice way to do this?
The other solution I can think of is just to insert into the new table from the clean data first, and then insert from the ugly data as the bid is unique.
But I'm hoping I'll be able to prioritize results by type or something.
SELECT IFNULL(cleandata, uglydata)
FROM uglytable
LEFT OUTER JOIN cleantable
ON clean_id = ugly_id
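And if you do want to materialize the merged result into a new table (merged_table and its columns here are assumed from your description), the same expression works in an INSERT ... SELECT:

INSERT INTO merged_table (id, data)
SELECT ugly_id, IFNULL(cleandata, uglydata)
FROM uglytable
LEFT OUTER JOIN cleantable ON clean_id = ugly_id;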