Inserting in MySQL Table

I am trying to insert data into a MySQL table through the MySQL C client, using the steps written below.
The command is of the form (a variable string generated at run time):
INSERT INTO department values('Statistics','Taylor',395051.74)
which is correct for MySQL.
if (mysql_query(con, command))
{
    printf("Done\n");
}
printf("\n%s\n", command);
But my database shows no change; no rows get inserted. Is there any way the above steps could fail?

Note that mysql_query returns zero if it is successful and a non-zero error code if it's unsuccessful (MySQL Docs). I think you might be treating it backward, so it's probably issuing an error you're not catching.
As a guess at what might be wrong, try telling it which columns you're inserting into:
INSERT INTO department (`column1`,`column2`,`column3`)
values ('Statistics','Taylor',395051.74)
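Separately, a minimal sketch of what the success check could look like on the C side (assuming con is an already-connected MYSQL * handle and command is the generated statement, as in the question):
/* mysql_query() returns 0 on success and non-zero on error,
 * so report the text from mysql_error() when it is non-zero. */
if (mysql_query(con, command) != 0)
{
    fprintf(stderr, "INSERT failed: %s\n", mysql_error(con));
}
else
{
    printf("Done: %lu row(s) inserted\n",
           (unsigned long) mysql_affected_rows(con));
}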

Related

OPENQUERY SQL Server MYSQL UPDATE

I have to work on a linked server. My goal: update an entire table in MySQL Server (version 8.0.21) via OPENQUERY in SQL Server (version 13.0.1742.0). I tried this, but it generates the error "Row cannot be located for updating. Some values may have been changed since it was last read." and this one: "The rowset was using optimistic concurrency and the value of a column has been changed after the containing row was last fetched or resynchronized."
update linkedTable
set
linkedTable.id_parent=unlinkedTable.IdCat1,
linkedTable.code=unlinkedTable.CodeFamilleFAT,
linkedTable.niveau=unlinkedTable.NiveauCategorieFAT,
linkedTable.langue=unlinkedTable.CodeLangueFAT,
linkedTable.nom=unlinkedTable.LibelleCommercialFAT,
linkedTable.descriptionA=unlinkedTable.DescriptifCom1FAT,
linkedTable.vignette=null,
linkedTable.id_categorie=unlinkedTable.id
from openquery(NAMELINKEDSERVER, 'select id_categorie, id_parent, code, niveau, langue, nom, description as descriptionA, vignette from DatabaseMySQL.Table') as linkedTable
inner join DatabaseSQLserver.dbo.Table as unlinkedTable on unlinkedTable.Id = linkedTable.id_categorie
Then I tried this:
update linkedTable
set
linkedTable.id_parent=unlinkedTable.IdCat1,
linkedTable.code=unlinkedTable.CodeFamilleFAT,
linkedTable.niveau=unlinkedTable.NiveauCategorieFAT,
linkedTable.langue=unlinkedTable.CodeLangueFAT,
linkedTable.nom=unlinkedTable.LibelleCommercialFAT,
linkedTable.descriptionA=unlinkedTable.DescriptifCom1FAT,
linkedTable.vignette=null,
linkedTable.id_categorie=unlinkedTable.id
from openquery(NAMELINKEDSERVER, 'select id_categorie, id_parent, code, niveau, langue, nom, description as descriptionA, vignette from DatabaseMySQL.Table') as linkedTable
inner join DatabaseSQLserver.dbo.Table as unlinkedTable on unlinkedTable.Id = linkedTable.id_categorie
where linkedTable.id_categorie = 1
This works, but only one row is updated. So I wrote a stored procedure to update each row, but it took too much time.
Can someone explain why my first query didn't work (question 1) and how I can reduce the time of my stored procedure (question 2)?
I use a while loop (count the number of ids and update each id).
Thank you in advance.
Kind Regards.
I resolved the problem by checking an option on the MySQL ODBC driver after reading some forums. I checked this box:
[screenshot of the MySQL ODBC driver options dialog showing the option that was enabled]
This option avoids the errors quoted previously. With it, I can update multiple rows without errors on joins or other requests. Thank you Solarflare and "another guy" (I lost the name) for correcting me by editing the post. Have a nice day, both.
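As an aside on question 2: if a loop is still needed, one hedged option is to batch the same join-based UPDATE over ranges of id_categorie instead of one id per iteration. A sketch using the names from the question, where @firstId is a hypothetical loop variable stepped by 1000 each pass:
update linkedTable
set linkedTable.code = unlinkedTable.CodeFamilleFAT,
    linkedTable.nom = unlinkedTable.LibelleCommercialFAT
from openquery(NAMELINKEDSERVER, 'select id_categorie, code, nom from DatabaseMySQL.Table') as linkedTable
inner join DatabaseSQLserver.dbo.Table as unlinkedTable on unlinkedTable.Id = linkedTable.id_categorie
where linkedTable.id_categorie between @firstId and @firstId + 999
Each pass then updates up to 1,000 rows instead of one.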

Could this simple T-SQL update fail when running on multiple processors?

Assuming that all values of MBR_DTH_DT other than '00000000' evaluate to a date data type, could the following UPDATE fail when running on multiple processors if the CAST were performed before the filter by racing threads?
UPDATE a
SET a.[MBR_DTH_DT] = cast(a.[MBR_DTH_DT] as date)
FROM [IPDP_MEMBER_DEMOGRAPHIC_DECBR] a
WHERE a.[MBR_DTH_DT] <> '00000000'
I am trying to find the source of the following error
Error: 2014-01-30 04:42:47.67
Code: 0xC002F210
Source: Execute csp_load_ipdp_member_demographic Execute SQL Task
Description: Executing the query "exec dbo.csp_load_ipdp_member_demographic" failed with the following error: "Conversion failed when converting date and/or time from character string.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
End Error
It could be another UPDATE or INSERT query, but the others in question appear to have properly typed data from what I can see, so I am left only with the above.
No, it simply sounds like you have bad data in the MBR_DTH_DT column, which is VARCHAR but should be a date (once you clean out the bad data).
You can identify those rows using:
SELECT MBR_DTH_DT
FROM dbo.IPDP_MEMBER_DEMOGRAPHIC_DECBR
WHERE ISDATE(MBR_DTH_DT) = 0;
Now, you may find that the only such rows are ones your WHERE clause already filters out (e.g. MBR_DTH_DT = '00000000').
This has nothing to do with multiple processors, race conditions, etc. It's just that SQL Server can try to perform the cast before it applies the filter.
Randy suggests adding an additional clause, but this is not enough, because the CAST can still happen before any/all filters. You usually work around this by something like this (though it makes absolutely no sense in your case, when everything is the same column):
UPDATE dbo.IPDP_MEMBER_DEMOGRAPHIC_DECBR
SET MBR_DTH_DT = CASE
WHEN ISDATE(MBR_DTH_DT) = 1 THEN CAST(MBR_DTH_DT AS DATE)
ELSE MBR_DTH_DT END
WHERE MBR_DTH_DT <> '00000000';
(I'm not sure why in the question you're using UPDATE alias FROM table AS alias syntax; with a single-table update, this only serves to make the syntax more convoluted.)
However, in this case, this does you absolutely no good; since the target column is a string, you're just trying to convert a string to a date and back to a string again.
The real solution: stop using strings to store dates, and stop using token strings like '00000000' to denote that a date isn't available. Either use a dimension table for your dates or just live with NULL already.
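If you do go that route, a hedged sketch of the cleanup (assuming MBR_DTH_DT holds YYYYMMDD strings and that NULL is an acceptable stand-in for "no date"):
-- Turn the token value and anything else unparseable into NULL first
UPDATE dbo.IPDP_MEMBER_DEMOGRAPHIC_DECBR
SET MBR_DTH_DT = NULL
WHERE MBR_DTH_DT = '00000000' OR ISDATE(MBR_DTH_DT) = 0;

-- Then switch the column to a real date type
ALTER TABLE dbo.IPDP_MEMBER_DEMOGRAPHIC_DECBR
ALTER COLUMN MBR_DTH_DT date NULL;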
Not likely. Even with multiple processors, there is no guarantee the query will be processed in parallel.
Why not try something like this, assuming you're using SQL Server 2012? Even if you're not, you could write a UDF that validates a date like this.
UPDATE a
SET a.[MBR_DTH_DT] = cast(a.[MBR_DTH_DT] as date)
FROM [IPDP_MEMBER_DEMOGRAPHIC_DECBR] a
WHERE a.[MBR_DTH_DT] <> '00000000' And IsDate(MBR_DTH_DT) = 1
Most likely you have bad data and are not aware of it.
Whoops, I just checked: IsDate has been available since SQL Server 2005, so try using it.
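Separately, on SQL Server 2012 and later there is also TRY_CONVERT (not mentioned above; this is an addition), which returns NULL instead of raising an error for values that don't convert, so a sketch like this skips the bad rows without ISDATE:
UPDATE a
SET a.[MBR_DTH_DT] = TRY_CONVERT(date, a.[MBR_DTH_DT])
FROM [IPDP_MEMBER_DEMOGRAPHIC_DECBR] a
WHERE a.[MBR_DTH_DT] <> '00000000'
  AND TRY_CONVERT(date, a.[MBR_DTH_DT]) IS NOT NULL;
-- As noted above, the column is still a string, so the converted date is
-- implicitly turned back into a varchar on assignment.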

MySQL insert to bit(1) column via ODBC 5.2

I've searched and can't seem to find quite what I'm looking for.
I'm running a PL/SQL script in Oracle, and attempting to insert records into a table in MySQL via a database link using the MySQL ODBC 5.2 Unicode Driver.
The link works fine, I can do complex queries in Oracle using it, and do various inserts and updates on records there.
Where it fails is in trying to insert a record into a MySQL table that has a column of type bit(1).
It is basically a cursor for loop, with the insert statement looking something like:
INSERT INTO "app_user"#mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, PASSWD, PASSWORD_EXPIRED)
VALUES (CU_rec.USERNAME, CU_rec.VERSION, CU_rec.ACCOUNT_EXPIRED, CU_rec.ACCOUNT_LOCKED, CU_rec.PASSWD, CU_rec.PASSWORD_EXPIRED)
Some of the target columns, like ACCOUNT_EXPIRED, ACCOUNT_LOCKED, etc. are the bit(1) columns in MySQL. Given that I can convert the data types in the cursor CU_rec to pretty much anything I want in Oracle, how can I get them inserted into the target? I've tried everything I can think of, and I just keep getting:
Error report:
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[MySQL][ODBC 5.2(w) Driver][mysqld-5.6.10]Data too long for column 'ACCOUNT_EXPIRED' at row 1 {HY000,NativeErr = 1406}
ORA-02063: preceding 2 lines from MOBILEAPI
ORA-06512: at line 44
28500. 00000 - "connection from ORACLE to a non-Oracle system returned this message:"
*Cause: The cause is explained in the forwarded message.
*Action: See the non-Oracle system's documentation of the forwarded
message.
Any help at all would be greatly appreciated.
Your problem is Oracle's default datatype conversion over ODBC; according to their own documentation they convert SQL_BINARY to a raw. Although not directly related, Oracle's comparison of MySQL and Oracle within SQL Developer also alludes to the fact that the automatic conversion from a MySQL bit is to an Oracle raw.
Extremely confusingly, MySQL's documentation indicates that a bit is converted to a SQL_BIT or a SQL_CHAR, which implies that it may work in the other direction.[1]
According to Microsoft's ODBC docs you should, theoretically, be able to use the CONVERT() function to transform this into a character, which should, theoretically, be translatable by MySQL.
insert into some_table@some_db (bit_col)
values( {fn convert(some_col, SQL_CHAR)} );
Failing that, there are a couple of other options, but it depends on what you're attempting to insert into the MySQL database from Oracle and what the datatype is in Oracle. For instance, you could use the Oracle CAST() function to convert between datatypes; the following converts an integer to a binary double:
select cast(1 as binary_double) from dual
Unfortunately, you can't cast an integer to a raw, only a character or a rowid, so in order to convert to a raw you'd have to do the following:
select cast(to_char(1) as raw(1)) from dual
I've no idea whether MySQL will accept this but with some testing you should be able to work it out.
[1] For clarity, I've never tried it in either direction.
Hah! I found a solution. Dropping it here in case it helps someone else. It's not pretty, but it works.
I used the old EXECUTE IMMEDIATE trick.
Basically, I created a variable sql_stmt varchar2(4000) and wrote code like:
sql_stmt := 'insert into "app_user"@mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, CIPHER_PASSPHRASE, ENABLED, PASSWD, PASSWORD_EXPIRED)
             values ('''||CU_rec.USERNAME||''', '||CU_rec.VERSION||', '||CU_rec.ACCOUNT_EXPIRED||', '||CU_rec.ACCOUNT_LOCKED||', '''||CU_rec.CIPHER_PASSPHRASE||''', '||
             CU_rec.ENABLED||', '''||CU_rec.PASSWD||''', '||CU_rec.PASSWORD_EXPIRED||')';
EXECUTE IMMEDIATE sql_stmt;
Something like that anyway (the quotes might not line up, as I hacked this a bit from the actual code). Looking at the contents of sql_stmt, I get:
insert into "app_user"#mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, CIPHER_PASSPHRASE, ENABLED, PASSWD,PASSWORD_EXPIRED)
values ('user#email.com', 0, 0, 0, 'asdfastrwaebawavgansdhnsgjsjsh', 1, 'awercbcakwjerhcawuerawieubkahbewvkruh', 0)
The EXECUTE IMMEDIATE completes, and checking the target table, the values are there.
Possibly a crappy solution, but better than nothing.
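For what it's worth, the reason the literal-built statement is accepted on the MySQL side is that bit(1) columns take plain 0/1 numeric literals. A minimal standalone MySQL sketch (hypothetical table, not from the post above):
CREATE TABLE bit_test (account_expired bit(1));
INSERT INTO bit_test VALUES (0), (1);      -- numeric literals are accepted
INSERT INTO bit_test VALUES (b'1');        -- MySQL's bit-literal syntax also works
SELECT account_expired + 0 FROM bit_test;  -- +0 makes the bit display as 0/1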

MySQL only inserting first row

I'm trying to insert a ton of rows into my MySQL database. I have a query like this, but with about 700 more repetitive entries in it, and for some reason the query is only inserting the first row into the database. In this case that would be ('374','4957','0').
INSERT INTO table VALUES ('374','4957','0'),('374','3834','0'),('374','4958','0'),('374','5076','0'),('374','4921','0'),('374','3835','0'),('374','4922','0'),('374','3836','0'),('374','3837','0'),('374','4879','0'),('374','3838','0')
I can't figure out what I'm doing wrong.
Thank you in advance.
Don't mean to state the obvious, but if the first field '374' is your primary key field, then this is the issue.
Otherwise, are there any error messages received from the database? That is always a good place to look for bugs.
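One quick way to test that theory is to let MySQL tell you which rows it skips (a sketch; `table` is just the placeholder name from the question):
INSERT IGNORE INTO `table` VALUES ('374','4957','0'),('374','3834','0');
SHOW WARNINGS;  -- lists any "Duplicate entry" rows that were silently skipped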
To better understand why something is not working, next time use code like this:
$sql = "INSERT INTO table VALUES ('374','4957','0'),('374','3834','0')";
if (!mysqli_query($link, $sql)) {
printf("Errormessage: %s\n", mysqli_error($link));
}
That should display the error message returned by MySQL.
More information: PHP manual - mysqli_error
Try writing the column names before the values.
For example:
INSERT INTO table (column1,column2,column3) VALUES ...

CodeIgniter record won't insert

Using CI for the first time and I'm smashing my head against this seemingly simple issue: my query won't insert the record.
In an attempt to debug a possible problem, the insert code has been simplified, but I'm still getting no joy.
Essentially, I'm using:
$data = array('post_post' => $this->input->post('ask_question'));
$this->db->insert('posts', $data);
I'm getting no errors (although that's possibly due to disabling them in config/database.php during another CI-related trauma :-$ ).
I've used
echo $this->db->last_query();
to get the generated query, shown as below:
INSERT INTO `posts` (`post_post`) VALUES ('some text')
I have pasted this query into phpMyAdmin and it inserts no problem. I've even tried using $this->db->query() to run the outputted query above 'manually', but again, the record will not insert.
The schema of the DB table 'posts' is simply two columns, post_id and post_post.
Please, any pointers on what's going on here would be greatly appreciated... thanks.
OK, solved, after much messing with CI.
Got it to work by setting the persistent connection to false.
$db['default']['pconnect'] = FALSE;
sigh
Things generally look OK; everything you have said suggests that it should work. My first instinct would be to check that what you're inserting is compatible with your SQL field.
Just a cool CI feature; I'd suggest you take a look at the CI Database Transaction class. Transactions allow you to wrap your query/queries inside a transaction, which can be rolled back on failure, and can also make error handling easier:
$this->db->trans_start();
$this->db->query('INSERT INTO posts ...etc ');
$this->db->trans_complete();
if ($this->db->trans_status() === FALSE)
{
// generate an error... or use the log_message() function to log your error
}
Alternatively, one thing you can do is put your INSERT SQL statement into $this->db->query(your_query_here) instead of calling insert(). There is a CI feature called query binding which will also auto-escape the data array you pass.
Let me know how it goes, and hope this helps!