Avoid data being inserted into new rows in a MySQL database - mysql

I have saved a source code file (.cpp) in the 'file' column of the file_details table in a MySQL database. This was done not by a program but by inserting it directly through the localhost/phpMyAdmin interface. The data type of the 'file' column is BLOB. I then read each saved file, count its spaces, and want to insert that value into a column called 'spaces' in the same table. Here is the code I used.
Statement stmt = conn.createStatement();
Statement count_to_db = conn.createStatement();

String query = "SELECT prog_num,file FROM file_details";
ResultSet rs = stmt.executeQuery(query);

while (rs.next()) {
    String file_content = rs.getString("file");
    CalculatingSpaces calSpaces = new CalculatingSpaces();
    String num_of_Spaces = calSpaces.CalculatingSpaces(rs, pw, file_content);
    int spaces = Integer.parseInt(num_of_Spaces);

    String space_query = "INSERT INTO file_details(spaces) VALUES ('" + spaces + "')";
    count_to_db.executeUpdate(space_query);
}
But when I insert the spaces they go into a new row, not into the same row where I uploaded the BLOB file. Is there any way to insert the number of spaces next to the BLOB file it relates to?
PS: So the question is, can't I insert data column by column into a database?

If you're adding your table row for the first time, you'll have to use the SQL INSERT command.
If you want to update an existing row, you have to use the SQL UPDATE command.
Don't forget that you need to know which row(s) you want to UPDATE, so you'll also need to define a way to uniquely identify your rows (aka the Primary Key).
See http://dev.mysql.com/doc/refman/5.0/en/update.html for details
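For example, assuming prog_num is the primary key of file_details (adjust if your key is different), each iteration of your loop should run an UPDATE along these lines rather than an INSERT:
-- Sketch only: prog_num is assumed to be the primary key; 42 and 7 are placeholder values
UPDATE file_details
SET spaces = 42
WHERE prog_num = 7;
In the Java loop that would become a PreparedStatement such as "UPDATE file_details SET spaces = ? WHERE prog_num = ?", with the computed count and the current row's prog_num bound as parameters.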

Related

Rails - How to reference model's own column value during update statement?

Is it possible to achieve something like this?
Suppose name and plural_name are fields of Animal's table.
Suppose pluralise_animal is a helper function which takes a string and returns its plural literal.
I cannot loop over the animal records for technical reasons.
This is just an example
Animal.update_all("plural_name = ?", pluralise_animal("I WANT THE ANIMAL NAME HERE, the `name` column's value"))
I want something similar to how you can use functions in MySQL while modifying column values. Is this out-of-scope or possible?
UPDATE animals SET plural_name = CONCAT(name, 's') -- just an example to explain what I mean by referencing a column. I'm aware of the problems in this example.
Thanks in advance
I cannot loop over the animal records for technical reasons.
Sorry, this cannot be done with this restriction.
If your pluralizing helper function is implemented in the client, then you have to fetch data values back to the client, pluralize them, and then post them back to the database.
If you want the UPDATE to run against a set of rows without fetching data values back to the client, then you must implement the pluralization logic in an SQL expression, or a stored function or something.
UPDATE statements run in the database engine. They cannot call functions in the client.
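For instance, if the pluralisation logic can be expressed in SQL, a minimal sketch of the stored-function route could look like this (the append-an-s rule is only illustrative; real English pluralisation needs more cases):
-- Naive pluralisation, for illustration only
DELIMITER //
CREATE FUNCTION pluralise_animal(name VARCHAR(50))
RETURNS VARCHAR(51) DETERMINISTIC
BEGIN
    RETURN CONCAT(name, 's');
END //
DELIMITER ;

UPDATE animals SET plural_name = pluralise_animal(name);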
Use a Ruby script to generate a SQL script that inserts the plural values into a temp table:
File.open(filename, 'w') do |file|
  file.puts "CREATE TEMPORARY TABLE pluralised_animals(id INT, plural varchar(50));"
  file.puts "INSERT INTO pluralised_animals(id, plural) VALUES"
  Animal.find_each do |animal|
    file.puts "(#{animal.id}, '#{pluralise_animal(animal.name)}'),"
  end
end
Note: in the generated file, replace the trailing comma (,) on the last VALUES line with a semicolon (;).
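The generated file would then contain something along these lines (ids and plural forms here are purely illustrative):
-- Illustrative contents of the generated script
CREATE TEMPORARY TABLE pluralised_animals(id INT, plural varchar(50));
INSERT INTO pluralised_animals(id, plural) VALUES
(1, 'cats'),
(2, 'dogs');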
Then run the generated SQL script in the database to populate the temp table.
Finally run a SQL update statement in the database that joins the temp table to the main table...
UPDATE animals a
INNER JOIN pluralised_animals pa
ON a.id = pa.id
SET a.plural_name = pa.plural;

How do I insert data from a dataframe into selected columns in a MySQL table using R

I have a data frame made up of 3 columns named INTERNAL_ID, NT_CLONOTYPE and SAMPLE_ID. I need to write a script in R that will transfer this data into the appropriate 3 columns with the exact names in a MySQL table. However, the table has more than 3 columns, say 5 (INTERNAL_ID, COUNT, NT_CLONOTYPE, AA_CLONOTYPE, and SAMPLE_ID). The MySQL table already exists and may or may not include preexisting rows of data.
I'm using the dbx and RMariaDB libraries in R. I've been able to connect to the MySQL database with dbxConnect(). This is what I run when I try dbxUpsert():
conx <- dbxConnect(adapter = "mysql", dbname = "TCR_DB", host = "127.0.0.1", user = "xxxxx", password = "xxxxxxx")
table <- "TCR"
records <- newdf #dataframe previously created with the update data.
dbxUpsert(conx, table, records, where_cols = c("INTERNAL_ID"))
dbxDisconnect(conx)
I expect to obtain an updated MySQL table with the new rows, which may or may not have NULL entries in the columns not contained in the data frame.
Ex.
INTERNAL_ID  COUNT  NT_CLONOTYPE  AA_CLONOTYPE  SAMPLE_ID
Pxxxxxx.01   NULL   CTTGGAACTG    NULL          PMA.01
The connection and disconnection both run fine, but instead of the expected output I get the following error:
Error in .local(conn, statement, ...) :
could not run statement: Field 'COUNT' doesn't have a default value
I suspect it's because the data frame and the table don't have the same number of columns, but I'm not sure. And if so, how can I get around this?
I figured it out. I changed the table definition so that "COUNT" defaults to NULL. This allowed the program to proceed by ignoring "COUNT".
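In SQL terms, that fix amounts to something like the following (assuming COUNT is an integer column; use the column's actual type):
-- Assumption: COUNT is an integer column in the TCR table
ALTER TABLE TCR MODIFY COLUMN `COUNT` INT NULL DEFAULT NULL;
With a NULL default, rows upserted without a COUNT value no longer trigger the "Field 'COUNT' doesn't have a default value" error.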

Populating Autocomplete list with new tags

I'm using Lucee 5.x and Maria DB (MySQL).
I have a user supplied comma delimited list. I need to query the database and if the item isn't in the database, I need to add it.
user supplied list
green
blue
purple
white
database items
black
white
red
blue
pink
orange
lime
It is not expected that the database list would grow to more than 30 items but end-users always find 'creative' ways to use the tools we provide them.
So using the user supplied list above, only green and purple should be added to the database.
Do I compare the user-supplied list against the database items or vice versa? Would the process change if the user-supplied list count exceeds what is in the database (meaning the user submits 10 items and the database only contains 5)? I'm not sure which loop is the better way to determine which items are new. It needs to be in cfscript, and I'm looking at the looping options outlined here (https://www.petefreitag.com/cheatsheets/coldfusion/cfscript/):
FOR Loop
FOR IN Loop (Array)
FOR IN Loop (Query)
I tried MySQL's NOT IN, but that left me with the existing database values in addition to the new ones. I know this should be simple; I'm overcomplicating it somewhere and/or am too close to the problem to see the solution.
You could do this:
get a list with existing items from database
append user supplied list
remove duplicates
update db if items were added
<cfscript>
    var userItems = '"green","blue","purple","white"';
    var dbItems = '"black","white","red","blue","pink","orange","lime"';
    var result = ListRemoveDuplicates( ListAppend(dbItems, userItems) );
    if (ListLen(result) neq ListLen(dbItems)) {
        // update db
    }
</cfscript>
Update (only new items)
<cfscript>
    var userItems = '"green","blue","purple","white"';
    var dbItems = '"black","white","red","blue","pink","orange","lime"';
    var newItems = '';
    ListEach(userItems, function (item) {
        if (not ListFind(dbItems, item)) {
            newItems = ListAppend(newItems, item);
        }
    });
</cfscript>
trycf.com gist:
(https://trycf.com/gist/f6a44821165338b3c10b7808606979e6/lucee5?theme=monokai)
Again, since this is an operation that the database can do, I'd feed the input data to the database and then let it decide how to deal with multiple keys. I don't recommend using CF to loop through your values to check them and then doing the INSERT. This will require multiple trips to the database and then processing on the application server that isn't really needed.
My suggestion is to use MariaDB's INSERT ... ON DUPLICATE KEY UPDATE ... syntax. This also requires that whatever field you are trying to insert on actually has a UNIQUE constraint on it. Without that constraint, your database itself doesn't care if you have duplicate data, which can cause its own set of issues.
For the database, we have
CREATE TABLE t1 (mycolor varchar(50)
, CONSTRAINT constraint_mycolor UNIQUE (mycolor)
) ;
INSERT INTO t1(mycolor)
VALUES ('black'),('white'),('red'),('blue'),('pink'),('orange'),('lime')
;
The ColdFusion is:
<cfscript>
    myInputValues = "green,blue,purple,white" ;
    myQueryValues = "" ;

    function sanitizeValue ( required String inVal ) {
        // do sanitization stuff here
        var sanitizedInVal = arguments.inVal ;
        return sanitizedInVal ;
    }

    myQueryValues = myInputValues.listMap(
        function(i) {
            return "('" & sanitizeValue(i) & "')" ;
        }
    ) ;

    // This takes parameterization out of the cfquery tag and
    // performs sanitization and validation before building the
    // query string.
    myQuery = new query();
    myQuery.name = "myQuery";
    myQuery.setDataSource("dsn");

    sqlString = "INSERT INTO t1(mycolor) VALUES "
        & myQueryValues
        & " ON DUPLICATE KEY UPDATE mycolor=mycolor;"
    ;

    myQuery.setSQL(sqlString);
    myQueryResult = myQuery.execute().getResult();
</cfscript>
First, build up your input values (myInputValues). You'll want to do validation and sanitization on them to prevent nastiness from entering your database. I created a sanitizeValue function to be the placeholder for the sanitization and validation operations.
myQueryValues will become a string list of the values in the proper format that we will use to insert into the database.
Then we just build up a new query(), using myQueryValues in the sqlString to get our query. Again, since we are building a string for multiple VALUES to INSERT, I don't think there's a way to use queryparam for those VALUES. But since we cleaned up our string earlier, it should do much of what cfqueryparam does anyway.
We use MariaDB's INSERT INTO .... ON DUPLICATE KEY UPDATE ... syntax to only insert unique values. Again, this requires that the database itself has a constraint to prevent duplicates in whatever column we're inserting.
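For reference, with the sample input above the statement that ends up being sent to MariaDB should look roughly like this (the exact quoting depends on your sanitization step):
-- Approximate output of the cfscript above
INSERT INTO t1(mycolor)
VALUES ('green'),('blue'),('purple'),('white')
ON DUPLICATE KEY UPDATE mycolor = mycolor;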
For a demo: https://dbfiddle.uk/?rdbms=mariadb_10.2&fiddle=4308da3addb9135e49eeee451c6e9e58
This should do what you're looking to do without beating up on your database too much. I don't have a Lucee or MariaDB server set up to test, so you'll have to give it a shot and see how it performs. I don't know how big your database is or will become, but this should still query pretty quickly.

How to get database sql values from an active record object?

My original problem is that I need to insert a lot of records into the DB, so to speed things up I want to use mysqlimport, which takes a file of row values and loads them into a specified table. Suppose I have a model Book. I can't simply use book.attributes.values, as one of the fields is a hash that is serialized to the db (using serialize), so I need to know the format this hash will be stored in in the db. Same for time and date fields. Any help?
How about using SQL insert statements instead of serialization?
book = Book.new(title: 'Much Ado About Nothing', author: 'William Shakespeare')
sql = book.class.arel_table.create_insert
          .tap { |im| im.insert(book.send(
              :arel_attributes_with_values_for_create,
              book.attribute_names)) }
          .to_sql
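The resulting SQL should look roughly like this (the exact column list and quoting depend on your Rails version, adapter, and schema):
-- Approximate output only; your columns may differ
INSERT INTO `books` (`title`, `author`) VALUES ('Much Ado About Nothing', 'William Shakespeare')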

insert row before the first row

I'm trying to write to a database so that the new row is inserted at the beginning, as the first row. The code I have here will write to the database, but the row won't go before the first row. I've tried using a ResultSet and beforeFirst(), but it tells me I can't use either one. I'm stumped as to how to do it. Any help would be appreciated! :D
String host = "jdbc:derby://localhost:1527/Scores";
String uName = "root";
String uPass = "root";

Connection con = DriverManager.getConnection(host, uName, uPass);

String SQLw = "INSERT INTO ROOT.HISCORES (Intials, Score) VALUES (?, ?)";
PreparedStatement pst = con.prepareStatement(SQLw);
pst.setString(1, jtxtIntials.getText());
pst.setString(2, Score);
pst.execute();
The rows in a relational table are indeed stored in some physical order, but SQL gives you no direct control over that order. Thinking in terms of the physical order of rows goes against the philosophy of SQL, so no such functionality is provided.
If you want your rows in a particular order, add an extra column for numbering them, for example, and use the ORDER BY clause whenever you retrieve them, as mentioned in the comments.
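A sketch of that approach against the HISCORES table from the question (DisplayOrder is an invented name for the ordering column):
-- DisplayOrder is an assumed extra column, added purely to control ordering
ALTER TABLE ROOT.HISCORES ADD COLUMN DisplayOrder INT;

SELECT Intials, Score
FROM ROOT.HISCORES
ORDER BY DisplayOrder;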