I am currently using multer to upload the files to the server. Below is my query:
var insertSQL = "INSERT INTO ic_photos (icFrontURL,icBackURL,selfieURL,customer_id) VALUES ('" + frontICPath + "','"+backICPath + "','" + selfiePath + "','" + customerID + "')";
console.log returns:
"INSERT INTO ic_photos (icFrontURL,icBackURL,selfieURL,customer_id) VALUES ('public\images\frontIC_1526709299585_potato.png','public\images\backIC_1526709299595_potato2.jpg','public\images\selfie_1526709299596_potato3.jpg','41')"
But when it goes into the MySQL table, it shows the following value:
'publicimagesfrontIC_1526709040516_potato.png'
The backslashes are missing. How can I fix this when I build the insert query?
Escape each backslash with a second backslash before building the query; when the value is inserted into the table, each pair collapses back to a single backslash:
frontICPath = frontICPath.replace(/\\/g, "\\\\");
In most, if not all, string-based environments the backslash is a special (escape) character. To produce a literal backslash, write "\\".
That is why the replacement string above appears as four backslashes in the text editor: JavaScript collapses them to two, and MySQL collapses those two to one.
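To make that concrete, here is a minimal sketch (reusing the variable names from the question) that escapes all three paths before building the query; if you are on the mysql npm package, passing the values through ? placeholders instead lets the driver do the escaping for you:
// Sketch only -- assumes the path variables from the question are already set.
var escapeBackslashes = function (p) { return p.replace(/\\/g, "\\\\"); };

var insertSQL =
    "INSERT INTO ic_photos (icFrontURL, icBackURL, selfieURL, customer_id) VALUES ('" +
    escapeBackslashes(frontICPath) + "','" +
    escapeBackslashes(backICPath) + "','" +
    escapeBackslashes(selfiePath) + "','" + customerID + "')";

// Alternative (assuming the mysql npm package): let the driver escape the values.
// connection.query(
//     "INSERT INTO ic_photos (icFrontURL, icBackURL, selfieURL, customer_id) VALUES (?, ?, ?, ?)",
//     [frontICPath, backICPath, selfiePath, customerID]
// );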
I'm on node and want to write this in my mysql db:
var x = JSON.stringify(['aa"a']);
console.log(x);
mysqlConnection.query("UPDATE `table` SET field = '" + x + "' WHERE id = 1");
The console.log() produces: ["aa\"a"]
When I read the string from the db later, I get: ["aa"a"]
The backslash is missing, making the string useless, as calling JSON.parse() would produce an error.
You're mashing your SQL together as a string. \ is an escape character (in SQL as well as JSON), so it escapes the " when passed to the SQL engine.
Use placeholders (whichever MySQL API library you are using should have a way of using them) instead of manually shoving variables into the string of SQL.
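For example, a minimal sketch assuming the mysql npm package (any client with placeholder support works the same way):
// The driver escapes the bound values, so the backslash inside the JSON survives the round trip.
var x = JSON.stringify(['aa"a']);

mysqlConnection.query(
    "UPDATE `table` SET field = ? WHERE id = ?",
    [x, 1],
    function (err) {
        if (err) throw err;
        // Reading the column back now yields ["aa\"a"], which JSON.parse() accepts.
    }
);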
I have a query using the IN operator with an array of 1000+ values. I've searched around but couldn't find what I'm after: my SELECT is not optimized because I'm chaining OR operators, and it takes quite a lot of time.
I'm chopping my query into different arrays at the moment:
query = "SELECT\n" +
" filename,\n" +
" status\n" +
"FROM\n" +
" table1\n" +
"WHERE\n" +
" filename IN " + allFileNames[0];
for(int m=1; m<allFileNames.length; m++) {
query +=
" OR\n" +
" filename IN " + allFileNames[m];
}
Essentially, each element of the allFileNames array contains a string listing 1000 file names, and I'm joining the resulting IN clauses with OR.
How can I optimize all of this? Is it possible without creating a temporary table? Maybe using a substring, but I haven't quite found the solution yet.
Thanks in advance!
EDIT: I can't use a temporary table since I don't have write access to the database.
Assuming each allFileNames entry is a single file name, you can flatten everything into one IN clause.
query = "SELECT\n" +
" filename,\n" +
"FROM\n" +
" table1\n" +
"WHERE\n" +
" filename IN (" + allFileNames.stream().collect(Collectors.joining(","))
+")"
Because all of your conditions joined by OR test the same column, you should merge them.
But if your allFileNames collection is too large, you may consider executing your query in batches (see the sketch after this list):
split your allFileNames collection into batches of 100-1000, then execute the query for each batch
aggregate all results
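A minimal sketch of that batching idea, assuming JDBC (a java.sql.Connection named con) and that allFileNames is a List<String>; the placeholders also spare you from quoting the file names by hand:
// Sketch only: needs java.sql.* and java.util.* imports, and a method that handles SQLException.
List<String[]> results = new ArrayList<>();
int batchSize = 1000;

for (int start = 0; start < allFileNames.size(); start += batchSize) {
    List<String> batch =
            allFileNames.subList(start, Math.min(start + batchSize, allFileNames.size()));

    // Build "?,?,?,..." with one placeholder per file name in this batch.
    String placeholders = String.join(",", Collections.nCopies(batch.size(), "?"));
    String sql = "SELECT filename, status FROM table1 WHERE filename IN (" + placeholders + ")";

    try (PreparedStatement ps = con.prepareStatement(sql)) {
        for (int i = 0; i < batch.size(); i++) {
            ps.setString(i + 1, batch.get(i));
        }
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // Aggregate the results of every batch.
                results.add(new String[] { rs.getString("filename"), rs.getString("status") });
            }
        }
    }
}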
I am new to Android development and I need help with column separation in CSV files.
I can already save the data to the next row and column. But when I export the data, some commas and quotation marks (") appear in the first and last columns. So I had to add some empty columns so that the figures would not mix with these unwanted commas and quotation marks, but they keep appearing. Where am I going wrong?
Below is the code:
if (listdata.size() > 1) {
    for (int index = 0; index < listdata.size(); index++) {
        person = (br.com.att.actoap.Adapter.Person) listdata.get(index);
        String arrStr5[] = {" ", ";" + "/" + ";" + ";" + person.getSampleN() + ";" + person.getPeso() + ";" + person.getFatc() + ";" + person.getFsMol() + ";" + person.getFaAci() + ";" + person.getVolum() + ";" + person.getResu() + ";" + ";" + "/" + ""};
        csvWrite.writeNext(arrStr5);
    }
}
Here is the error:
http://i.stack.imgur.com/OnPpM.png
I don't want these commas. How can I remove them?
Thanks!!
It would appear that csvWrite is actually writing your whole row as ONE column and producing a correct CSV file, with commas ',' as separators and double quotes '"' around the field.
But your version of Excel expects a "localised" variant of a CSV file (read: broken) and expects semicolons ';' as the separator instead of commas.
Possible solutions are:
Configure Excel to read real CSV files and use your library correctly.
Configure your library to export semicolon-separated files (see the sketch after this list).
Use TAB-separated files instead.
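A minimal sketch of the second option, assuming csvWrite is an opencsv CSVWriter wrapped around some fileWriter; putting each value in its own array element lets the library insert the separators, so the padding columns and manual ';' concatenation go away:
// Sketch only -- assumes opencsv (import com.opencsv.CSVWriter, or au.com.bytecode.opencsv.CSVWriter in older versions).
CSVWriter csvWrite = new CSVWriter(fileWriter, ';',
        CSVWriter.NO_QUOTE_CHARACTER,
        CSVWriter.DEFAULT_ESCAPE_CHARACTER,
        CSVWriter.DEFAULT_LINE_END);

for (int index = 0; index < listdata.size(); index++) {
    person = (br.com.att.actoap.Adapter.Person) listdata.get(index);
    // One array element per column; the writer puts the ';' between them itself.
    String[] row = {
            String.valueOf(person.getSampleN()),
            String.valueOf(person.getPeso()),
            String.valueOf(person.getFatc()),
            String.valueOf(person.getFsMol()),
            String.valueOf(person.getFaAci()),
            String.valueOf(person.getVolum()),
            String.valueOf(person.getResu())
    };
    csvWrite.writeNext(row);
}
Drop NO_QUOTE_CHARACTER if your values can themselves contain the separator; the quotes opencsv adds in that case are stripped again by any reader that parses the file correctly.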
I have an issue with an SQL statement and I don't know how to handle it. Here is the problem:
query = "INSERT INTO `mmr`(`userID`, `RunningProjects`, `MainOrders`) VALUES ("
+ session.getAttribute("id")
+ ",'"
+ request.getParameter("RunningProjects")
+ "','"
+ request.getParameter("MainOrders")')";
The values are obtained from a POST form which contains free text. The problem is that whenever a user enters characters like ', I get an error, because (I suppose) that tells the parser the value ends there and it should look for the next value. I don't know how to include these characters and send them to the database without getting an error. Any help would be appreciated. Thank you.
The character ' is used to delimit string literals in MySQL, so if any data contains that character, it has to be escaped. The cleanest way to do this in Java is a PreparedStatement.
Change your query code accordingly.
query = "INSERT INTO `mmr`(`userID`, `RunningProjects`, `MainOrders`)
VALUES ( ?, ?,? )";
Now define a PreparedStatement instance and use it to bind values.
PreparedStatement pst = con.prepareStatement(query);
pst.setObject(1, session.getAttribute("id"));   // getAttribute() returns Object, so bind it as-is
pst.setString(2, request.getParameter("RunningProjects"));
pst.setString(3, request.getParameter("MainOrders"));
int result = pst.executeUpdate();
And I suggest using beans to handle the business logic.
Change your query to:
query = "INSERT INTO `mmr`(`userID`, `RunningProjects`, `MainOrders`) VALUES ("
+ session.getAttribute("id")
+ ",'"
+ request.getParameter("RunningProjects")
+ "','"
+ request.getParameter("MainOrders")
+ "')";
I think you are using a normal Statement in your JDBC code. Instead, I would suggest you use a PreparedStatement. Prepared statements eliminate exactly this kind of quoting problem (and also help with statement caching). If you use a PreparedStatement, as shown in the answer above, I think your problem will be solved.
I'm using the extra() modifier in a view.
(See http://docs.djangoproject.com/en/1.1/ref/models/querysets/#extra-select-none-where-none-params-none-tables-none-order-by-none-select-params-none )
Here's the code:
def viewname(request):
...
exact_matchstrings=[]
exact_matchstrings.append("(accountprofile.first_name LIKE '" + term + "')")
exact_matchstrings.append("(accountprofile.first_name LIKE '" + term + '\%' + "')")
extraquerystring = " + ".join(exact_matchstrings)
return_queryset = return_queryset.extra(
select = { 'match_weight': extraquerystring },
)
The two append statements above are almost completely alike, except that the second adds a % SQL wildcard character. This is causing an error; the statement without the % causes no problems. What's going on with the %? I'm surprised that Django thinks this character is not defined, since it's in the SQL specification. For example, the following SQL statement executes just fine:
select (first_name like "Car") + (first_name like "Car%") from accountprofile;
But trying to run it via the extra() modifier in my view code and evaluating the resulting queryset gives me an error. I think "%" needs to be escaped, so I tried that already. Any ideas?
Just ran into this issue ourselves doing an extra() query with LIKE. To escape a %, you need to write %%.
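Applied to the code in the question (same variable names), a minimal sketch:
# The doubled %% reaches the database as a literal %, so the wildcard LIKE works.
exact_matchstrings = []
exact_matchstrings.append("(accountprofile.first_name LIKE '" + term + "')")
exact_matchstrings.append("(accountprofile.first_name LIKE '" + term + "%%')")
extraquerystring = " + ".join(exact_matchstrings)
return_queryset = return_queryset.extra(
    select={'match_weight': extraquerystring},
)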
See also: "Percentage sign not working".
It looks like you are missing some quotes in the 2nd string, and I'm not sure that you need to escape the percent (%) unless Django requires it:
exact_matchstrings.append("(accountprofile.first_name LIKE '" + term + "%" + "')")