CF8 and AES: decrypting MySQL AES fails with "encodings are not same"

This has become more of an exercise in what am I doing wrong than mission critical, but I'd still like to see what (simple probably) mistake I'm making.
I'm using MySQL (5.1.x) AES_ENCRYPT to encrypt a string, and CF's generateSecretKey('AES') to make a key (I've tried it at the default, 128-, and 256-bit lengths).
So let's say my code looks like this:
<cfset key = 'qLHVTZL9zF81kiTnNnK0Vg=='/>
<cfset strToEncrypt = '4111111111111111'/>
<cfquery name="i" datasource="#dsn#">
INSERT INTO table(str)
VALUES (AES_ENCRYPT('#strToEncrypt#', '#key#'));
</cfquery>
That works fine as expected and I can select it using SELECT AES_DECRYPT(str,'#key#') AS... with no problems at all.
What I can't seem to do though is get CF to decrypt it using something like:
<cfquery name="s" datasource="#dsn#">
SELECT str
FROM table
</cfquery>
<cfoutput>#Decrypt(s.str,key,'AES')#</cfoutput>
or
<cfoutput>#Decrypt(toString(s.str),key,'AES')#</cfoutput>
I keep getting "The input and output encodings are not same" (including with the toString(); without it I get a binary-data error). The field type for the encrypted string in the db is BLOB.

This entry explains that MySQL handles AES-128 keys a bit differently than you might expect:
.. the MySQL algorithm just XORs the bytes of a given passphrase
against the previous bytes if the passphrase is longer than 16 characters and
just leaves them 0 when the passphrase is shorter than 16 characters.
Not highly tested, but this seems to yield the same results (in hex).
<cfscript>
    function getMySQLAES128Key( key ) {
        var keyBytes = charsetDecode( arguments.key, "utf-8" );
        var finalBytes = listToArray( repeatString("0,", 16) );
        for (var i = 1; i <= arrayLen(keyBytes); i++) {
            // adjust for base 0 vs 1 index
            var pos = ((i-1) % 16) + 1;
            finalBytes[ pos ] = bitXOR(finalBytes[ pos ], keyBytes[ i ]);
        }
        return binaryEncode( javacast("byte[]", finalBytes ), "base64" );
    }

    key = "qLHVTZL9zF81kiTnNnK0Vg==";
    input = "4111111111111111";

    encrypted = encrypt(input, getMySQLAES128Key(key), "AES", "hex");
    WriteDump("encrypted="& encrypted);

    // note: assumes input is in "hex". either convert the bytes
    // to hex in mySQL first or use binaryEncode
    decrypted = decrypt(encrypted, getMySQLAES128Key(key), "AES", "hex");
    WriteDump("decrypted="& decrypted);
</cfscript>
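For reference, the same key folding can be sketched in Python. This is a minimal re-implementation of the behavior described above (XOR the passphrase bytes into a zeroed key buffer, wrapping past 16 bytes), not code taken from MySQL itself:

```python
def mysql_aes_key(passphrase: bytes, key_len: int = 16) -> bytes:
    """Fold a passphrase into an AES key the way MySQL's AES_ENCRYPT does:
    start from all-zero bytes and XOR the passphrase bytes in, wrapping
    around when the passphrase is longer than the key."""
    key = bytearray(key_len)
    for i, b in enumerate(passphrase):
        key[i % key_len] ^= b
    return bytes(key)
```

A passphrase shorter than 16 bytes is effectively zero-padded, and one longer than 16 bytes wraps and XORs into the earlier bytes, which means distinct passphrases can collide onto the same key.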
Note: If you are using MySQL for encryption, be sure to see its documentation, which mentions that the plain text may end up in various logs (replication, history, et cetera) and "may be read by anyone having read access to that information".
Update: Things may have changed, but according to this 2004 bug report the .mysql_history file exists only on Unix. (Keep in mind there may be other log files.) Detailed instructions for clearing .mysql_history can be found in the manual; in summary:
Set the MYSQL_HISTFILE variable to /dev/null (on each log in)
Create .mysql_history as a symbolic link to /dev/null (only once)

Related

yubihsm2 signatures are invalid when signing ETH transactions

I am trying to figure out how to get this YubiHSM 2 to work for signing ETH transactions. I have been using the Python lib, and so far I have a basic setup. Below is an abbreviated version of what I have:
web3_endpoint = ''
web3 = Web3(HTTPProvider(web3_endpoint))
hsm = YubiHsm.connect("http://localhost:12345")
session = hsm.create_session_derived(1, "password")
key = session.get_object(1, OBJECT.ASYMMETRIC_KEY)
#key = AsymmetricKey.generate(session, 1, "EC Key", 1, CAPABILITY.SIGN_ECDSA, ALGORITHM.EC_K256)
pub_key = key.get_public_key()
#raw_pub = pub_key.public_bytes(
#    encoding=serialization.Encoding.DER,
#    format=serialization.PublicFormat.SubjectPublicKeyInfo
#)
raw_pub = pub_key.public_bytes(
    encoding=serialization.Encoding.X962,
    format=serialization.PublicFormat.UncompressedPoint
)
print("Public key (Uncompressed):\n", binascii.b2a_hex(raw_pub))
unindexPub = raw_pub[1:]
public_key_hash = Web3.keccak(unindexPub)
address_bytes = public_key_hash[-20:]
address = address_bytes.hex()
print(address)
So far I consistently get the same public key, and it looks correct: the formatting is right and it is the correct number of bytes.
1) Should I be using the commented-out public key formatting, or the uncompressed X9.62 encoding I have above?
From there, this is where things get a bit weird:
transaction = {
    'to': Web3.toChecksumAddress('0x785AB1daE1b0Ee3f2412aCF55e4153A9517b07e1'),
    'gas': 21000,
    'gasPrice': Web3.toWei(5, 'gwei'),
    'value': 1,
    'nonce': 1,
    'chainId': 4,
}
serializable_transaction = serializable_unsigned_transaction_from_dict(transaction)
transaction_hash = serializable_transaction.hash()
print(transaction_hash.hex())
# sign the transaction hash and calculate v value
signature = key.sign_ecdsa(transaction_hash,hashes.SHA3_256())
r, s = ecdsa.util.sigdecode_der(signature, ecdsa.SECP256k1.generator.order())
print("r: "+str(r)+"\ns: "+str(s))
v = 28
# encode the transaction along with the full signature and send it
encoded_transaction = encode_transaction(serializable_transaction, vrs=(v, r, s))
web3.eth.sendRawTransaction(encoded_transaction)
I am setting v to 28; I have also tested with 27. I could compute the correct value from the chain id, but that shouldn't be necessary just to get a valid signature (one that recovers to the same public key each time). Sometimes I get the error "invalid sender" and other times "insufficient gas." If I take the signature output and use a JavaScript library to recover the public key, I get a different public key each time, yet I keep getting the same public key from the YubiHSM 2 in this Python app.
I have also commented out the hashing step in sign_ecdsa, since I pass in data that is already hashed (in order to use Keccak-256).
Is there something I am missing? Why are these transactions not signing correctly for ETH?
I am getting some of those serialization helpers from enter link description here
helper serialization functions
Thanks
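One thing worth double-checking about the v handling described above: with a chainId in the transaction, the v value comes from the EIP-155 formula rather than a fixed 27/28, and Ethereum also requires the "low-s" form of the signature. A sketch of that arithmetic (the constants are the standard secp256k1 order and EIP-155 offsets; the function names are mine, for illustration):

```python
# secp256k1 group order (standard constant)
SECP256K1_N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def normalize_s(s: int, recovery_id: int) -> tuple:
    """Ethereum requires the 'low-s' form; flipping s also flips the recovery id."""
    if s > SECP256K1_N // 2:
        return SECP256K1_N - s, recovery_id ^ 1
    return s, recovery_id

def eip155_v(recovery_id: int, chain_id: int) -> int:
    """EIP-155 replay-protected v; a legacy (pre-155) transaction would use 27 + recovery_id."""
    return chain_id * 2 + 35 + recovery_id
```

For chainId 4 the only valid v values are 43 and 44, so hard-coding 27 or 28 would make the recovered sender address differ from the real one, which matches the "invalid sender" symptom.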

Why does Coldfusion add an extra quote to each existing quote in cflog?

I'm trying to save info to the logs using cflog, in JSON format. But for some reason an extra double quote gets added next to each existing double quote.
Example: I do something like this, turning a simple struct into a JSON string:
<cfset local.fname = "Max">
<cfset local.lname = "Smith">
<cfset local.id = "QA-123">
<cflog text="#serializeJSON(local)#">
And in the logs, it gets saved to look like:
"INFO","http-apr-8888-exec-6","10/04/2021","19:24:46","","{""fname"":""Max"",""lname"":""Smith"",""id"":""QA-123""}"
Then if I try to save it this way, I get no quotes, and thus invalid JSON.
<cflog text='{fname:Max,lname:smith,id:QA-123}'>
results in:
"INFO","http-apr-8888-exec-6","10/04/2021","19:24:46","","{fname:Max,lname:Smith,id:QA-123}"
And
<cflog text='{"fname":"Max","lname":"smith","id":"QA-123"}'>
results in the same as the first example:
"INFO","http-apr-8888-exec-6","10/04/2021","19:24:46","","{""fname"":""Max"",""lname"":""Smith"",""id"":""QA-123""}"
Why is it doing this, and how do I end up with the log entry I want, without any extra quotes?:
"INFO","http-apr-8888-exec-6","10/04/2021","19:24:46","","{"fname":"Max","lname":"Smith","id":"QA-123"}"
We're running CF10 (an older version, because we're phasing out of CF) and viewing the logs through Splunk. I'm not sure whether Splunk is a CSV parser, but I figured out a way to get the logs written as desired.
Created a function named writeLog(), using Java's System.out.println instead of the cflog tag:
<cfscript>
    function writeLog(required message) {
        // java.lang.System gives direct access to stdout, bypassing cflog's CSV quoting
        var sys = createObject("java", "java.lang.System");
        var logString = serializeJSON(arguments);
        sys.out.println('{"timestamp":"#dateTimeFormat(now(), "yyyy-mm-dd'T'hh:mm:ss:ssssssZ")#",#Right(logString, Len(logString) - 1)#');
    }
</cfscript>
And then call it like so:
<cfscript>
    var emailData = structNew();
    emailData.toAddress = ARGUMENTS.to;
    emailData.fromAddress = ARGUMENTS.from;
    emailData.subject = ARGUMENTS.subject;
    APPLICATION.general.writeLog(message="Sending email", argumentCollection=emailData);
</cfscript>
And (with some adjustments to Splunk) the resulting log looks like:
{
FROMADDRESS: no-reply#blah.com
SUBJECT: Welcome to the team
TOADDRESS: someone#example.com
message: Sending email
timestamp: 2021-10-05T11:10:21:000021-0400
}
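It may help to note that the doubled quotes are not corruption: ColdFusion's log files are CSV, and CSV escapes a literal double quote inside a quoted field by doubling it. Any CSV parser recovers the original JSON. A quick sketch in Python, assuming a log line shaped like the one in the question:

```python
import csv
import io
import json

# One line from the ColdFusion log; the last field is JSON with CSV-doubled quotes
line = ('"INFO","http-apr-8888-exec-6","10/04/2021","19:24:46","",'
        '"{""fname"":""Max"",""lname"":""Smith"",""id"":""QA-123""}"')

fields = next(csv.reader(io.StringIO(line)))
payload = json.loads(fields[-1])  # the doubled quotes come back as single ones
```

So if the log source is declared as CSV (Splunk can do this), the JSON field comes out intact without bypassing cflog at all.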
I've gone through your issue. Yes, we have this in ColdFusion 11 and ColdFusion 2016, but not in ColdFusion 2018 (Update 12); they may have fixed this case in recent ColdFusion 2018 updates.
Please try updating your version, then check the issue again.
For now, though, I can give you a workaround using the replace() method.
Sample code:
<cfset myStr = {}>
<cfset myStr.fname = "Kannan">
<cfset myStr.lname = "Pasumpon">
<cfset myStr.id = "CF-123">
<cfset jsonData = replace(serializeJSON( myStr ), '"', "'", "All")>
<cflog text="#jsonData#" file="testingWorks">
Here I replace all the double quotes with single quotes and log the result to the testingWorks.log file. The new result looks like this:
Result in log file:
"Information","http-nio-8501-exec-6","10/05/21","12:21:04","","{'LNAME':'Pasumpon','ID':'CF-123','FNAME':'Kannan'}"
Likewise, you can adapt this to your needs.
Note: updated versions resolve this case by default, so I would suggest updating your version first instead of replacing the quotes.

AES encryption on mysql and node.js

I have struggled with the question below for days; I posted the same question earlier and didn't get any positive feedback.
I'm using MySQL's built-in aes_encrypt method to encrypt new and existing data.
https://dev.mysql.com/doc/refman/8.0/en/encryption-functions.html
SET @@SESSION.block_encryption_mode = 'aes-256-ecb';
INSERT INTO test_aes_ecb ( column_one, column_two )
values ( aes_encrypt('text','key'), aes_encrypt('text', 'key'));
I used the ECB cipher mode, so no IV is needed. The issue is that I can't decrypt the data from the Node.js side.
I'm using Sequelize and tried to fetch the data through a model, then decrypt it on the Node side.
I tried the following libraries:
"aes-ecb": "^1.3.15",
"aes256": "^1.1.0",
"crypto-js": "^4.1.1",
"mysql-aes": "0.0.1",
Below is a code snippet of the Sequelize call:
async function testmysqlAESModel () {
    const users = await test.findAll();
    console.log('users', users[0].column_one);
    var decrypt = AES.decrypt( users[0].column_one, 'key' );
}
It returns buffer data, and I couldn't decrypt it on the Node side. Can someone provide a proper example? I've been struggling for days.
EDIT
I inserted a record into MySQL with the query below.
SET @@SESSION.block_encryption_mode = 'aes-256-ecb';
INSERT INTO test_aes_ecb ( id, column_one, column_two )
VALUES (1, 2, AES_ENCRYPT('test', UNHEX('gVkYp3s6v9y$B&E)H#McQeThWmZq4t7w')));
In Node.js I called it like this:
testmysqlAESModel();

async function testmysqlAESModel () {
    const users = await test.findAll();
    console.log('users', users[0].column_one);
    var decipher = crypto.createDecipheriv(algorithm, Buffer.from("gVkYp3s6v9y$B&E)H#McQeThWmZq4t7w", "hex"), "");
    var encrypted = Buffer.from(users[0].column_one); // this is what is stored inside the database, i.e. users[0].column_one
    var decrypted = decipher.update(encrypted, 'binary', 'utf8');
    decrypted += decipher.final('utf8');
    console.log(decrypted);
}
I'm getting the error below. I used the link below to create a 256-bit key:
https://www.allkeysgenerator.com/Random/Security-Encryption-Key-Generator.aspx
I still couldn't fix it. Can you provide a sample project or any kind of supporting code snippet?
There are multiple issues here:
Ensure that your key has the correct length. AES is specified for certain key lengths (i.e. 128, 192 and 256 bit). If you use any other key length, your key will be padded (zero-extended) or truncated by the crypto library. This is a non-standard process, and different implementations do it differently. To avoid this, use a key of the correct length and store it as hex instead of ASCII (to avoid charset issues).
Potential issues regarding password-to-key derivation. Some AES implementations use methods to derive keys from passwords/passphrases. Since you are using raw keys in MySQL, you do not want to derive anything; use raw keys in Node.js as well. This means that if you are using the native crypto module, you want createDecipheriv instead of createDecipher.
Caution: The AES mode you are using (ECB) is inherently insecure, because equal input leads to equal output. There are ways around that using other AES modes, such as CBC or GCM. You have been warned.
Example:
MySQL SELECT AES_ENCRYPT('text',UNHEX('F3229A0B371ED2D9441B830D21A390C3')) as test; returns the buffer [145,108,16,83,247,49,165,147,71,115,72,63,152,29,218,246];
Decoding this in Node could look like this:
var crypto = require('crypto');
var algorithm = 'aes-128-ecb';
var decipher = crypto.createDecipheriv(algorithm, Buffer.from("F3229A0B371ED2D9441B830D21A390C3", "hex"), "");
var encrypted = Buffer.from([145,108,16,83,247,49,165,147,71,115,72,63,152,29,218,246]); // Note that this is what is stored inside your database, so that corresponds to users[0].column_one
var decrypted = decipher.update(encrypted, 'binary', 'utf8');
decrypted += decipher.final('utf8');
console.log(decrypted);
This prints text again.
Note that F3229A0B371ED2D9441B830D21A390C3 is the key in this example, you would obviously have to create your own. Just ensure that your key has the same length as the example, and is a valid hex string.
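To the first point about key length: the string in the question, 'gVkYp3s6v9y$B&E)H#McQeThWmZq4t7w', is 32 ASCII characters, not 32 bytes of hex, so UNHEX and Buffer.from(..., "hex") will not interpret it the way you expect. A safe approach is to generate the key as hex once and pass the same hex string to both sides; a sketch in Python:

```python
import secrets

# 32 random bytes -> a 256-bit AES key, stored as a 64-character hex string
key_hex = secrets.token_hex(32)

# Both MySQL's UNHEX('<key_hex>') and Node's Buffer.from(key_hex, 'hex')
# will decode this to the same 32 raw key bytes
key_bytes = bytes.fromhex(key_hex)
```

MySQL side: AES_ENCRYPT(..., UNHEX('<key_hex>')) with block_encryption_mode = 'aes-256-ecb'; Node side: crypto.createDecipheriv('aes-256-ecb', Buffer.from(key_hex, 'hex'), null).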

weird escape behaviour when writing string from node to mysql db

I'm on node and want to write this in my mysql db:
var x = JSON.stringify(['aa"a']);
console.log(x);
mysqlConnection.query("UPDATE `table` SET field = '" + x + "' WHERE id = 1");
The console.log() produces: ["aa\"a"]
When I read the string from the db later, I get: ["aa"a"]
The backslash is missing, making the string useless, as calling JSON.parse() would produce an error.
You're mashing your SQL together as a string. \ is an escape character (in SQL as well as JSON), so it escapes the " when passed to the SQL engine.
Use placeholders (whichever MySQL API library you are using should have a way of using them) instead of manually shoving variables into the string of SQL.
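The principle is the same in any language: let the driver carry the value separately from the SQL text. A sketch with Python's stdlib sqlite3 (the same idea as the ? placeholders in Node's mysql/mysql2 libraries):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, field TEXT)")
conn.execute("INSERT INTO t (id, field) VALUES (1, '')")

x = json.dumps(['aa"a'])
# The ? placeholder sends x as data, not as SQL text, so the backslash
# is stored verbatim instead of being consumed as an escape character
conn.execute("UPDATE t SET field = ? WHERE id = ?", (x, 1))

stored = conn.execute("SELECT field FROM t WHERE id = 1").fetchone()[0]
```

Reading the value back and calling json.loads on it recovers the original list, which is exactly what fails with string concatenation.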

To read SO's data dump effectively

I currently use Vim to read SO's data dump. However, my MacBook slows down when I scroll down just a few rows. This suggests there must be more efficient ways to read the data.
I know a little MySQL. The files are in .xml format, and it is rather hard to read the data in .xml at the moment. It may be more efficient to convert the XML files to MySQL and then read them. I only know MS's db tool for such actions, but I would like to know about other tools too.
Problems
to parse the .xml into SQL queries such that MySQL understands it; we need to know the data structures of the data.
to load the data into MySQL
to find a tool similar to MS's db tool with which we can read the data effectively
How do you read SO's data dump effectively?
--
[edit]
How can you run the 523 SQL queries that create the database from your terminal? I have the commands in a text file at the moment.
How can you switch the database to a simple recovery mode?
I made my first-ever Python program to read them and output SQL INSERT statements for use with MySQL (it's ugly, but it worked). You'll need to create the tables by hand first, though.
import sys
import xml.sax
import xml.sax.handler

class SOHandler(xml.sax.handler.ContentHandler):
    def __init__(self):
        self.errParse = 0

    def startElement(self, name, attributes):
        if name != "row":
            # A non-row element names the table; open the output files for it
            self.table = name
            self.outFile = open(name + ".sql", "w", encoding="utf-8")
            self.errfile = open(name + ".err", "w", encoding="utf-8")
        else:
            skip = 0
            currentRow = "insert into " + self.table + "("
            for attr in attributes.keys():
                currentRow += str(attr) + ","
            currentRow = currentRow[:-1]
            currentRow += ") values ("
            for attr in attributes.keys():
                try:
                    value = (attributes[attr]
                             .replace('\\', '\\\\')
                             .replace('"', '\\"')
                             .replace("'", "\\'"))
                    currentRow += '"{0}",'.format(value)
                except UnicodeEncodeError:
                    self.errParse += 1
                    skip = 1
                    self.errfile.write(currentRow)
            if skip != 1:
                currentRow = currentRow[:-1]
                currentRow += ");"
                self.outFile.write(currentRow)
                self.outFile.write("\n")
                self.outFile.flush()
                print(currentRow)

    def characters(self, data):
        pass

    def endElement(self, name):
        pass

if len(sys.argv) < 2:
    print("Give me an xml file argument!")
    sys.exit(1)

parser = xml.sax.make_parser()
handler = SOHandler()
parser.setContentHandler(handler)
parser.parse(sys.argv[1])
print(handler.errParse)
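If the goal is just to read the dump without loading the whole file, xml.etree.ElementTree.iterparse streams it one element at a time; a sketch against a tiny inline sample shaped like the dump's <row .../> elements (the sample data is made up, not from the real dump):

```python
import io
import xml.etree.ElementTree as ET

# Stand-in for posts.xml, badges.xml, etc.: one <row/> per record,
# with all values stored as attributes
sample = b'<posts><row Id="1" Score="5"/><row Id="2" Score="3"/></posts>'

rows = []
for _event, elem in ET.iterparse(io.BytesIO(sample), events=("end",)):
    if elem.tag == "row":
        rows.append(dict(elem.attrib))
        elem.clear()  # release the parsed element so memory stays flat
```

Replacing io.BytesIO(sample) with an open file handle gives the same row-by-row iteration over a multi-gigabyte dump without Vim-style slowdown.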