Google Cloud KMS issue with decrypt - google-cloud-kms

I'm new to Cloud KMS, and I started by following exactly what's written here.
I encrypted my data file, which is saved in UTF-8 format, by running this command:
gcloud kms encrypt --location global --keyring ring --key key --plaintext-file /path_to_file --ciphertext-file /path_to_enc --project myProject
As a result, the encrypted data looks like this in my newly created ciphertext file:
$�]ˋLݿ���yHI�lS�`&�Nt�b{%�U�� �&�A���XaL��d
Here is how I read the encrypted file data:
static Properties properties = new Properties();
static {
    try {
        InputStream in = new Credentials().getClass().getResourceAsStream("path_to_enc_file");
        byte[] encryptedData = IOUtils.toByteArray(in);
        byte[] decryptedBytes = decrypt(EnvironmentVariable.getProjectId(), "global", "ring", "key", encryptedData);
        ByteArrayInputStream bis = new ByteArrayInputStream(decryptedBytes);
        properties.load(bis);
        in.close();
        bis.close();
    } catch (IOException e1) {
        e1.printStackTrace();
    }
}
Now, whenever I try to decrypt it with this function:
public static byte[] decrypt(
        String projectId, String locationId, String keyRingId, String cryptoKeyId, byte[] ciphertext)
        throws IOException {
    // Create the KeyManagementServiceClient using try-with-resources to manage client cleanup.
    try (KeyManagementServiceClient client = KeyManagementServiceClient.create()) {
        // The resource name of the cryptoKey
        String resourceName = CryptoKeyName.format(projectId, locationId, keyRingId, cryptoKeyId);
        // Decrypt the ciphertext with Cloud KMS.
        DecryptResponse response = client.decrypt(resourceName, ByteString.copyFrom(ciphertext));
        // Extract the plaintext from the response.
        return response.getPlaintext().toByteArray();
    }
}
it throws this:
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Decryption failed: the ciphertext is invalid.",
    "reason" : "badRequest"
  } ],
  "message" : "Decryption failed: the ciphertext is invalid.",
  "status" : "INVALID_ARGUMENT"
}
The key type is Symmetric encrypt/decrypt, with the default algorithm (Google symmetric key).
The key ring location is global.
Can you please help me out and tell me what's missing from the Google docs?

Update: As bdhess says in the comment, this is probably due to Maven being "helpful" and corrupting the data during the build process. See the Maven docs for how to avoid this.
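If Maven is the culprit, the usual mechanism is resource filtering rewriting the ciphertext as though it were text. A minimal sketch of the standard fix (my example, not from the original answer), assuming the ciphertext is packaged with an .enc extension and filtering is enabled for src/main/resources, is to tell the maven-resources-plugin not to filter that extension:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <configuration>
    <nonFilteredFileExtensions>
      <!-- copy .enc resources byte-for-byte instead of filtering them as text -->
      <nonFilteredFileExtension>enc</nonFilteredFileExtension>
    </nonFilteredFileExtensions>
  </configuration>
</plugin>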
The solution below also works, but is less straightforward.
Tamer and I chatted for a while and got a workaround:
Encode the output from gcloud in base64 before including it in a file in src/main/resources.
Decode the file after reading it with java.util.Base64.
Pass the decoded bytes to the KMS API.
For some reason the bytes were getting corrupted between creating the file with gcloud and reading the bytes in with getResourceAsStream(). From the code above I can't see where the corruption would be happening, and it seems like reading in binary resources should be totally supported. But something is breaking somewhere in Tamer's case.
I'll try to reproduce it sometime this week.

To decrypt a secret from a ciphertext file to a plaintext file:
cat secret.enc | gcloud kms decrypt \
--location=global \
--keyring=keyring \
--key=key \
--ciphertext-file=- \
--plaintext-file=decrypted_secret.txt

You'd need to base64-decode the encrypted key first, and then pipe the output into the whole gcloud kms command, e.g.:
cat my-token.enc | base64 --decode | gcloud kms decrypt --plaintext-file=plaintextfile --ciphertext-file=- --location=global --keyring=yourkeyringname --key=yourkeyname

I made these modifications and then it worked like a charm, with great help from @hjfreyer.
1. To encrypt the plaintext secret:
Run this command:
gcloud kms encrypt --location global --plaintext-file PATH_TO_SECRET_FILE --ciphertext-file PATH_TO_TMP_FILE --project myProject --key key --keyring ring
Encode the result with base64:
base64 PATH_TO_TMP_FILE > PATH_TO_FINAL_ENC_FILE
Remove the trailing newline from PATH_TO_FINAL_ENC_FILE (see the note after the code below).
2. To decrypt the data, first base64-decode it, then pass the result to the KMS decrypt function:
InputStream in = new Credentials().getClass().getResourceAsStream("PATH_TO_FINAL_ENC_FILE");
byte[] encryptedData = IOUtils.toByteArray(in);
byte[] decryptedBytes = decrypt(EnvironmentVariable.getProjectId(), "global", "ring", "key", Base64.getDecoder().decode(encryptedData));
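A note on step 1: the manual newline removal is only needed because base64 wraps its output (at 76 columns by default). With GNU coreutils base64 (an assumption; the BSD/macOS tool uses different flags), you can disable wrapping and skip that step entirely:
base64 -w 0 PATH_TO_TMP_FILE > PATH_TO_FINAL_ENC_FILE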

Related

AWS CLI: Error parsing parameter '--config-rule': Invalid JSON:

cat <<EOF > S3ProhibitPublicReadAccess.json
{
"ConfigRuleName": "S3PublicReadProhibited",
"Description": "Checks that your S3 buckets do not allow public read access. If an S3
bucket policy or bucket ACL allows public read access, the bucket is noncompliant.",
"Scope": {
"ComplianceResourceTypes": [
"AWS::S3::Bucket"
]
},
"Source": {
"Owner": "AWS",
"SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED"
}
}
EOF
aws configservice put-config-rule --config-rule file://S3ProhibitPublicReadAccess.json
When I go to upload my config rule after configuring it, it gives me the error below: Error parsing parameter '--config-rule': Invalid JSON: Invalid control character at: line 3 column 87 (char 132), followed by the JSON it received. I first tried this in Windows PowerShell, then tried it on Linux to see if I would get a different result, but I am still getting the same error on both machines.
Error:
Error parsing parameter '--config-rule': Invalid JSON: Invalid control character at: line 3 column 87 (char 132)
JSON received: {
"ConfigRuleName": "S3PublicReadProhibited",
"Description": "Checks that your S3 buckets do not allow public read access. If an S3
bucket policy or bucket ACL allows public read access, the bucket is noncompliant.",
"Scope": {
"ComplianceResourceTypes": [
"AWS::S3::Bucket"
]
},
"Source": {
"Owner": "AWS",
"SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED"
}
}
The answer is right there; this is how I read the error message:
Invalid JSON: Invalid control character at: line 3 column 87 (char 132)
"Invalid control character" - i.e. characters like newlines and line feeds - invisible "control" characters.
"line 3 column 87" - tells you where it thinks the error is (this is not always totally accurate, but it's normally close). In this case line 3 column 87 is the end of the line below:
"Description": "Checks that your S3 buckets do not allow public read access. If an S3
"char 132" - the absolute character offset into the JSON document; it's just another way of pointing at the same spot as line 3, column 87.
So what does all this mean? Basically, the parser was still inside the Description string and ran into a line-ending control character, which is not allowed inside a JSON string.
The fix is to put the Description key and value on a single line, so:
"Description": "Checks that your S3 buckets do not allow public read access. If an S3
bucket policy or bucket ACL allows public read access, the bucket is noncompliant.",
becomes:
"Description": "Checks that your S3 buckets do not allow public read access. If an S3 bucket policy or bucket ACL allows public read access, the bucket is noncompliant.",
I used https://jsonlint.com/ to quickly validate the JSON, and I was able to tweak it and re-validate it until it was correct.
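You can also check this locally: the AWS CLI is Python under the hood, and Python's json module reports essentially the same error. A minimal sketch (my example string, not the original file):
import json

# A JSON string with a raw newline inside a string value, like the Description above.
broken = '{"Description": "line one\ncontinued on the next line"}'

try:
    json.loads(broken)
except json.JSONDecodeError as err:
    print(err)  # prints something like: Invalid control character at: line 1 column 26 (char 25)
Running python -m json.tool S3ProhibitPublicReadAccess.json on the file itself does the same job as jsonlint.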

Parse JSON with missing fields using cjson Lua module in Openresty

I am trying to parse a JSON payload sent via a POST request to an NGINX/OpenResty location. To do so, I combined OpenResty's content_by_lua_block with its cjson module like this:
# other locations above
location /test {
    content_by_lua_block {
        ngx.req.read_body()
        local data_string = ngx.req.get_body_data()
        local cjson = require "cjson.safe"
        local json = cjson.decode(data_string)
        local endpoint_name = json['endpoint']['name']
        local payload = json['payload']
        local source_address = json['source_address']
        local submit_date = json['submit_date']
        ngx.say('Parsed')
    }
}
Parsing sample data containing all required fields works as expected. A correct JSON object could look like this:
{
    "payload": "the payload here",
    "submit_date": "2018-08-17 16:31:51",
    "endpoint": {
        "name": "name of the endpoint here"
    },
    "source_address": "source address here"
}
However, a user might POST a differently formatted JSON object to the location. Assume a simple JSON document like
{
"username": "JohnDoe",
"password": "password123"
}
not containing the desired fields/keys.
According to the cjson module docs, using cjson (without its safe mode) will raise an error if invalid data is encountered. To prevent any errors being raised, I decided to use its safe mode by importing cjson.safe. This should return nil for invalid data and provide the error message instead of raising the error:
The cjson module will throw an error during JSON conversion if any invalid data is encountered. [...]
The cjson.safe module behaves identically to the cjson module, except when errors are encountered during JSON conversion. On error, the cjson_safe.encode and cjson_safe.decode functions will return nil followed by the error message.
However, I do not encounter any different error handling behavior in my case and the following traceback is shown in Openresty's error.log file:
2021/04/30 20:33:16 [error] 6176#6176: *176 lua entry thread aborted: runtime error: content_by_lua(samplesite:50):16: attempt to index field 'endpoint' (a nil value)
Which in turn results in an Internal Server Error:
<html>
<head><title>500 Internal Server Error</title></head>
<body>
<center><h1>500 Internal Server Error</h1></center>
<hr><center>openresty</center>
</body>
</html>
I think a workaround might be writing a dedicated function for parsing the JSON data and calling it with pcall() to catch any errors. However, this would make the safe mode kind of useless. What am I missing here?
Your “simple JSON document” is a valid JSON document. The error you are facing is not related to cjson, it's a standard Lua error:
resty -e 'local t = {foo = 1}; print(t["foo"]); print(t["foo"]["bar"])'
1
ERROR: (command line -e):1: attempt to index field 'foo' (a number value)
stack traceback:
...
“Safeness” of cjson.safe is about parsing of malformed documents:
cjson module raises an error:
resty -e 'print(require("cjson").decode("[1, 2, 3"))'
ERROR: (command line -e):1: Expected comma or array end but found T_END at character 9
stack traceback:
...
cjson.safe returns nil and an error message:
resty -e 'print(require("cjson.safe").decode("[1, 2, 3"))'
nil    Expected comma or array end but found T_END at character 9
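Back to the original question: cjson.safe only protects the decoding step; missing fields are ordinary Lua nil handling, so guard the lookups before indexing. A minimal sketch of such a handler (my field checks and status codes, not a drop-in from any docs):
location /test {
    content_by_lua_block {
        ngx.req.read_body()
        local body = ngx.req.get_body_data()

        local cjson = require "cjson.safe"
        local json, err = cjson.decode(body or "")
        if not json then
            ngx.status = ngx.HTTP_BAD_REQUEST
            ngx.say("invalid JSON: ", err)
            return ngx.exit(ngx.HTTP_OK)
        end

        -- index defensively instead of assuming the keys exist
        local endpoint_name = type(json.endpoint) == "table" and json.endpoint.name or nil
        if not endpoint_name then
            ngx.status = ngx.HTTP_BAD_REQUEST
            ngx.say("missing field: endpoint.name")
            return ngx.exit(ngx.HTTP_OK)
        end

        ngx.say("Parsed")
    }
}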

How can I print a JSON object with some other text

Hello there, I am making a bot in Python.
It gets its data from an API which returns JSON.
I want to know how I can print a JSON object along with other text.
Example code:
import json
# some JSON (a string, as returned by the API)
x = '{"location":{"name":"London","region":"City of London, Greater London","country":"United Kingdom","lat":51.52,"lon":-0.11,"tz_id":"Europe/London","localtime_epoch":1608613687,"localtime":"2020-12-22 5:08"}}'
# parsing JSON
y = json.loads(x)
# printing the result
print(y['location']['name'])
The result will be London.
But I want it to return a response like Name: London.
How can I print it like that?
How about using f-strings to format:
f"Name: {y['location']['name']}"

GPG Public/private key cannot be accessed correctly from AWS Secrets manager via python3

I am using the python-gnupg package to create a GPG public/private key pair. I am storing the generated private key in AWS Secrets Manager as follows.
Key: private_key
value: -----BEGIN PGP PRIVATE KEY BLOCK-----
Version: GnuPG v2.0.22 (GNU/Linux)
lQO+BF37qDIBCADXq0iJVRYFb43+YU8Ts63hDgZl49ZNdnDVhd9H0JMXRHqtPqt9
bbFePPN47NRe6z6GsbPaPmDqEE9l3KjFnSZB/yCii+2wHZR0ij2g3ATbiAbOoQQy
I6bbUADmHtcfIJByoXVoDk489nUPt84Xyp1lHiBfCtUmq4w62Okq6InlRhxjxcEx
VvSXaCY8YnEXUAgNGpvcKHDejGS9V4djh7r7lgJ/Y+3Xb2eepOfiaCx2Cn8ZMI0q
7eWH0MmSeR4ueOLeb79ZKjpJraBfV91XplgHHiM18oECWWwsQCFiwi1GVOLpX6Fh
HIoUyaRAW2vZyFcNnO7iLbetie6fE884lfHxABEBAAH+AwMCO+Qoh7o3GWVga9f2
gHEeuGGH4KB3aspQZt/zwKpx7YlDB/uLd4q7JQt35nIH6pfYdMwgQt001CRhsGvX
QVKIkvipQvJZgCO8Nix7xYCukH0cI4TXD7S9BmRNMCPi74+Q1J3cDfKHCseynMNF
GzBeCDx6LW3CVfKKs0Mc2ecSl0c8gxaPDi3AfACRMefEAuVQyo82qJKjpI+O/Yik
z40C5OgK0XfetKstxcH4B0bx0o/PrUpYFM/gHHgFkbpVg5citcvFY4VcEkWryVcg
yF0qBPXP0OKBtCUU1ZGiCwRJy8iGd/dOOICcSCfMNy+jzzM3FSVzei69x7MYt3xu
IzCsmHpDvpdL7tiDDHgwajZeFFPTzf7Ic90K6TapQ3H59xPMxnL9K5o9rP1glRY0
8e4zYjYxg9A6Yl3K5zdqs+M1A3Os70HUlWZXZ4LQNcidPd1rhnPnm9eXkyV2ScXl
dE38aOA5pnrL0WZUM3/OLAToMP6h4rjw9WLqqgWlrl6yz9bhZrfRxlhZaEtNs1bi
pgrmPK/a5fK++BjMSuA94EkXTVNjKWNQBzcmrff27M1TMwN+34NWj3dk/a1gyflP
QZgK3MT+0GaMCcvy1EoZ87ffLQrWwFJOw5nT83yG7VBbuerSEk/tk30bxmYN6HzO
zvQgSjDiiH+ANXVupnzDjjBREmH6V1Hv+7Q0vrjKQHd3eYvKJpAWfFr9kO8DzKck
ZkSMj487SjlHbh33z1yupuwAtjyYQ5tN1adSlDa92t0Q08udnFDQtxXEnL6rw/Du
llEuCEVC9UYcNwwQGMsGXQBFFfj1389WHr0hkSOvyS1nPiIku5kNXDhSWq7/okTS
FwnCt+wbZa6TWbXjwKzHzu4LOarV1s8DnYHKNH6HHIqsVR2oJuIuqhyREAqjeP/T
3bQjQXV0b2dlbmVyYXRlZCBLZXkgPG1laHVsQHBoZWFhLm9yZz6JATkEEwECACMF
Al37qDICGy8HCwkIBwMCAQYVCAIJCgsEFgIDAQIeAQIXgAAKCRDO+i9CZ70SvqMn
CACCmdzqZW68j1E45XTHz3fvqdft6fXOyrlMuDdcH2y7Zrl5JS7PlCeHzIcsSMlH
wDYpCG8km7nwZsnWqKsOXFWq1nq/j7Kv5AzR7UmPzTw/1HFSVhIFA0ZZMHAnwp7Y
bcAT+ssvo4To9CjzRp/ZI1k26RFXPWuXETa41DBIVz13Ss4SIaf7UG9FQ55o+2BA
TP48yCQqktiWOoZ0rV1ALSFlE4Gs3UWHcYxxCABA0JB4+FuCRfB8QMreLwFb47wc
dIitbVl0mQx5IXCkqhJKqR62rRy25Put4xnPhXGtXqfoYDVYvYvlsl/FA35cX+Z1
QODnLq/jQ7ZPdaFC7cFqxztk
=RvGa
-----END PGP PRIVATE KEY BLOCK-----
Key: passphrase
Value: secret123
All I want to do is extract the key/value pairs from AWS Secrets Manager, import the key, and later decrypt a file.
As you all know, JSON doesn't allow raw newline characters in a multi-line value, so GPG import_keys fails to import the private key. If I just read a local file containing the same private key, there is no problem. Please let me know if there is a workaround for this issue.
try:
    secretkey = self.get_secret(secretName)
    if not secretkey:
        self.logger.error("Empty secret key")
        exit(0)
    newdict = json.loads(secretkey)
    # newdict = ast.literal_eval(secretkey)
    private_key = newdict['private_key']
    # private_key = open('/home/ec2-user/GPG/test_private_key.asc').read()
    passphrase = newdict['passphrase']
    gpg = gnupg.GPG(gnupghome=gpgHomeDir)
    import_result = gpg.import_keys(private_key)
    count = import_result.count
    if count == 0:
        self.logger.error("Failed to import private key")
        sys.exit(1)
    dataPath = srcDir + "/" + self.dataSource
    for root, folders, files in os.walk(dataPath):
        if not files:
            self.logger.info("No files found so skipping .....")
            continue
        for filename in folders + files:
            fullpath = os.path.join(root, filename)
            self.logger.info("Fullpath = {0}".format(fullpath))
            out_file = "/tmp/" + filename
            with open(fullpath, "rb") as f:
                status = gpg.decrypt_file(f, passphrase=passphrase, output=out_file)
            if status.ok:
                s3Prefix = root.replace(srcDir + '/', '')
                s3ObjKey = s3Prefix + "/" + filename
                s3InPath = "s3://" + self.inBucketName + "/" + s3Prefix + "/" + filename
                with open(out_file, "rb") as fl:
                    self.s3Client.upload_fileobj(fl,
                                                 self.inBucketName,
                                                 s3ObjKey
                                                 )
except Exception as e:
    print(str(e))
    self.logger.error(str(e))
    sys.exit(1)
I had to store the PGP key in base64 format, as follows.
import base64
import sys

import gnupg

try:
    gpg = gnupg.GPG(gnupghome="/home/guest/GPG")
    input_data = gpg.gen_key_input(key_type='RSA',
                                   key_length=2048,
                                   name_email="guest@xyz.com",
                                   passphrase="pass123")
    key = gpg.gen_key(input_data)
    ascii_armored_public_key = gpg.export_keys(key.fingerprint, armor=True)
    ascii_armored_private_key = gpg.export_keys(key.fingerprint, True, armor=True)
    b64_encoded_private_key = base64.b64encode(ascii_armored_private_key.encode())
    binaryPrivKeyFile = "/tmp/b64encoded_private_key.asc"
    with open(binaryPrivKeyFile, 'wb') as bPrivFile:
        bPrivFile.write(b64_encoded_private_key)
except Exception as e:
    print(str(e))
    sys.exit(1)
Now we have to store b64encoded_private_key.asc in AWS Secrets Manager, as follows.
$ aws secretsmanager create-secret --name private-key --secret-binary fileb://b64encoded_private_key.asc --region us-east-1
We cannot store the passphrase in the same secret, so we have to create a separate secret for the passphrase, as follows.
$ aws secretsmanager create-secret --name passwd --secret-string '{"passphrase" : "pass123"}' --region us-east-1
NOTE: the secret type for the private key is binary, whereas for the passphrase it is plain text.
After creating the secrets, we can use the AWS Secrets Manager client code to get the private key and passphrase. The retrieval code decodes the private key using the base64.b64decode(..) method.
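For completeness, a minimal retrieval sketch (my code, not the poster's): it assumes the two secret names created above (private-key and passwd), boto3 credentials in the environment, and a GPG home directory of /home/guest/GPG.
import base64
import json

import boto3
import gnupg

client = boto3.client("secretsmanager", region_name="us-east-1")

# SecretBinary comes back as bytes; it still holds the base64 text we stored.
b64_key = client.get_secret_value(SecretId="private-key")["SecretBinary"]
ascii_armored_private_key = base64.b64decode(b64_key).decode("utf-8")

# The passphrase secret is a small JSON string.
passphrase = json.loads(
    client.get_secret_value(SecretId="passwd")["SecretString"]
)["passphrase"]

gpg = gnupg.GPG(gnupghome="/home/guest/GPG")
import_result = gpg.import_keys(ascii_armored_private_key)
assert import_result.count > 0, "private key import failed"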
Secrets Manager does not require you to store the data in JSON format; it can store arbitrary strings or binary data.
You could either choose to break everything up and store it in separate secrets, or use a data format that supports newlines, like XML.
The private key that you store this way won't contain special characters like '\n' or '\r' (raw newlines are exactly what breaks the JSON value).
To resolve the issue, copy the output of evaluating private_key as below, which shows the key with the escape sequences spelled out instead of real newlines:
private_key = open('/home/ec2-user/GPG/test_private_key.asc').read()
private_key
Place that private key into your secret and fetch it with get_secret().
Note: after json.loads you will see literal '\n' sequences (a backslash followed by n) in the private key; to handle that you need to use private_key.replace('\\n', '\n').
Your code will look like below:
private_key = newdict['private_key']
private_key = private_key.replace('\\n', '\n')
Then you will be able to use the key.

Converting RSA keys to JSON in Perl

I need to find a way of transferring an RSA public key to a server for my network communication program. I have done some research, and it seems that the easiest way to do this is to convert the public key (which is stored as some kind of hash reference) to JSON for transmission. However, in my test code I cannot get the key to convert to JSON. Here is my test program:
use strict;
use warnings;
use Crypt::RSA;
use JSON;

my %hash = (
    name  => "bob",
    age   => 123,
    hates => "Perl"
);
my $hash_ref = \%hash;
my $hash_as_json = to_json($hash_ref);
print $hash_as_json, "\n";    # Works fine for a normal hash

my $rsa = new Crypt::RSA;
my ($public, $private) = $rsa->keygen(
    Identity  => 'client',
    Size      => 512,
    Password  => 'password',
    Verbosity => 1,
) or die $rsa->errstr();

my $key_hash_as_json = to_json($public, {allow_blessed => 1, convert_blessed => 1});
print $key_hash_as_json, "\n";
Before I added the options {allow_blessed => 1, convert_blessed => 1}, I got an error message saying:
encountered object 'Crypt::RSA::Key::Public=HASH(0x3117128)', but
neither allow_blessed, convert_blessed nor allow_tags settings are
enabled (or TO_JSON/FREEZE method missing) at
/home/alex/perl5/lib/perl5/JSON.pm line 154.
What does this mean and why did that line fix it?
After adding those options, it just gives null when I try to print the JSON. Why is this happening and how do I fix it?
Alternatively, is there a better way of doing what I am trying to do here?
The most common way of representing an RSA public key as text is the PEM encoding. Unfortunately, Crypt::RSA does not provide any way to convert to or from this format, or indeed any other standard format. Don't use it!
Instead, I'd recommend that you use Crypt::OpenSSL::RSA. Generating a private key and printing its public form with this module is simple:
use Crypt::OpenSSL::RSA;
my $key = Crypt::OpenSSL::RSA->generate_key(512);
print $key->get_public_key_string;
This will output a PEM encoding like the following:
-----BEGIN RSA PUBLIC KEY-----
MEgCQQDd/5F9Rc5vsNuKBrd4gfI4BDgre/sTBKu3yXpk+8NjByKpClsi3IQEGYeG
wmv/q/1ZjflFby1MPxMhXZo/82CbAgMBAAE=
-----END RSA PUBLIC KEY-----
Apart from the already mentioned PEM, there is also the JWK (JSON Web Key) format. Have a look at Crypt::PK::RSA (my module), which supports generating, importing and exporting RSA keys in both PEM and JWK formats.
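To make that concrete, a small sketch written against Crypt::PK::RSA's documented interface (key size and transport are up to you):
use strict;
use warnings;
use Crypt::PK::RSA;

my $pk = Crypt::PK::RSA->new();
$pk->generate_key(256, 65537);                     # 256 bytes = 2048-bit modulus

my $public_jwk = $pk->export_key_jwk('public');    # a JSON string, ready to send
my $public_pem = $pk->export_key_pem('public');    # PEM, if you prefer that instead
print $public_jwk, "\n";

# On the receiving end, import the JWK back into a key object:
my $peer = Crypt::PK::RSA->new(\$public_jwk);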