GPG public/private key cannot be accessed correctly from AWS Secrets Manager via Python 3 - json

I am using the python-gnupg package to create a GPG public/private key pair. I store the generated private key in AWS Secrets Manager as follows.
Key: private_key
Value: -----BEGIN PGP PRIVATE KEY BLOCK-----
Version: GnuPG v2.0.22 (GNU/Linux)
lQO+BF37qDIBCADXq0iJVRYFb43+YU8Ts63hDgZl49ZNdnDVhd9H0JMXRHqtPqt9
bbFePPN47NRe6z6GsbPaPmDqEE9l3KjFnSZB/yCii+2wHZR0ij2g3ATbiAbOoQQy
I6bbUADmHtcfIJByoXVoDk489nUPt84Xyp1lHiBfCtUmq4w62Okq6InlRhxjxcEx
VvSXaCY8YnEXUAgNGpvcKHDejGS9V4djh7r7lgJ/Y+3Xb2eepOfiaCx2Cn8ZMI0q
7eWH0MmSeR4ueOLeb79ZKjpJraBfV91XplgHHiM18oECWWwsQCFiwi1GVOLpX6Fh
HIoUyaRAW2vZyFcNnO7iLbetie6fE884lfHxABEBAAH+AwMCO+Qoh7o3GWVga9f2
gHEeuGGH4KB3aspQZt/zwKpx7YlDB/uLd4q7JQt35nIH6pfYdMwgQt001CRhsGvX
QVKIkvipQvJZgCO8Nix7xYCukH0cI4TXD7S9BmRNMCPi74+Q1J3cDfKHCseynMNF
GzBeCDx6LW3CVfKKs0Mc2ecSl0c8gxaPDi3AfACRMefEAuVQyo82qJKjpI+O/Yik
z40C5OgK0XfetKstxcH4B0bx0o/PrUpYFM/gHHgFkbpVg5citcvFY4VcEkWryVcg
yF0qBPXP0OKBtCUU1ZGiCwRJy8iGd/dOOICcSCfMNy+jzzM3FSVzei69x7MYt3xu
IzCsmHpDvpdL7tiDDHgwajZeFFPTzf7Ic90K6TapQ3H59xPMxnL9K5o9rP1glRY0
8e4zYjYxg9A6Yl3K5zdqs+M1A3Os70HUlWZXZ4LQNcidPd1rhnPnm9eXkyV2ScXl
dE38aOA5pnrL0WZUM3/OLAToMP6h4rjw9WLqqgWlrl6yz9bhZrfRxlhZaEtNs1bi
pgrmPK/a5fK++BjMSuA94EkXTVNjKWNQBzcmrff27M1TMwN+34NWj3dk/a1gyflP
QZgK3MT+0GaMCcvy1EoZ87ffLQrWwFJOw5nT83yG7VBbuerSEk/tk30bxmYN6HzO
zvQgSjDiiH+ANXVupnzDjjBREmH6V1Hv+7Q0vrjKQHd3eYvKJpAWfFr9kO8DzKck
ZkSMj487SjlHbh33z1yupuwAtjyYQ5tN1adSlDa92t0Q08udnFDQtxXEnL6rw/Du
llEuCEVC9UYcNwwQGMsGXQBFFfj1389WHr0hkSOvyS1nPiIku5kNXDhSWq7/okTS
FwnCt+wbZa6TWbXjwKzHzu4LOarV1s8DnYHKNH6HHIqsVR2oJuIuqhyREAqjeP/T
3bQjQXV0b2dlbmVyYXRlZCBLZXkgPG1laHVsQHBoZWFhLm9yZz6JATkEEwECACMF
Al37qDICGy8HCwkIBwMCAQYVCAIJCgsEFgIDAQIeAQIXgAAKCRDO+i9CZ70SvqMn
CACCmdzqZW68j1E45XTHz3fvqdft6fXOyrlMuDdcH2y7Zrl5JS7PlCeHzIcsSMlH
wDYpCG8km7nwZsnWqKsOXFWq1nq/j7Kv5AzR7UmPzTw/1HFSVhIFA0ZZMHAnwp7Y
bcAT+ssvo4To9CjzRp/ZI1k26RFXPWuXETa41DBIVz13Ss4SIaf7UG9FQ55o+2BA
TP48yCQqktiWOoZ0rV1ALSFlE4Gs3UWHcYxxCABA0JB4+FuCRfB8QMreLwFb47wc
dIitbVl0mQx5IXCkqhJKqR62rRy25Put4xnPhXGtXqfoYDVYvYvlsl/FA35cX+Z1
QODnLq/jQ7ZPdaFC7cFqxztk
=RvGa
-----END PGP PRIVATE KEY BLOCK-----
Key: passphrase
Value: secret123
All I want to do is extract the key/value pair from AWS Secrets Manager, import the key, and later decrypt a file.
JSON does not preserve newline characters in a multi-line value, so gpg.import_keys fails to import the private key. If I read the same private key from a local file instead, there is no problem. Is there any workaround for this issue?
try:
    secretkey = self.get_secret(secretName)
    if not secretkey:
        self.logger.error("Empty secret key")
        sys.exit(0)
    newdict = json.loads(secretkey)
    # newdict = ast.literal_eval(secretkey)
    private_key = newdict['private_key']
    # private_key = open('/home/ec2-user/GPG/test_private_key.asc').read()
    passphrase = newdict['passphrase']
    gpg = gnupg.GPG(gnupghome=gpgHomeDir)
    import_result = gpg.import_keys(private_key)
    count = import_result.count
    if count == 0:
        self.logger.error("Failed to import private key")
        sys.exit(1)
    dataPath = srcDir + "/" + self.dataSource
    for root, folders, files in os.walk(dataPath):
        if not files:
            self.logger.info("No files found so skipping .....")
            continue
        for filename in files:
            fullpath = os.path.join(root, filename)
            self.logger.info("Fullpath = {0}".format(fullpath))
            out_file = "/tmp/" + filename
            with open(fullpath, "rb") as f:
                status = gpg.decrypt_file(f, passphrase=passphrase, output=out_file)
            if status.ok:
                s3Prefix = root.replace(srcDir + '/', '')
                s3ObjKey = s3Prefix + "/" + filename
                s3InPath = "s3://" + self.inBucketName + "/" + s3Prefix + "/" + filename
                with open(out_file, "rb") as fl:
                    self.s3Client.upload_fileobj(fl,
                                                 self.inBucketName,
                                                 s3ObjKey)
except Exception as e:
    print(str(e))
    self.logger.error(str(e))
    sys.exit(1)

I had to store the PGP key in base64 format, as follows.
import base64
import sys

import gnupg

try:
    gpg = gnupg.GPG(gnupghome="/home/guest/GPG")
    input_data = gpg.gen_key_input(key_type='RSA',
                                   key_length=2048,
                                   name_email="guest#xyz.com",
                                   passphrase="pass123")
    key = gpg.gen_key(input_data)
    ascii_armored_public_key = gpg.export_keys(key.fingerprint, armor=True)
    ascii_armored_private_key = gpg.export_keys(key.fingerprint, True, armor=True)
    b64_encoded_private_key = base64.b64encode(ascii_armored_private_key.encode())
    binaryPrivKeyFile = "/tmp/b64encoded_private_key.asc"
    with open(binaryPrivKeyFile, 'wb') as bPrivFile:
        bPrivFile.write(b64_encoded_private_key)
except Exception as e:
    print(str(e))
    sys.exit(1)
Now we have to store b64encoded_private_key.asc in AWS Secrets Manager as follows.
$ aws secretsmanager create-secret --name private-key --secret-binary fileb://b64encoded_private_key.asc --region us-east-1
We cannot store the passphrase in the same secret, so we have to create a separate secret for the passphrase as follows.
$ aws secretsmanager create-secret --name passwd --secret-string '{"passphrase" : "pass123"}' --region us-east-1
NOTE: The secret type for the private key is binary, whereas for the passphrase it is plain text.
After creating the secrets, we can use the usual AWS Secrets Manager retrieval code to get the private key and the passphrase, and decode the private key with the base64.b64decode(..) method.
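For reference, a minimal retrieval sketch (not the exact code from above; it assumes boto3 and the secret names private-key and passwd created by the commands above) could look like this:
import base64
import json

import boto3
import gnupg

sm = boto3.client('secretsmanager', region_name='us-east-1')

# boto3 returns SecretBinary as raw bytes; the file we uploaded was itself
# base64-encoded, so decode it once more to recover the ASCII-armored key.
b64_key = sm.get_secret_value(SecretId='private-key')['SecretBinary']
private_key = base64.b64decode(b64_key).decode('utf-8')

# The passphrase lives in a separate plain-text (JSON) secret.
passphrase = json.loads(sm.get_secret_value(SecretId='passwd')['SecretString'])['passphrase']

gpg = gnupg.GPG(gnupghome='/home/guest/GPG')
import_result = gpg.import_keys(private_key)
assert import_result.count > 0, "Failed to import private key"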

Secrets Manager does not require you to store the data in JSON format; it can store arbitrary strings or binary data.
You could either choose to break everything up and store it in separate secrets, or use a data format that supports newlines, such as XML.
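For example, a minimal sketch of the "store it as a plain string" approach (the secret name and file path here are only placeholders, not the original code) might be:
import boto3
import gnupg

sm = boto3.client('secretsmanager', region_name='us-east-1')

# Store the ASCII-armored key as-is, without wrapping it in JSON,
# so the newlines survive the round trip.
with open('/home/ec2-user/GPG/test_private_key.asc') as f:
    sm.create_secret(Name='gpg/private-key', SecretString=f.read())

# Later, read it back and import it directly.
private_key = sm.get_secret_value(SecretId='gpg/private-key')['SecretString']
gpg = gnupg.GPG(gnupghome='/home/ec2-user/GPG')
import_result = gpg.import_keys(private_key)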

The private key as you would normally paste it into the secret won't contain the escape sequences '\n' and '\r'.
To resolve this, copy the output of private_key as shown by the Python interpreter, which does contain those escape sequences:
private_key = open('/home/ec2-user/GPG/test_private_key.asc').read()
private_key
Place this private key into your secret and fetch it with get_secret().
Note: the value you get back from json.loads will contain an extra backslash, so you need to turn the literal two-character sequence '\n' back into a real newline with private_key.replace('\\n', '\n').
Your code will then look like this:
private_key = newdict['private_key']
private_key = private_key.replace('\\n', '\n')
Then you will be able to import the keys.
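Putting the workaround together, a minimal sketch (assuming the secret value is a JSON string whose private_key field contains escaped \n sequences; the secret name and home directory are placeholders) might be:
import json

import boto3
import gnupg

sm = boto3.client('secretsmanager', region_name='us-east-1')
secretkey = sm.get_secret_value(SecretId='my-gpg-secret')['SecretString']
newdict = json.loads(secretkey)

# Convert the literal backslash-n sequences back into real newlines.
private_key = newdict['private_key'].replace('\\n', '\n')
passphrase = newdict['passphrase']

gpg = gnupg.GPG(gnupghome='/home/ec2-user/GPG')
import_result = gpg.import_keys(private_key)
if import_result.count == 0:
    raise ValueError("Failed to import private key")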

Related

Connection issues in Storage trigger GCF

For my application, a new file uploaded to storage is read and its data is appended to a main file. The new file contains two lines: a header and an array whose values are separated by commas. The main file can reach a maximum of 265 MB, while the new files are at most 30 MB.
import os

import numpy as np

def write_append_to_ecg_file(filename, ecg, patientdata):
    file1 = open('/tmp/' + filename, "w+")
    file1.write(":".join(patientdata))
    file1.write('\n')
    file1.write(",".join(ecg.astype(str)))
    file1.close()

def storage_trigger_function(data, context):
    # Download the segment file
    download_files_storage(bucket_name, new_file_name, storage_folder_name=blob_path)
    # Read the segment file
    data_from_new_file, meta = read_new_file(new_file_name, scale=1, fs=125, include_meta=True)
    print("Length of ECG data from segment {} file {}".format(segment_no, len(data_from_new_file)))
    os.remove(new_file_name)
    # Check if the main ECG file exists
    file_exists = blob_exists(bucket_name, blob_with_the_main_file)
    print("File status {}".format(file_exists))
    data_from_main_file = []
    if file_exists:
        download_files_storage(bucket_name, main_file_name, storage_folder_name=blob_with_the_main_file)
        data_from_main_file, meta = read_new_file(main_file_name, scale=1, fs=125, include_meta=True)
        print("ECG data from main file {}".format(len(data_from_main_file)))
        os.remove(main_file_name)
        data_from_main_file = np.append(data_from_main_file, data_from_new_file)
        print("data after appending {}".format(len(data_from_main_file)))
        write_append_to_ecg_file(main_file, data_from_main_file, meta)
        token = upload_files_storage(bucket_name, main_file, storage_folder_name=main_file_blob, upload_file=True)
    else:
        write_append_to_ecg_file(main_file, data_from_new_file, meta)
        token = upload_files_storage(bucket_name, main_file, storage_folder_name=main_file_blob, upload_file=True)
The GCF is deployed with:
gcloud functions deploy storage_trigger_function --runtime python37 --trigger-resource patch-us.appspot.com --trigger-event google.storage.object.finalize --timeout 540s --memory 8192MB
For the first file, I was able to read the file and write the data to the main file. But after uploading the 2nd file, it gives Function execution took 70448 ms, finished with status: 'connection error'. On uploading the 3rd file, it gives Function invocation was interrupted. Error: memory limit exceeded. Despite deploying the function with 8192 MB of memory, I am still getting this error. Can I get some help on this?

How to efficiently parse JSON data with multiple keys in Python 2.7?

I'm writing a script that will check the CVS COVID vaccine availability for cities in my state of VA. I have been successful in getting the data I'm looking for, but my code is hard-coded in some areas. I'm specifically asking for help improving my code in areas 1 and 2 below:
The JSON file can be found here:
https://www.cvs.com//immunizations/covid-19-vaccine.vaccine-status.VA.json?vaccineinfo
I'm trying to access the data in the responsePayloadData key. The only way I could figure out how to do this is to make it the only key. For that reason, I deleted the other key responseMetaData:
#remove the key that we don't need
del obj['responseMetaData']
I'm also not sure how to dynamically loop through the VA items without hard coding the number of cities I know are there in the data:
for x, y in obj.items():
    for a in range(34):
Here's the full code:
import requests
import json
import time
from datetime import datetime
import urllib2

try:
    import indigo
except:
    pass

strAvail = "False"
strAvailCity = "None"

try:
    # download raw json object from CVS Virginia Website
    url = "https://www.cvs.com//immunizations/covid-19-vaccine.vaccine-status.VA.json?vaccineinfo"
    data = urllib2.urlopen(url).read().decode()
except urllib2.HTTPError, err:
    return {"error": err.reason, "error_code": err.code}

# parse json object
obj = json.loads(data)

# remove the key that we don't need
del obj['responseMetaData']

# loop through the JSON dictionary and check availability
# status options: {"Fully Booked", "Available"}
for x, y in obj.items():
    for a in range(34):
        # print('City: ' + y['data']['VA'][a]['city'])
        # print('Total Available: ' + y['data']['VA'][a]['totalAvailable'])
        # print('Percent Available: ' + y['data']['VA'][a]['pctAvailable'])
        # print('Status: ' + y['data']['VA'][a]['status'])
        # print("------------------------------")
        # If there is availability anywhere in the state, take some action.
        if y['data']['VA'][a]['status'] == "Available":
            strAvail = True
            strAvailCity = y['data']['VA'][a]['city']

# Log timestamp for this check to the JSON
now = datetime.now()
strDateTime = now.strftime("%m/%d/%Y %I:%M %p")
EDIT: Since the JSON is not available outside the US, I've pasted it below:
{"responsePayloadData":{"currentTime":"2021-02-11T14:55:00.470","data":{"VA":[{"totalAvailable":"1","city":"ABINGDON","state":"VA","pctAvailable":"0.19%","status":"Fully Booked"},{"totalAvailable":"0","city":"ALEXANDRIA","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"ARLINGTON","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"BEDFORD","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"BLACKSBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"CHARLOTTESVILLE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"CHATHAM","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"CHESAPEAKE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"1","city":"DANVILLE","state":"VA","pctAvailable":"0.19%","status":"Fully Booked"},{"totalAvailable":"2","city":"DUBLIN","state":"VA","pctAvailable":"0.39%","status":"Fully Booked"},{"totalAvailable":"0","city":"FAIRFAX","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"FREDERICKSBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"GAINESVILLE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"HAMPTON","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"HARRISONBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"LEESBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"LYNCHBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"MARTINSVILLE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"MECHANICSVILLE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"MIDLOTHIAN","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},
{"totalAvailable":"0","city":"NEWPORT NEWS","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"NORFOLK","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"PETERSBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"PORTSMOUTH","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"RICHMOND","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"ROANOKE","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},
{"totalAvailable":"0","city":"ROCKY MOUNT","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"STAFFORD","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"SUFFOLK","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},
{"totalAvailable":"0","city":"VIRGINIA BEACH","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"WARRENTON","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"WILLIAMSBURG","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"WINCHESTER","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"},{"totalAvailable":"0","city":"WOODSTOCK","state":"VA","pctAvailable":"0.00%","status":"Fully Booked"}]}},"responseMetaData":{"statusDesc":"Success","conversationId":"Id-beb5f68730b34e6aa3bbc1fd927ea12b","refId":"Id-b4a7256078789eb59b8912b4","operation":"getInventorybyCity","statusCode":"0000"}}
Regarding problem 1, you can just access the data by key. You don't need to delete the other key:
payload = obj['responsePayloadData']
For the second problem, you can just iterate over the items in the list at payload['data']['VA']:
for city in payload['data']['VA']:
    print(city)
{'city': 'ABINGDON',
'pctAvailable': '0.19%',
'state': 'VA',
'status': 'Fully Booked',
'totalAvailable': '1'}
{'city': 'ALEXANDRIA',
'pctAvailable': '0.00%',
'state': 'VA',
'status': 'Fully Booked',
'totalAvailable': '0'}
...
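Putting both suggestions together, the del call and the hard-coded range(34) can be dropped entirely; a minimal sketch that reuses the obj and availability variables from the question might be:
# Access the payload by key instead of deleting responseMetaData,
# and iterate over however many cities the list actually contains.
payload = obj['responsePayloadData']

strAvail = False
strAvailCity = None
for city in payload['data']['VA']:
    if city['status'] == "Available":
        strAvail = True
        strAvailCity = city['city']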

Google Cloud KMS issue with decrypt

I'm new to Cloud KMS, and I started by following exactly what's written here.
I encrypted my data file, which is saved in UTF-8 format, by running this command:
gcloud kms encrypt --location global --keyring ring --key key --plaintext-file /path_to_file --ciphertext-file /path_to_enc --project myProject
As a result, my encrypted data appears in this format in the newly created encrypted file:
$�]ˋLݿ���yHI�lS�`&�Nt�b{%�U�� �&�A���XaL��d
Here is how I read the encrypted file data:
static Properties properties = new Properties();

static {
    try {
        InputStream in = new Credentials().getClass().getResourceAsStream("path_to_enc_file");
        byte[] encryptedData = IOUtils.toByteArray(in);
        byte[] decryptedBytes = decrypt(EnvironmentVariable.getProjectId(), "global", "ring", "key", encryptedData);
        ByteArrayInputStream bis = new ByteArrayInputStream(decryptedBytes);
        properties.load(bis);
        in.close();
        bis.close();
    } catch (IOException e1) {
        e1.printStackTrace();
    }
}
Now, whenever I try to decrypt it with this function:
public static byte[] decrypt(
    String projectId, String locationId, String keyRingId, String cryptoKeyId, byte[] ciphertext)
    throws IOException {
  // Create the KeyManagementServiceClient using try-with-resources to manage client cleanup.
  try (KeyManagementServiceClient client = KeyManagementServiceClient.create()) {
    // The resource name of the cryptoKey
    String resourceName = CryptoKeyName.format(projectId, locationId, keyRingId, cryptoKeyId);
    // Decrypt the ciphertext with Cloud KMS.
    DecryptResponse response = client.decrypt(resourceName, ByteString.copyFrom(ciphertext));
    // Extract the plaintext from the response.
    return response.getPlaintext().toByteArray();
  }
}
it throws this:
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Decryption failed: the ciphertext is invalid.",
    "reason" : "badRequest"
  } ],
  "message" : "Decryption failed: the ciphertext is invalid.",
  "status" : "INVALID_ARGUMENT"
}
The key type is Symmetric encrypt/decrypt, with the default algorithm Google symmetric key.
The key ring location is global.
Can you please help me out and tell me what's missing from the Google docs?
Update: As bdhess says in the comment, this is probably due to Maven being "helpful" and corrupting the data during the build process. See the Maven docs for how to avoid this.
The solution below also works, but is less straightforward.
Tamer and I chatted for a while and got a workaround:
Encode the output from gcloud in base64 before including it in a file in src/main/resources.
Decode the file after reading it with java.util.Base64.
Pass the decoded bytes to the KMS API.
For some reason the bytes were getting corrupted between creating the file with gcloud and reading the bytes in with getResourceAsStream(). From the code above I can't see where the corruption would be happening, and it seems like reading in binary resources should be totally supported. But something is breaking somewhere in Tamer's case.
I'll try to reproduce it sometime this week.
To decrypt a secret from a file to a plaintext file:
cat secret.enc | gcloud kms decrypt \
--location=global \
--keyring=keyring \
--key=key \
--ciphertext-file=- \
--plaintext-file=decrypted_secret.txt
You'd need to decode the encrypted key first with base64, and then pipe the output to the whole gcloud kms command, e.g.:
cat my-token.enc | base64 --decode | gcloud kms decrypt --plaintext-file=plaintextfile --ciphertext-file=- --location=global --keyring=yourkeyringname --key=yourkeyname
I made these modifications and then it worked like a charm, with great help from @hjfreyer.
1. To encrypt the plain-text secret:
Run this command:
gcloud kms encrypt --location global --plaintext-file PATH_TO_SECRET_FILE --ciphertext-file PATH_TO_TMP_FILE --project myProject --key key --keyring ring
Encode the result with base64:
base64 PATH_TO_TMP_FILE > PATH_TO_FINAL_ENC_FILE
Remove the newline from the FINAL_ENC_FILE file.
2. To decrypt the data back, first base64-decode it, then pass it to the KMS decrypt function:
InputStream in = new Credentials().getClass().getResourceAsStream("PATH_TO_FINAL_ENC_FILE");
byte[] encryptedData = IOUtils.toByteArray(in);
byte[] decryptedBytes = decrypt(EnvironmentVariable.getProjectId(), "global", "ring", "key", Base64.getDecoder().decode(encryptedData));

failing to decrypt blob passwords only once in a while using amazon kms

import os, sys

AWS_DIRECTORY = '/home/jenkins/.aws'
certificates_folder = 'my_folder'
SUCCESS = 'success'

class AmazonKMS(object):
    def __init__(self):
        # making sure boto3 has the certificates and region files
        result = os.system('mkdir -p ' + AWS_DIRECTORY)
        self._check_os_result(result)
        result = os.system('cp ' + certificates_folder + 'kms_config ' + AWS_DIRECTORY + '/config')
        self._check_os_result(result)
        result = os.system('cp ' + certificates_folder + 'kms_credentials ' + AWS_DIRECTORY + '/credentials')
        self._check_os_result(result)

        # boto3 is the amazon client package
        import boto3
        self.kms_client = boto3.client('kms', region_name='us-east-1')
        self.global_key_alias = 'alias/global'
        self.global_key_id = None

    def _check_os_result(self, result):
        if result != 0 and raise_on_copy_error:
            raise FAILED_COPY

    def decrypt_text(self, encrypted_text):
        response = self.kms_client.decrypt(
            CiphertextBlob=encrypted_text
        )
        return response['Plaintext']
When using it:
amazon_kms = AmazonKMS()
amazon_kms.decrypt_text(blob_password)
I'm getting:
E ClientError: An error occurred (AccessDeniedException) when calling the Decrypt operation: The ciphertext refers to a customer master key that does not exist, does not exist in this region, or you are not allowed to access.
The stack trace is:
../keys_management/amazon_kms.py:77: in decrypt_text
CiphertextBlob = encrypted_text
/home/jenkins/.virtualenvs/global_tests/local/lib/python2.7/site-packages/botocore/client.py:253: in _api_call
return self._make_api_call(operation_name, kwargs)
/home/jenkins/.virtualenvs/global_tests/local/lib/python2.7/site-packages/botocore/client.py:557: in _make_api_call
raise error_class(parsed_response, operation_name)
This happens in a script that runs once an hour.
It only fails 2-3 times a day; after a retry it succeeds.
I tried upgrading from boto3 1.2.3 to 1.4.4.
What is the possible cause of this behavior?
My guess is that the issue is not in anything you described here.
Most likely the login tokens time out, or something along those lines. To investigate this further, a closer look at the way the login works here would probably help.
How does this code run? Is it running inside AWS, like on Lambda or EC2? Do you run it from your own server (it looks like it runs on Jenkins)? How is the login access established? What are those kms_credentials used for, and what do they look like? Do you do something like assuming a role (which would probably work through access tokens that will stop working after some time)?
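One quick way to investigate is to log which identity boto3 actually resolves to on each run, so the intermittent failures can be correlated with a credential or role change. A minimal, purely diagnostic sketch (using the standard STS GetCallerIdentity call; the region and placement in the script are assumptions) could be:
import boto3

# Print the identity the current credentials resolve to before calling KMS,
# so failing runs can be compared against successful ones.
sts = boto3.client('sts', region_name='us-east-1')
identity = sts.get_caller_identity()
print("Running as {Arn} (account {Account})".format(**identity))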

Converting RSA keys to JSON in Perl

I need to find a way of transferring an RSA public key to a server for my network communication program. I have done some research, and it seems that the easiest way to do this is to convert the public key (which is stored as some kind of hash reference) to JSON for transmission. However, in my test code I cannot get the key to convert to JSON. Here is my test program:
use strict;
use warnings;
use Crypt::RSA;
use JSON;

my %hash = ( name  => "bob",
             age   => 123,
             hates => "Perl"
);

my $hash_ref = \%hash;
my $hash_as_json = to_json($hash_ref);
print $hash_as_json, "\n";    # Works fine for a normal hash

my $rsa = new Crypt::RSA;
my ($public, $private) = $rsa->keygen (
    Identity  => 'client',
    Size      => 512,
    Password  => 'password',
    Verbosity => 1,
) or die $rsa->errstr();

my $key_hash_as_json = to_json($public, {allow_blessed => 1, convert_blessed => 1});
print $key_hash_as_json, "\n";
Before I added {allow_blessed => 1, convert_blessed => 1}, I got an error message saying:
encountered object 'Crypt::RSA::Key::Public=HASH(0x3117128)', but
neither allow_blessed, convert_blessed nor allow_tags settings are
enabled (or TO_JSON/FREEZE method missing) at
/home/alex/perl5/lib/perl5/JSON.pm line 154.
What does this mean and why did that line fix it?
After adding that option, it just gives null when I try to print the JSON. Why is this happening, and how do I fix it?
Alternatively, is there a better way of doing what I am trying here?
The most common way of representing an RSA public key as text is the PEM encoding. Unfortunately, Crypt::RSA does not provide any way to convert to or from this format, or indeed any other standard format. Don't use it!
Instead, I'd recommend that you use Crypt::OpenSSL::RSA. Generating a private key and printing its public form with this module is simple:
use Crypt::OpenSSL::RSA;
my $key = Crypt::OpenSSL::RSA->generate_key(512);
print $key->get_public_key_string;
This will output a PEM encoding like the following:
-----BEGIN RSA PUBLIC KEY-----
MEgCQQDd/5F9Rc5vsNuKBrd4gfI4BDgre/sTBKu3yXpk+8NjByKpClsi3IQEGYeG
wmv/q/1ZjflFby1MPxMhXZo/82CbAgMBAAE=
-----END RSA PUBLIC KEY-----
Apart from the already mentioned PEM, there is also the JWK format (JSON Web Key). Have a look at Crypt::PK::RSA (my module), which supports generating, importing, and exporting RSA keys in both PEM and JWK formats.