Passing the value from a ruby block to a resource in chef - json

I am writing a Chef resource that generates passwords, and I am calling that resource in a recipe with a list of inputs.
Here is my scenario: once my resource has executed, a new set of passwords is generated in a folder, and I want to retrieve the newly generated password. But I am unable to retrieve it, because the code that reads the value runs before the resource does; the resource only executes during the convergence phase.
Simple code block to explain my scenario:
Chef::Log.info("Creating new keys")
create_password 'New Password is being generated' do
action :change_passwords
password_util_dir node[:password][:passwd_util_dir]
rgbu_chef node[:password][:rgbu_chef]
old_data_bags node[:password][:old_data_bags]
new_data_bags node[:password][:new_data_bags]
end
The code above creates new passwords in a folder.
Later, I try to read the passwords with a JSON parser:
text = ::File.read("#{new_password_dir}")
data_hash = JSON.parse(text)
new_wls_password = data_hash['rase_wlsadmin_pwd']
Here #{new_password_dir} is the path to the newly created password.json file.
I am trying to use the value of new_wls_password in another resource, like below:
Chef::Log.info("Updating WLSADMIN Password")
passwd_backup 'Updating wlsadmin password' do
action :update_wlsadmin
osuser node[:password][:wls_config_user]
usergroup node[:password][:wls_install_group]
new_wls_password "#{new_wls_password}"
end
Here, the new password that I am trying to retrieve is empty, because the following three lines are executed first (during the compile phase):
text = ::File.read("#{new_password_dir}")
data_hash = JSON.parse(text)
new_wls_password = data_hash['rase_wlsadmin_pwd']
So by that time, the resource that creates the new passwords has not yet run.
I have tried many Stack Overflow suggestions, such as:
putting those three lines in a ruby_block like this:
ruby_block "new_password" do
block do
text =::File.read("#{new_password_dir}")
data_hash = JSON.parse(text)
node.set[:new_wls_password] = data_hash['rase_wlsadmin_pwd']
end
end
Then I tried fetching the value in the resource, as below:
Chef::Log.info("Updating WLSADMIN Password")
passwd_backup 'Updating wlsadmin password' do
action :update_wlsadmin
osuser node[:password][:wls_config_user]
usergroup node[:password][:wls_install_group]
new_wls_password "#{node[:new_wls_password]"
end
With the above approach, the value is still empty.
Wrapping the value in lazy and then calling it.
Passing the value from one ruby_block to another ruby_block, which I can do, but not from a ruby_block to a resource.
Please, can you help?
EDIT #1:
I need to pass the value from the resource to the template.
Something like this, after running the following resource:
Chef::Log.info("Creating new keys")
create_password 'New Password is being generated' do
action :change_passwords
password_util_dir node[:password][:passwd_util_dir]
rgbu_chef node[:password][:rgbu_chef]
old_data_bags node[:password][:old_data_bags]
new_data_bags node[:password][:new_data_bags]
end
A new set of passwords is generated in a folder, e.g. the /tmp/password.json file.
After the resource execution above, I am writing a template like:
template "#{Chef::Config[:file_cache_path]}/domain.properties" do
source 'domain_properties.erb'
variables
({ :domain_name => "#{domain_name}",
:admin_url => "#{admin_url}",
:new_wls_password => "#{new_wls_password}" })
end
Here, how can I get the newly created value of new_wls_password into the template?

You can use a lazy attribute, which defers evaluating the value until the resource actually converges (i.e. after create_password has run), like below:
Chef::Log.info("Updating WLSADMIN Password")
passwd_backup 'Updating wlsadmin password' do
action :update_wlsadmin
osuser node[:password][:wls_config_user]
usergroup node[:password][:wls_install_group]
new_wls_password lazy { JSON.parse(File.read("/tmp/password.json"))['rase_wlsadmin_pwd'] }
end
The template resource can be written as:
template "#{Chef::Config[:file_cache_path]}/domain.properties" do
source 'domain_properties.erb'
variables (lazy{{ :domain_name => "#{domain_name}",
:admin_url => "#{admin_url}",
:new_wls_password => JSON.parse(File.read("/tmp/password.json"))['rase_wlsadmin_pwd'] }})
end
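For reference (this is a sketch, not part of the original answer), the domain_properties.erb behind it could be as simple as the following, since the template variables are exposed to the ERB file as instance variables:
domain_name= <%= @domain_name %>
admin_url= <%= @admin_url %>
new_wls_password= <%= @new_wls_password %>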
Output:
* template[/tmp/kitchen/cache/domain.properties] action create
- create new file /tmp/kitchen/cache/domain.properties
- update content in file /tmp/kitchen/cache/domain.properties from none to fa22e0
--- /tmp/kitchen/cache/domain.properties 2017-01-12 03:30:13.002968715 +0000
+++ /tmp/kitchen/cache/.chef-domain20170112-11387-1ytkyk2.properties 2017-01-12 03:30:13.002968715 +0000
@@ -1 +1,4 @@
+domain_name= mmm
+admin_url= nnn
+new_wls_password= xH#3zIS9Q4Hc#B
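If you would rather parse password.json only once instead of in every lazy block, a variation (a sketch under the same /tmp/password.json assumption, not something taken from the answer above) is to parse the file in a ruby_block at converge time, stash the value in node.run_state, and read it back lazily:
ruby_block 'load generated passwords' do
  block do
    # runs at converge time, i.e. after create_password has written the file
    data = JSON.parse(::File.read('/tmp/password.json'))
    node.run_state['new_wls_password'] = data['rase_wlsadmin_pwd']
  end
end

passwd_backup 'Updating wlsadmin password' do
  action :update_wlsadmin
  osuser node[:password][:wls_config_user]
  usergroup node[:password][:wls_install_group]
  # lazy defers this read until the resource converges, after the ruby_block above
  new_wls_password lazy { node.run_state['new_wls_password'] }
end
node.run_state is an in-memory hash that lives only for the duration of the Chef run, so the password is never saved back to the node object on the server.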

Related

dash error loading dependencies when adding an editable table

I'm trying to add an editable table to my Dash app and use the edited data in the following step.
First, I have a callback that is triggered by a button and receives a file; the function processes the file, and the output is the editable table plus some JSON data. The table shows on the screen.
Then, I want the user to make the necessary changes in the table and click another button.
The click triggers another callback that should receive the edited data from the table plus the JSON data from the previous callback, and output another piece of JSON data.
However, when I test the function, I get "error loading dependencies".
I'm using a very old version of Dash (0.43) and at the moment I can't update (many functions were deprecated and I can't change them all now).
@dash_application.callback(
    [Output('journey-data-store-raw', 'data'),
     Output('new-lable-table', 'children'),  # this goes to a div in the layout file
     Output('source-table-updated', 'data')],
    [Input('parser-process', 'n_clicks')],
    [State('upload-data', 'contents'),
     State('upload-data', 'filename'),
     State('decimal-selection', 'value')]
)
def source_data(_, content, name, decimal_selection):
    """Show the sources to allow the user to edit it"""
    if name:
        clean_data = get_clean_data(content, name, decimal_selection)
        clean_data.to_csv('clean_data_test.csv')
        return df_to_json(clean_data), dash_table.DataTable(**get_sources(clean_data)), json.dumps({'updated': True})
    else:
        print('deu ruim')
        raise dash.exceptions.PreventUpdate
@dash_application.callback(
    [Output('journey-data-store', 'data'),
     Output('color-store', 'data')],
    [Input('run-analysis', 'n_clicks')],
    [State('sources-table', 'data'),
     State('journey-data-store-raw', 'data')]
)
def store_data(_, new_table, raw_data):
    """Stores the datafile and colors in a Store object"""
    i = 0
    for row in new_table:
        if row['new source labels'] != '':
            i = 1
            break
    if i > 0:
        # call function to "parse file"
        # colors = get_colors(new_data)
        # return df_to_json(clean_data), json.dumps(colors)
        # the return is only a test, I'd develop the function later; I just wanna test and make
        # the callback work
        return raw_data, json.dumps(get_colors(df_from_json(raw_data)))
    else:
        return raw_data, json.dumps(get_colors(df_from_json(raw_data)))
I tried excluding the button and the sources-table from the callback, so that it would trigger when the first callback finishes (when journey-data-store-raw becomes available), but that is not happening either.
I also tried running it in a private window.

Passing parameter from feature file to another feature file [duplicate]

https://github.com/intuit/karate#calling-other-feature-files
The link above contains an example of calling a feature file in order to reuse the code. The feature file which is reused is called with the inputs
Background:
* configure headers = read('classpath:my-headers.js')
* def signIn = call read('classpath:my-signin.feature') { username:'john', password: 'secret' }
* def authToken = signIn.authToken
The called my-signin.feature:
Scenario:
Given url loginUrlBase
And request { userId: '#(username)', userPass: '#(password)' }
When method post
Then status 200
And def authToken = response
...
In this example the my-signin.feature must be run with the inputs username and password. I know that if you had the following:
Background:
* def username = "foo"
* def password = "secret"
at the top of the my-signin.feature file, the parameters passed in by the feature attempting to reuse the feature file would be overwritten.
My question is:
If reuse is the main interest of being able to call other feature files, is there a way to have the calling feature file overwrite the username and password parameters if they had been defined in the background?
It seems to me that having the background overwrite the input parameters instead of vice versa makes it harder to reuse *.feature files. I know I found it a little frustrating on my project not being able to reuse tests I had already written without refactoring out the reusable code into another file.
Any called feature in Karate will have a magic variable __arg; you can check for it before assigning values to your variables in the called script.
Background:
* def username = (__arg == null) ? "foo" : __arg.username
* def password = (__arg == null) ? "secret" : __arg.password
This checks whether values were passed;
if none are passed, it assigns the defaults:
* def signIn = call read('classpath:my-signin.feature')
If arguments are passed, the passed values are assigned:
* def signIn = call read('classpath:my-signin.feature') { username: 'notfoo', password: 'notsecret' }
For simplicity, don't have any other parameters that need to be passed besides these.

Rails 2 hook to modify data before it is read/written to MySQL DB

I have a Rails 2 application that I am trying to modify so that before an attribute is written to my MySQL DB it is encoded, and on read it is decoded (not all attributes, just pre-determined ones).
I have looked at some gems, specifically attr-encrypted, but it doesn't do exactly what I want (I am also trying to avoid re-naming any of my existing table columns, which appears to be a requirement for attr-encrypted).
I have added a before_save filter to my model to do the attribute modification before it is saved to the DB, and I have overridden my attribute getter to do the decode. While this works, I want to do the decode lower in the stack (i.e. right after DB read) in order to have everything function correctly, without requiring system wide changes (it also simplifies the logic when deciding when to encode/decode).
So what it means is that I want to do the following:
1) On DB read, do the reverse, so that if I do a Model.last, the value of my attribute would be the decoded value (without having to explicitly call the attribute getter).
2) Override the find_by_* methods so that doing a search by my encoded attribute will encode the search term first, then do the db query using that value.
How would I go about doing that?
Update: this method unfortunately does not work in Rails 2. Custom serializers were probably added in Rails 3.
Original answer follows:
I think you can try to use a custom serializer as described in this blog post. This feature should be present even in Rails 2 (otherwise I guess these SO questions regarding it would not exist).
Sample serializer which encodes the attribute into Base64:
# app/models/model.rb
class Model < ActiveRecord::Base
  serialize :my_attr, MyEncodingSerializer
end

# lib/my_encoding_serializer.rb
class MyEncodingSerializer
  require "base64"

  def self.load(value)
    # called when loading the value from DB
    value.present? ? Base64.decode64(value) : nil
  end

  def self.dump(value)
    # called when storing the value into DB
    value.present? ? Base64.encode64(value) : nil
  end
end
Test in the rails console:
>> Model.create(my_attr: "my secret text")
D, [2016-03-14T07:17:26.493598 #14757] DEBUG -- : (0.1ms) BEGIN
D, [2016-03-14T07:17:26.494676 #14757] DEBUG -- : SQL (0.6ms) INSERT INTO `models` (`my_attr`) VALUES ('bXkgc2VjcmV0IHRleHQ=\n')
D, [2016-03-14T07:17:26.499356 #14757] DEBUG -- : (4.4ms) COMMIT
=> #<Model id: 4, my_attr: "my secret text">
You can see that the my_attr value gets automatically encoded before saving to the DB.
Loading from DB of course works transparently too:
>> Model.last
D, [2016-03-14T07:19:01.414567 #14757] DEBUG -- : Model Load (0.2ms) SELECT `models`.* FROM `models` ORDER BY `models`.`id` DESC LIMIT 1
=> #<Model id: 4, my_attr: "my secret text">
All finder helpers should work too, for example:
>> Model.find_by_my_attr("other text")
D, [2016-03-14T07:20:06.125670 #14757] DEBUG -- : Model Load (0.3ms) SELECT `models`.* FROM `models` WHERE `models`.`my_attr` = 'b3RoZXIgdGV4dA==\n' LIMIT 1
=> nil # nothing found here for wrong my_attr value
>> Model.find_by_my_attr("my secret text")
D, [2016-03-14T07:21:04.601898 #14757] DEBUG -- : Model Load (0.6ms) SELECT `models`.* FROM `models` WHERE `models`.`my_attr` = 'bXkgc2VjcmV0IHRleHQ=\n' LIMIT 1
=> #<Model id: 4, my_attr: "my secret text"> # FOUND!
It looks like Rails 2 has the after_initialize callback, which should get you what you want (at a bit of a performance hit):
class Model < ActiveRecord::Base
  after_initialize do |model|
    # your decryption code here
  end
end
http://guides.rubyonrails.org/v2.3.11/activerecord_validations_callbacks.html#after-initialize-and-after-find
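For illustration only, here is a rough sketch that pairs the before_save hook the question already uses with this after_initialize callback, with Base64 standing in for the real encoding (it glosses over edge cases such as brand-new, not-yet-encoded instances and re-saving an instance you keep using in memory):
require "base64"

class Model < ActiveRecord::Base
  # encode just before the row is written to the DB
  before_save :encode_my_attr

  # decode whenever an object is instantiated (new or loaded from the DB)
  after_initialize do |model|
    model.my_attr = Base64.decode64(model.my_attr) if model.my_attr
  end

  private

  def encode_my_attr
    self.my_attr = Base64.encode64(my_attr) if my_attr
  end
end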

mysql and rspec bug

In Sequel Pro, I created a table using this statement:
CREATE TABLE dogs(
  id INT PRIMARY KEY NOT NULL,
  name TEXT,
  color TEXT
);
(Auto-increment, under Extra in Structures, is checked so that Sequel Pro generates primary keys automatically.)
Using mysql2, I wrote the method insert in the Ruby file classdog.rb to insert a new dog into the dogs table.
classdog.rb is below in its entirety:
require 'mysql2'
require "debugger"

class Dog
  attr_accessor :name, :color, :id

  @@db = Mysql2::Client.new(:host => '127.0.0.1', :username => 'root', :database => 'dogs')

  def initialize(name, color)
    @name = name
    @color = color
  end

  def self.db
    @@db
  end

  def db
    @@db
  end

  def insert
    db.query("INSERT INTO dogs(name, color) VALUE('#{name}', '#{color}')")
  end
end
dog = Dog.new("simba", "grey")
puts dog.insert
To check that my code is working, I created this RSpec file:
require "./classdog"
describe Dog do
describe "#insert" do
it "should insert a dog into the database" do
dog = Dog.new("simba", "grey")
sql_command = "SELECT * FROM dogs WHERE name = '#{dog.name}'";
row_hash = {"id" => 1, "name" => "simba", "color" => "grey"}
expect(Dog.db.query(sql_command).first).to eq(row_hash)
end
end
end
When I run my spec file using this command:
rspec spec_classdog.rb
My tests pass.
But there are 2 things I don't understand:
A new dog only gets inserted into the table when I run my spec file, spec_classdog.rb, using rspec. But when I run my Ruby file, classdog.rb, no new dog is inserted.
Why is this happening? I expected that running my Ruby file would result in new insertions, while rspec just checks that my method works. Is it because I am not passing the parameters name and color to the insert method (meaning something like this: dog.insert("spot", "black"))?
When I have the following code in my classdog.rb file:
dog = Dog.new("simba", "grey")
puts dog.inspect
puts dog.name
puts dog.color
puts dog.id
Ruby puts:
Notice that dog.id has no output, as seen very clearly below:
dog = Dog.new("simba", "grey")
puts dog.id
Why isn't Ruby revealing the id of dog in dog.id?
Is it because id was set as a primary key when the dogs table was created?
Would adding a specific column named dog_id help?
@PeterAlfvin: here is an image showing the output of running puts dog.insert
Here are at least some of your problems:
MySQL doesn't auto-create primary key values for you unless you specify auto_increment on the column.
The insert method does not provide an id value, so it will always fail, since id is required to be non-null.
Given the above, any entries in your database were not created by the code you've shown.
Once you've addressed that issue, you've got the following:
The id value is only being created by mysql in the database, not in the Ruby Dog object, so it will always be nil in the object unless/until you set it (which you are not currently doing).
It has nothing to do with id being a primary key
Creating a dog_id attribute/column/field would have no effect on this.
Ruby is revealing the value of dog.id; its string representation just happens to be the empty string.
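To make that last point concrete, a minimal sketch (assuming the @@db client from the question and an auto_increment id column) could set the object's id right after the INSERT, using the client's last_id:
def insert
  db.query("INSERT INTO dogs(name, color) VALUE('#{name}', '#{color}')")
  # Mysql2::Client#last_id returns the AUTO_INCREMENT id generated by the last INSERT
  @id = db.last_id
end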
The reason why running the spec spec_classdog.rb results in the insertion of a new dog is that my rspec file contains SQL statements - therefore running the rspec file results in the SQL statements being carried out.
My rb file does not contain any rspec statements - classdog.rb simply exists to tell each dog object what I want it to do in Ruby-land. Also remember that in Ruby-land a dog object disappears after it is created and has carried out its call; it does not persist. Hence the need for a database - it resolves the issue of persistence.
See this question: How to create/maintain ID field in Sequel Pro via Ruby and mysql2, for the answer to the 2nd part of the question.

Rails 3 - how to find last inserted ID from mysql table

In my controller I have the following sequence of commands:
SAVE DATA INTO FIRST TABLE
_get ID of inserted item into table from first step_
SAVE DATA INTO SECOND TABLE WITH ID FROM FIRST COMMAND
if FIRST.save && SECOND.save
do something
And I am wondering how to get the id of the item that was just inserted into the database... I tried googling, but I can't find this information...
Thanks in advance for your hints
# SAVE DATA INTO FIRST TABLE
first_instance = FirstModel.new( :foo => :bar )
first_save = first_instance.save
# _get ID of inserted item into table from first step_
first_instance_id = first_instance.id
# SAVE DATA INTO SECOND TABLE WITH ID FROM FIRST COMMAND
second_save = SecondModel.new( :first_model_id => first_instance_id ).save
if first_save && second_save
# do something
end
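As a side note (not part of the answer above): if the two models are associated (a hypothetical has_many / belongs_to pair here), ActiveRecord wires the foreign key for you, so you never touch the id directly:
# assumes: FirstModel has_many :second_models and SecondModel belongs_to :first_model
first_instance = FirstModel.create( :foo => :bar )
second_instance = first_instance.second_models.create
# second_instance.first_model_id is already set to first_instance.id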
After saving a model, you can access its id attribute:
@user = User.new
puts @user.id
# => nil
@user.save
puts @user.id
# => 1
Could you just search your database by the updated_at field in your model?
To get the most recent record:
@model1 = Model1.order("updated_at DESC").limit(1)
or better yet, upon saving Model1 in the first place:
@model1 = model1.save
To assign:
@model2.model1_id = @model1.id
Note: if you actually want to save the ID of a specific record, finding the last isn't the best way to go.
This is because another record could be inserted by a different user, right after you inserted Model1 and right before you call Model2.
If you want the two to save together or not at all, you can look into transactions: http://api.rubyonrails.org/classes/ActiveRecord/Transactions/ClassMethods.html
If you're happy with Model1 saving on its own before worrying about Model2, then simply assign the variables as I did above.
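For completeness, a minimal sketch of the transactional approach mentioned above (using save! so that a failure in either model raises and rolls both inserts back):
ActiveRecord::Base.transaction do
  first_instance = FirstModel.new( :foo => :bar )
  first_instance.save!   # raises on failure, aborting the transaction
  SecondModel.new( :first_model_id => first_instance.id ).save!
end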