Difficulty creating a benchmark test using Hyperledger Caliper - configuration

I am trying to use the Hyperledger Caliper framework to measure the performance of a blockchain network based on Hyperledger Fabric, with 3 peer nodes, 1 orderer, and solo consensus.
I already have the network installed and functional, with smart contracts responding correctly on different remote virtual machines. I know I have to create a network configuration file and a benchmark configuration file, and this is where my doubts begin: all the examples I have seen reference JavaScript test files from that configuration file, but my smart contract is written in Go.
Must my tests be written in JavaScript? Can I reference a Go file in this configuration file instead? Could anyone provide an example? I have been researching for weeks but cannot make sense of the examples provided by the framework.
Any help would be appreciated, even a link I haven't seen yet that I could use to search further.

The key point is that Caliper's JavaScript talks to the peer. In other words, it does not connect directly to a specific (Go) smart contract.
Caliper (JavaScript) <-> Peer <-> Chaincode (Go)
Caliper sends a request to the peer; the peer receives the request, executes the chaincode (Go), and returns the result.
That means Caliper is completely independent of the chaincode language.
See the link below.
In the JavaScript code where Caliper actually calls the chaincode, only the chaincode name, function, and input parameters are required.
hyperledger-caliper-fabric-tutorial
(ex)
...
const contractID = 'fabcar';
const version = '1.0'; // chaincode version as installed on the peers
const chaincodeFunction = 'createCar';
const invokerIdentity = 'Admin#org1.example.com';
const chaincodeArguments = [assetID, 'blue', 'ford', 'focus', 'jim'];
const myArgs = { chaincodeFunction, invokerIdentity, chaincodeArguments };
const result = await bc.bcObj.invokeSmartContract(ctx, contractID, version, myArgs);
console.log(result);

All Caliper workloads currently execute in Node.js, which is predominantly a JavaScript engine (chaincode can be written in any Fabric-supported language). Support for other languages may come in the future, either natively or perhaps through transpiling to JavaScript or compiling to wasm as Node.js's wasm capabilities increase.
I would recommend looking at the latest Fabric tutorial, which can be found at https://hyperledger.github.io/caliper/v0.4.2/fabric-tutorial/, as there have been some big improvements to Caliper since v0.3.2.
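In the v0.4 workload API, a workload module hands the Fabric connector a plain request object (via `this.sutAdapter.sendRequests`). The sketch below shows only the request-building step as an ordinary function so the shape is clear; the contract name, function, arguments, and identity are taken from the snippet above, and the surrounding Caliper plumbing (`WorkloadModuleBase`, the adapter itself) is omitted. Treat the exact field names as something to verify against the tutorial for your Caliper version.

```javascript
// Builds the request object a Caliper v0.4 Fabric workload would pass to
// this.sutAdapter.sendRequests(). Only the object's shape matters here;
// the contract/function names follow the fabcar example above.
function buildCreateCarRequest(assetID) {
    return {
        contractId: 'fabcar',                      // chaincode name as deployed on the channel
        contractFunction: 'createCar',             // function inside the (Go) chaincode
        contractArguments: [assetID, 'blue', 'ford', 'focus', 'jim'],
        invokerIdentity: 'Admin#org1.example.com', // identity from the network config file
        readOnly: false                            // submit (invoke) rather than evaluate (query)
    };
}
```

Inside a real workload module's `submitTransaction()` this would be used roughly as `await this.sutAdapter.sendRequests(buildCreateCarRequest('car_' + txIndex));` — note that nothing here depends on the chaincode being written in Go.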

Related

What is the use of adding the "async" extra to Airflow?

I used to install, configure, and use Airflow daily. I have a new project, and it will be the first time I install Airflow 2.0 from scratch.
Reading the documentation about the differences between the two versions, I saw that "async" is among the extras we can install with Airflow (it's not new; it was already available in v1). I have never come across a tutorial or Medium article about using "async worker classes for Gunicorn", as the documentation describes it.
Can anybody explain the benefit of this extra package?
The Gunicorn HTTP server has different classes of workers: sync, async, etc. Based on the expected load, it may be beneficial to choose one worker class over another. For example, the assumption behind choosing the sync class is that the web application (whose requests will be served by the sync workers) is resource-bound in terms of CPU and network bandwidth.
The Airflow webserver uses the sync class by default. I don't have a real-world example in favor of the async class, but I could imagine that a custom plugin extending Airflow's webserver with some long-running IO-bound logic might benefit from async workers. Though this is just a guess...
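For completeness, actually using the extra is a matter of installing it and switching the webserver's worker class. A sketch of what that looks like; verify the option name and accepted values against the configuration reference for your Airflow version:

```shell
# Install Airflow with the "async" extra (pulls in the async worker
# dependencies such as eventlet/gevent/greenlet).
pip install 'apache-airflow[async]'

# Then switch the Gunicorn worker class in airflow.cfg:
#   [webserver]
#   worker_class = eventlet    # default is "sync"
# or equivalently via environment variable:
#   export AIRFLOW__WEBSERVER__WORKER_CLASS=eventlet
```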

IBM Maximo - Querying API for data with very slow response time

I have been looking everywhere for a solution to this problem.
At my work, we are trying to integrate Maximo with another system via the other system's REST API (which returns JSON responses). I am able to make this integration work on a small scale; however, the API takes upwards of 5 seconds to respond per request. Currently, I have defined this system as a JSON Resource, and I copy daily "snapshots" of the non-persistent data to a persistent attribute using an automation script. The requests all run in sequence, which works (slowly) for 5 assets in testing, and will definitely not scale to thousands of calls a day.
Assume that the API of the external system cannot be modified in any way. Is there a way to query this API in a non-blocking fashion? I'd imagine that if I could send one request, then the next, and so on, without waiting for a reply before proceeding, that would solve the problem.
I looked into Invocation and Publish Channels, and also Enterprise Services, and it seems like Enterprise Services along with JMS queues might be what I need; however, the documentation says these only support queuing incoming data, and I can't see how that solves my problem.
Any help? I am completely stuck on this.
Thank you!
I had to do something that sounds similar once. I tried JSON Resources, but they didn't work for me. I ended up using the examples in Maximo 7.6 Scripting Features to do it. The first code sample in that document is a library script for making HTTP/S calls using out-of-the-Maximo-box libraries, and other examples in the document use IBM's JSONObject and JSONArray classes (also available out of the Maximo box) to parse responses.
To get things running concurrently / multithreaded, you could configure a cron task to call your automation script, with multiple instances on various schedules calling the same script, using the instance arguments or some other mechanism to prevent collisions.
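Maximo automation scripts run on the JVM (Jython or Nashorn), so the exact API there will differ, but the non-blocking idea the question asks about can be sketched in plain JavaScript: issue all the requests first, then collect the replies, so total time is roughly one latency instead of one latency per asset. The API call here is simulated; in Maximo the body of `fakeApiCall` would be the HTTP library-script invocation.

```javascript
// Simulates one slow external API call (the real one takes ~5 s per request).
// The 10 ms delay stands in for network latency.
function fakeApiCall(assetId) {
    return new Promise(resolve =>
        setTimeout(() => resolve({ asset: assetId, status: 'OK' }), 10));
}

// Sequential: waits for each reply before sending the next request,
// so total time is roughly n * latency. This is the current behavior.
async function fetchSequential(assetIds) {
    const results = [];
    for (const id of assetIds) {
        results.push(await fakeApiCall(id));
    }
    return results;
}

// Concurrent: all requests are in flight at once, so total time is
// roughly 1 * latency regardless of how many assets there are.
async function fetchConcurrent(assetIds) {
    return Promise.all(assetIds.map(id => fakeApiCall(id)));
}
```

The cron-task-instances approach in the answer above achieves the same overlap by process-level parallelism instead of in-script concurrency.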

python application measure network traffic

Problem Set
Assume an application with two networking classes (these are the backend for every outgoing and incoming network connection of the application). I would like to monitor the network load created by the application itself, in Python 2.7.
I have already done multiple searches on SO and the net but did not get the results or ideas I was looking for.
I would like to achieve a solution without third-party application dependencies (such as Wireshark or similar products), as I am not in control of the end-user OS; the application must be 100% cross-platform.
The networking classes:
a MySQL driver based on mysql-connector-python
a "general" networking class that checks availability of hosts using the socket library shipped with Python
Question(s)
Are there Python libraries that can achieve this without any third-party product being installed?
Or is this approach so far off that another approach would be much easier / more likely to succeed?

Connect Sproutcore App to MySQL Database

I'm trying to build my first SproutCore app and I'm struggling to connect it to a MySQL database, or any data source other than fixtures. I can't seem to find ANY tutorial except this one from 2009, which is marked as deprecated: http://wiki.sproutcore.com/w/page/12413058/Todos%2007-Hooking%20Up%20to%20the%20Backend
Do people usually not connect SproutCore apps to a database? If they do, how do they find out how? Or does the above-mentioned tutorial still work? A lot of the gem commands in the introduction already seem to differ from the official SproutCore getting-started guide.
SproutCore apps, as client-side "in-browser" apps, cannot connect directly to a MySQL or any other non-browser database. The application itself runs only within the user's browser (it's just HTML, CSS & JavaScript once built and deployed) and typically accesses any external data via XHR requests to an API or APIs. Therefore, you will need to create a service wrapper around your MySQL database in order for your client-side app to be able to load and update data.
There are two things worth mentioning. The first is that since the SproutCore app contains all of your user interface and a great deal of business logic, your API can be quite simple and should return only raw data (such as JSON). The second is that the client-server design, while more tedious to implement, is absolutely necessary in practice, because you can never trust client-side code, which is in the hands of a possibly nefarious user. Therefore, your API must also act as the final gatekeeper, validating every request from the client.
This tutorial I found helped me a lot. It's very brief and demonstrates how to implement a very simple login app, how to send POST requests (triggered by the login button action) to the backend server, and how to asynchronously process the response inside the SproutCore app:
http://hawkins.io/2011/04/sproutcore_login_tutorial/
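To make the "service wrapper returning raw JSON" idea concrete: the client fetches JSON over XHR and maps it into the record objects the app works with. A minimal, framework-free sketch; the record fields and the server's payload shape are hypothetical, and in SproutCore this mapping would live in an SC.DataSource's fetch/retrieve callbacks rather than a free function:

```javascript
// Maps the raw JSON a backend service returns into plain record objects
// for the client-side app. In SproutCore this translation would be done
// inside an SC.DataSource; the field names here are illustrative only.
function todosFromJson(jsonText) {
    const rows = JSON.parse(jsonText);
    return rows.map(row => ({
        guid: row.id,            // server primary key -> client record id
        title: row.title,
        isDone: Boolean(row.done)
    }));
}
```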

Unit Testing REST API

I have some experience with unit testing and mocks. In my limited experience I would use the two to test a service layer, for example, mocking (stubbing?) out the database to eliminate dependencies and concentrate on unit testing the business logic.
Now I'm building a wrapper API implementation that will consume RESTful web services. The JSON result structure sent back to me is out of my hands; take Twitter, for example. I'm simply building the client to interface with their API. I'm unsure how to go about unit testing the JSON result. Right now I'm just mocking the result of the HTTP request with a static JSON structure. This ensures that the deserializing of JSON into my POJOs is correct, but I'm concerned about API changes. What if the API structure changes? What if the API returns "title" today and "groovy_title" tomorrow? My unit test wouldn't catch that.
From my understanding, though, unit tests are supposed to be quick. Previously I would mock the DB, and now I'm mocking HTTP; but should I actually be using the concrete HTTP implementation so I'm notified immediately of a breaking API change? Or is there a better way to approach this situation?
I would continue to do what you are doing and mock the interface between your code and the external API. As you point out, this will not detect changes in the external API.
You could write integration tests that actually go to the external server to test for API changes. I suspect you have separated the code that does the interaction into its own service/module, so you can literally ping the external API without being obstructed by more than one abstraction layer in your app.
Note, you could build these tests without using your app code; i.e. just wget or curl and do some analysis on the results...
The issues with this are numerous; off the top of my head:
You need a network connection.
It is slower.
The external service could be down temporarily, i.e. failure could mean different things.
etc.
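A lightweight middle ground between the mocked unit test and a full integration suite is a contract check: periodically fetch (or record) a real response and assert that the fields your deserializer depends on are still present. The payload and field names below are hypothetical stand-ins for whatever your POJOs actually require:

```javascript
// Fields the client's deserializer depends on. If the provider renames
// "title" to "groovy_title", this check fails even though the mocked
// unit tests would keep passing.
const REQUIRED_FIELDS = ['id', 'title', 'created_at'];

// Returns which of the required fields are missing from a response payload.
function checkContract(payload, requiredFields) {
    const missing = requiredFields.filter(f => !(f in payload));
    return { ok: missing.length === 0, missing };
}

// In a real integration test `sample` would come from an HTTP call to the
// provider (or a recently recorded response); a canned object stands in here.
const sample = { id: 42, title: 'hello', created_at: '2012-01-01T00:00:00Z' };
```

Run a check like this on a slower schedule than the unit tests (nightly, say), so the fast mocked tests stay fast while API drift is still caught within a day.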