Does the new version (0.24) of Orion allow fuzzy search (approximate string matching) over entity properties?
In addition, I tried to create an entity with an empty string, but although the server returns a 201 code, the entity is not created.
//url to create entity (POST)
http://some.ip:port/v2/entities
//payload:
{
  "type": "Test",
  "id": "Test.1",
  "nombre": ""
}
//response
code 201
//url to list entities (GET)
http://some.ip:port/v2/entities?type=Test
//response
[]
This case doesn't work in Orion 0.24.0 due to a bug that has recently been solved in the develop branch. The fix will be available in the version after 0.24.0, either 0.24.1 or 0.25.0 (the number was not yet decided at the time of writing), by the end of September 2015.
Regarding fuzzy search, we haven't yet considered that functionality for NGSIv2. If you find it useful or needed, I'd recommend creating a new issue in the Orion repository, explaining the feature request in as much detail as you can.
I have been reading old blocks from the Solana JSON RPC API (using Python), but now I am trying to subscribe to block production on the Solana network (to get live updates).
I tried to pull updates through the RPC API using
{"jsonrpc": "2.0", "id": "1", "method": "blockSubscribe", "params": ["all"]}
This doesn't work; the response is: 'code': -32601, 'message': 'Method not found'
Looking at the info on docs.solana.com, it states:
This subscription is unstable and only available if the validator was started with the --rpc-pubsub-enable-block-subscription flag. The format of this subscription may change in the future
I assume this means I need to run solana-test-validator --rpc-pubsub-enable-block-subscription, but this just returns:
error: Found argument '--rpc-pubsub-enable-block-subscription' which wasn't expected, or isn't valid in this context
Did you mean --rpc-port?
USAGE:
solana-test-validator --ledger <DIR> --rpc-port <PORT>
I can't seem to find any more information on how to subscribe to blocks using the RPC.
Any ideas or help with what I'm doing wrong?
Thanks in advance
You are correct that the validator has to be run with --rpc-pubsub-enable-block-subscription. For mainnet-beta usage, it is recommended to either find a private RPC node with this flag enabled or run your own. Please note, though, that the method is currently marked as unstable.
It looks like --rpc-pubsub-enable-block-subscription is not available on the test validator. You can find the full command list here.
This subscription is unstable and only available if the validator was started with the --rpc-pubsub-enable-block-subscription flag. The format of this subscription may change in the future
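One more thing worth knowing: the *Subscribe methods go over the websocket (PubSub) endpoint, not the HTTP RPC port, so posting the request as plain JSON RPC will also return "Method not found". A minimal sketch in Python with the websockets package, assuming you have access to a node started with the flag (the URL is a placeholder; the PubSub port is typically the RPC port + 1, e.g. 8900):

import asyncio
import json

import websockets


async def watch_blocks():
    # Placeholder endpoint: a node started with --rpc-pubsub-enable-block-subscription
    async with websockets.connect("ws://127.0.0.1:8900") as ws:
        await ws.send(json.dumps({
            "jsonrpc": "2.0",
            "id": 1,
            "method": "blockSubscribe",
            "params": ["all"],
        }))
        print(await ws.recv())  # first reply confirms the subscription id
        while True:
            # subsequent messages are block notifications
            notification = json.loads(await ws.recv())
            print(notification["params"]["result"])

asyncio.run(watch_blocks())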
The title sounds quite comprehensive, but my underlying question is fairly simple, I guess.
Context
In Azure, I have an IoT hub, which I am sending messages to. I use a modified version of one of the samples from the Azure IoT SDK for Python.
Sending works fine. However, instead of a string, I send a JSON structure.
When I watch the events flowing into the IoT hub, using the Cloud shell, it looks like this:
PS /home/marcel> az iot hub monitor-events --hub-name weathertestiothub
This extension 'azure-cli-iot-ext' is deprecated and scheduled for removal. Please remove and add 'azure-iot' instead.
Starting event monitor, use ctrl-c to stop...
{
  "event": {
    "origin": "raspberrypi-zero-wh",
    "payload": "{ \"timestamp\": \"1608643863720\", \"locationDescription\": \"Attic\", \"temperature\": \"21.941\", \"relhumidity\": \"71.602\" }"
  }
}
Issue
The data seems fine, except the payload looks strange here. BUT, the payload is literally what I send from the device, using the SDK sample.
Is this the correct way to do it? In the end, I have a very hard time actually getting the data into the Time Series Insights model, so I guess my structure is to blame.
Question
What is a recommended JSON data structure to send to the IoT hub for later use?
You should add the following 2 lines to your message in your python SDK sample:
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
This should resolve your formatting concern.
We've also updated our samples to reflect this: https://github.com/Azure/azure-iot-sdk-python/blob/master/azure-iot-device/samples/sync-samples/send_message.py
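For reference, a minimal sketch of where those two lines go, loosely based on that sample (the connection string and payload values are placeholders):

import json

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string; use your device's real one
client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")

payload = {
    "timestamp": "1608643863720",
    "locationDescription": "Attic",
    "temperature": 21.941,
    "relhumidity": 71.602
}
msg = Message(json.dumps(payload))
msg.content_encoding = "utf-8"          # declare the body encoding
msg.content_type = "application/json"   # mark the body as JSON, not an opaque string
client.send_message(msg)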
I ended up using the tip by @elhorton, but it was not the key change. Nonetheless, the formatting in the Azure Cloud Shell monitor now looks much better:
"event": {
"origin": "raspberrypi-zero-wh",
"payload": {
"temperature": 21.543947753906245,
"humidity": 69.22964477539062,
"locationDescription": "Attic"
}
}
The key was:
include the message source time in ISO format
from datetime import datetime, timezone
# iothub-creation-time-utc expects UTC, so use an aware UTC timestamp
timestampIso = datetime.now(timezone.utc).isoformat()
message.custom_properties["iothub-creation-time-utc"] = timestampIso
Using the locationDescription as the Time Series ID Property; see https://learn.microsoft.com/en-us/azure/time-series-insights/how-to-select-tsid (maybe I could also have taken the iothub-connection-device-id, but I did not test that alone specifically)
I guess using "iothub-connection-device-id" would make "raspberrypi-zero-wh" the name of the time series instance. I agree with your choice of using "locationDescription" as the TSID; so Attic becomes the time series instance name, and temperature and humidity will be your variables.
I need to create my own REST API.
I just saw StrongLoop and LoopBack and I thought it would be perfect for my project.
In fact, I was able to get MySQL connected using StrongLoop. However, I had to create something called a "model", and I did. But it was like creating a new model from scratch and using it for persisting to the data source.
Instead, what I was looking for is to get a REST API directly from my model in the DB.
I mean taking the models from each table in the DB and then setting them up as web services.
Is that possible?
I am a newbie with these technologies, although I think it is an interesting question.
Thanks
I'm not sure of a Node tool to do what you're asking, but in other languages / databases you have some choices!
The only one I'm really familiar with is postgrest.
postgrest: You import your data into a Postgres database (similar to MySQL), and it generates a REST API on top of your tables instantly. Bam. Done. I've used this and it was amazingly awesome. You can also deploy it directly on Heroku.
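To give an idea of the result, each table becomes an endpoint you can query immediately. A hypothetical people table on PostgREST's default port 3000 (the table name and filter here are made up for illustration; filters use PostgREST's column=operator.value form):

import requests

# PostgREST exposes each table as a resource; filters go in the query string
resp = requests.get("http://localhost:3000/people", params={"age": "gte.18"})
print(resp.json())  # matching rows as a list of JSON objects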
StrongLoop actually has a "discovery" tool for just this purpose! Read up on that page, but here's the basic code to do it. Just drop this code in a file inside /server/boot/ (the docs are wrong on that, it must be in the directory I mention). Of course, you'll need to tailor it for your use case:
var loopback = require('loopback');
var ds = loopback.createDataSource('mysql', {
  "host": "yourhost",
  "port": 1234,
  "database": "foobar",
  "username": "someuser",
  "password": "somepass"
});

// Discover and build models from a given table
ds.discoverAndBuildModels('PERSON', {visited: {}, associations: true},
  function (err, models) {
    // Now we have a list of models keyed by the model name
    // You only need the rest of this if you wanted to inspect what came in...
    // For example, you could find the first record from the table
    // and verify info or something.
    models.Person.findOne({}, function (err, person) {
      if (err) {
        // handle this if need be...
        console.error(err);
        return;
      }
      // Some code using `person`
    });
  });
Good luck!
I was supposing the "discovery tool" was for finding patterns, clusters or whatever else in the data, but following the recommendation of @jakarella I looked into it in more depth.
It was even easier than that, because you can do everything via StrongLoop Arc (GUI). I always prefer the CLI, but to get a general overview it is sometimes better to start with the GUI if you don't have much of an idea about the subject.
Anyway, first of all you need to connect your data source (installing the driver beforehand). After that, through StrongLoop Arc you can "discover the models", choosing your tables (taking care to run Arc and restart it every time you get a new model), and that's it: you get an API over your data source (for testing, go to the explorer).
I described the main tasks generally above, but if anyone needs more detail please let me know. Hope this helps anyone else who is looking to do something similar. Thanks guys for the interest.
I'm trying to automatically archive versions in Jira.
So far, I've been able to create and update versions in Jira with the REST API, but no luck with archiving a version.
I've tried just setting the archived field from false to true using something like this:
{ "update" : { "archived" : [{ "set" : true }] } }
I've also tried setting the released field at the same time.
I also tried sending all of the version's fields with archived updated.
All without success, so I'm guessing there is something else that needs to be done in that case.
So to sum up, I am mainly trying to find the exact JSON content to archive a version, since I already know how to use the JIRA REST API.
Thanks.
As described in the API docs, versions don't use an "update" field for updating. Simply PUT the data {"archived": true} to http://JIRA-SERVER/rest/api/2/version/VERSION-ID.
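For example, with Python's requests (JIRA-SERVER and VERSION-ID are placeholders as above; adapt the auth to your instance):

import requests

resp = requests.put(
    "http://JIRA-SERVER/rest/api/2/version/VERSION-ID",
    json={"archived": True},
    auth=("username", "password"),  # basic auth as an example
)
resp.raise_for_status()  # success returns the updated version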
With all the recent buzz around the FIDO U2F specification, I would like to implement FIDO U2F test-wise on a testbed to be ready for the forthcoming roll out of the final specification.
So far, I have a FIDO U2F security key produced by Yubico and the FIDO U2F (Universal 2nd Factor) extension installed in Chrome. I have also managed to set up the security key to work with my Google log-in.
Now, I'm not sure how to make use of this stuff for my own site. I have looked through Google's GitHub page for the U2F project and I have checked their web app front end. It looks really simple (JavaScript only).
So is implementing second factor auth with FIDO as simple as implementing a few JavaScript calls? All that seems to be happening for the registration in the example is this:
var registerRequest = {
  appId: enrollData.appId,
  challenge: enrollData.challenge,
  version: enrollData.version
};

u2f.register([registerRequest], [], function (result) {
  if (result.errorCode) {
    document.getElementById('status')
      .innerHTML = "Failed. Error code: " + result.errorCode;
    return;
  }
  document.location = "/enrollFinish"
    + "?browserData=" + result.clientData
    + "&enrollData=" + result.registrationData
    + "&challenge=" + enrollData.challenge
    + "&sessionId=" + enrollData.sessionId;
});
But how can I use that for an implementation myself? Will I be able to use the callback from this method call for the user registration?
What you are trying to do is implement a so-called "relying party", meaning that your web service will rely on the identity assertion provided by the FIDO U2F token.
You will need to understand the U2F specifications to do that. Especially how the challenge-response paradigm is to be implemented and how app ids and facets work. This is described in the spec in detail.
You are right: the actual code necessary to work with FIDO U2F from the front end of your application is almost trivial (that is, if you use the "high-level" JavaScript API as opposed to the "low-level" MessagePort API). Your application will, however, need to work with the messages generated by the token and validate them. This is not trivial.
To illustrate how you could pursue implementing a relying party site, I will give a few code examples, taken from a Virtual FIDO U2F Token Extension that I programmed recently for academic purposes. You can see the page for the full example code.
Before your users can use their FIDO U2F tokens to authenticate, they need to register them with you.
In order to allow them to do so, you need to call window.u2f.register in their browser. To do that, you need to provide a few parameters (again: read the spec for details).
Among them are a challenge and the ID of your app. For a web app, this ID must be the web origin of the web page triggering the FIDO operation. Let's assume it is example.org:
window.u2f.register([
  {
    version : "U2F_V2",
    challenge : "YXJlIHlvdSBib3JlZD8gOy0p",
    appId : "http://example.org",
    sessionId : "26"
  }
], [], function (data) {
});
Once the user performs a "user presence test" (e.g. by touching the token), you will receive a response, which is a JSON object (see spec for more details)
dictionary RegisterResponse {
  DOMString registrationData;
  DOMString clientData;
};
This data contains several elements that your application needs to work with.
The public key of the generated key pair -- You need to store this for future authentication use.
The key handle of the generated key pair -- You also need to store this for future use.
The certificate -- You need to check whether you trust this certificate and the CA.
The signature -- You need to check whether the signature is valid (i.e. confirms to the key stored with the certificate) and whether the data signed is the data expected.
I have recently prepared a rough implementation draft for the relying party server in Java that shows how to extract and validate this information.
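To make that concrete, here is a minimal Python sketch that splits the raw registrationData into those four pieces, following the field layout in the U2F raw message format spec (the helper names are mine; validating the certificate chain and signature still requires a crypto library):

import base64


def b64url_decode(s):
    # registrationData is websafe-base64 without padding
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))


def parse_registration_data(registration_data):
    raw = b64url_decode(registration_data)
    assert raw[0] == 0x05           # reserved byte, fixed by the spec
    public_key = raw[1:66]          # 65-byte uncompressed P-256 point
    kh_len = raw[66]
    key_handle = raw[67:67 + kh_len]
    rest = raw[67 + kh_len:]        # DER attestation cert, then the signature
    # The cert is a DER SEQUENCE; read its length to split cert from signature
    assert rest[0] == 0x30
    if rest[1] < 0x80:
        cert_len = 2 + rest[1]
    else:
        n = rest[1] & 0x7F
        cert_len = 2 + n + int.from_bytes(rest[2:2 + n], "big")
    cert, signature = rest[:cert_len], rest[cert_len:]
    return public_key, key_handle, cert, signature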
Once the registration is complete and you have somehow stored the details of the generated key, you can sign requests.
As you said, this can be initiated short and sweet through the high-level JavaScript API:
window.u2f.sign([{
  version : "U2F_V2",
  challenge : "c3RpbGwgYm9yZWQ/IQ",
  appId : "http://example.org",
  sessionId : "42",
  keyHandle : "ZHVtbXlfa2V5X2hhbmRsZQ"
}], function (data) {
});
Here, you need to provide the key handle you obtained during registration.
Once again, after the user performs a "user presence test" (e.g. by touching the token), you will receive a response, which is a JSON object (again, see spec for more details)
dictionary SignResponse {
  DOMString keyHandle;
  DOMString signatureData;
  DOMString clientData;
};
You then need to validate the signature data contained herein.
You need to make sure that the signature matches the public key you have obtained before.
You also need to validate that the string signed is appropriate.
Once you have performed these validations, you can consider the user authenticated. A brief example implementation of the server side code for that is also contained in my server example.
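For illustration, a minimal sketch of those checks in Python, assuming the cryptography package and the 65-byte public key you stored at registration (the helper name is mine; per the spec, the signed data is sha256(appId) | user-presence byte | counter | sha256(clientData)):

import base64
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec


def b64url_decode(s):
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))


def verify_sign_response(app_id, client_data_b64, signature_data_b64, public_key_bytes):
    raw = b64url_decode(signature_data_b64)
    # signatureData: 1 presence byte, 4-byte counter, DER-encoded ECDSA signature
    user_presence, counter, signature = raw[0:1], raw[1:5], raw[5:]
    signed = (hashlib.sha256(app_id.encode()).digest()
              + user_presence + counter
              + hashlib.sha256(b64url_decode(client_data_b64)).digest())
    key = ec.EllipticCurvePublicKey.from_encoded_point(ec.SECP256R1(), public_key_bytes)
    try:
        key.verify(signature, signed, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

Per the spec, you should also check that the counter is strictly greater than the last value you saw for this key handle; a counter that goes backwards may indicate a cloned token.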
I have recently written instructions for this, as well as a list of all U2F server libraries (most of which bundle a fully working demo server), at developers.yubico.com/U2F. The goal is to enable developers to implement/integrate U2F without having to read the specifications.
Disclaimer: I work as a developer at Yubico.