How to disable scientific notation in Apache Drill API response

When a decimal/double value becomes too long, Drill automatically returns it in scientific notation (e.g. 1.2345E6), which considerably complicates parsing the result at the other end.
Only the HTTP API returns this form; results from JDBC are fine.
Is there any way to disable this behavior?

AFAIK, there is no way to disable scientific notation, but you can use a UDF that converts decimal/double results to varchar in the desired format; for example, the to_char UDF does just that.
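A minimal sketch of that approach, assuming Drill's to_char function; the column, table, and format mask below are illustrative:

-- Format the double as varchar so the HTTP API returns plain digits
-- instead of scientific notation; adjust the mask to your precision.
SELECT TO_CHAR(my_double_col, '#############.#####') AS my_double_col
FROM my_table;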

Related

Is there any way to mandate that a property be an ISO time interval in JSON Schema?

JSON Schema seems to support ISO times, dates, date-times, and even durations (see the documentation), but I can't find any way to support ISO time ranges.
I could use a regex (which JSON Schema does support), but then I wouldn't be able to check whether the start and end points of the interval were actually valid dates/times (e.g. 2022-13-04: there is no 13th month). How should I proceed? Do I just have to accept any string and do the validation in the JSON-consuming application?
String formats aren't validated by default; instead they're merely annotations: information for the application to act upon.
However, many implementations do have validation that can be enabled. Many also support custom formats, though you'd likely need to provide the logic yourself.
An alternative approach is to split the time range into its start and end components and validate those independently. Then all your app has to do is verify that start < end.
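A minimal sketch of that split approach (the property names are illustrative):

{
  "type": "object",
  "properties": {
    "start": { "type": "string", "format": "date-time" },
    "end":   { "type": "string", "format": "date-time" }
  },
  "required": ["start", "end"]
}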

Binary in GraphQL

According to the docs about scalars in GraphQL, there is no support for binary data so far.
According to the same page, it is possible to define your own types.
How could you implement a binary scalar in GraphQL?
I came here looking for an answer and, after some reflection, reached the following conclusion. It is not a direct answer to the question, but I think it is related and important to consider: you shouldn't implement a binary scalar in GraphQL.
For small images the base64-encoding solution will work fine, but for bigger files it goes against GraphQL's design philosophy of efficient data transfer, so a better alternative would be to keep those files somewhere else, like a CDN, and just use the reference in GraphQL.
Size matters: for small queries it may make no difference, but for big, complex queries it could be a serious performance problem.
The documentation seems to hint that custom types would still somehow boil down to default types:
In most GraphQL service implementations, there is also a way to specify custom scalar types. For example, we could define a Date type.
Then it's up to our implementation to define how that type should be serialized, deserialized, and validated. For example, you could specify that the Date type should always be serialized into an integer timestamp, and your client should know to expect that format for any date fields.
The first thing that comes to mind in this case is a base64-encoded string. Depending on your language of choice, SO likely has sample serialisation/deserialisation routines.
You can, but you have to build the new type on top of a default data type. For audio, video, or images you can easily convert the data to base64 and pass it as a string, but keep the size of the data in mind, since it is transferred inline as part of the response.
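As a sketch of the custom-scalar route described above, here is roughly what a base64-backed binary scalar could look like with graphql-js; the scalar name and the Buffer-based internal representation are my own choices, not from the docs:

import { GraphQLScalarType, Kind } from 'graphql';

// Binary data carried as a base64 string on the wire
// and as a Node.js Buffer inside resolvers.
const Base64Bytes = new GraphQLScalarType({
  name: 'Base64Bytes',
  description: 'Binary data, transported as a base64-encoded string',
  // Server -> client: encode the Buffer as base64.
  serialize(value) {
    return (value as Buffer).toString('base64');
  },
  // Client -> server, via variables: decode back into a Buffer.
  parseValue(value) {
    return Buffer.from(value as string, 'base64');
  },
  // Client -> server, via an inline literal in the query document.
  parseLiteral(ast) {
    if (ast.kind !== Kind.STRING) {
      throw new TypeError('Base64Bytes must be a base64 string');
    }
    return Buffer.from(ast.value, 'base64');
  },
});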

Decoding parameters in Google webapps

I'm trying to better understand Google web applications. The HTML source contains JSON that has been encoded in some unknown way, which I would like to decode. For example, the source below contains parameters such as DpimGf and EP1ykd, which make no sense:
view-source:https://contacts.google.com/
..window.WIZ_global_data = {"DpimGf":false,"EP1ykd":.....
So far I have tried the following:
1. Decoding with a base64 decoder, but the results are unprintable/not usable.
2. Decoding with the polyline encoding used in Google Maps.
3. Building an app from scratch to perform base64 -> binary -> XOR -> ASCII char, shifting the binary values up to 7 places [inspired by the polyline algorithm].
Is there any documentation from Google, or a known encoding for such formats?
Assumptions: I'm pretty sure this is encoding of some sort and not encryption, because:
1) The length of the encoded text doesn't match common encryption algorithms.
2) There is some sort of pattern in the parameter values, so there is a good chance it's just encoded without any encryption; common encryption produces a completely different string each time.
3) There is a good chance the values may not decode at all, because they might map server-side to meaningful parameters.
Thanks
Try using this hex decoder: http://ddecode.com/hexdecoder/?results=cddb95fa500e7c1af427d005985260a7. Try running the whole thing through it; it might help.

Converting JSON string to an array in ColdFusion MX7

I have a cookie value like:
"[{"index":"1","name":"TimePeriod","hidden":false},{"index":"2","name":"Enquiries","hidden":false},{"index":"3","name":"Online","hidden":false}]"
I would like to use this cookie value as an array in ColdFusion. What would be the best possible way to do this?
The normal answer would be to use the built-in deserializeJson function, but since that function wasn't available in CFMX7 (it arrived in CF8), you will need a UDF to achieve the same thing.
There are two sites that contain resources of this type, cflib.org and riaforge.org, each of which has a different potential solution for MX7.
Searching CFlib provides JsonDecode. (CFLib has a specific filter for "Maximum Required CF Version", so you can ensure any results that appear will work for your version.)
Searching riaforge provides JSONUtil, which runs on MX7 (but also claims better type mapping than the newer built-in functions).
Since MX7 runs on Java, you can likely also make use of any of the numerous Java libraries listed on json.org, using createObject/java.
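For instance, a rough sketch of the Java route using the org.json library; this assumes the org.json jar has been added to ColdFusion's classpath, and the cookie name columnState is only an illustration:

<!--- Parse the cookie's JSON text via org.json (jar must be on the classpath) --->
<cfset jsonArray = createObject("java", "org.json.JSONArray").init(cookie.columnState)>

<!--- Copy the Java array into a native ColdFusion array of structs --->
<cfset result = arrayNew(1)>
<cfloop from="1" to="#jsonArray.length()#" index="i">
    <cfset item = jsonArray.getJSONObject(i - 1)>
    <cfset entry = structNew()>
    <cfset entry.index = item.getString("index")>
    <cfset entry.name = item.getString("name")>
    <cfset entry.hidden = item.getBoolean("hidden")>
    <cfset arrayAppend(result, entry)>
</cfloop>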
JSON serialization was added natively in CF8.
If you are on MX7, look on riaforge.org for a library that will deserialize JSON for you.

How to automatically generate ASN.1-encoded packets?

I want to test my application, and I need to generate different loads. The application is a SUPL RRLP protocol parser, and I have the ASN.1 specification for this protocol. Packets have a lot of optional fields, and the number of variants may be over a billion; I can't go through all the options manually, so I want to automate it.
One way is to generate the packets automatically; the other is to create many different sets of value assignments and encode each into the binary format.
I found some tools, for example libtasn and Asn1Editor, but the first one can't parse my existing ASN.1 spec file, and the second one can't encode packets from a specification.
I'm reluctant to write the thousandth ASN.1 parser, because I could introduce errors into the test process.
I hoped it would be easy to find something existing, but... I'm giving up.
Maybe someone on Stack Overflow has faced the same problem and found a solution, or knows something to recommend? Thank you.
Please try going to https://asn1.io/asn1playground/ and try your specification there. You can ask it to generate a sample value for a given ASN.1 type. You can encode it and edit either the encoded (hex) data or the decoded values to create additional values.
You can also download a free trial of the OSS ASN.1 Tools from http://www.oss.com/asn1/products/asn1-download.html which includes OSS ASN.1 Studio. This also allows you to generate (and modify) sample values for a given ASN.1 type.
Note that these don't generate thousands of different test values for you automatically, but will parse valid value notation and encode the values for you if you are able to generate valid ASN.1 value notation.
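For reference, ASN.1 value notation looks like the sketch below; the type and values are made up for illustration and are not taken from the SUPL RRLP spec:

Rectangle ::= SEQUENCE {
    length  INTEGER,
    width   INTEGER OPTIONAL
}

rect1 Rectangle ::= { length 10, width 5 }
rect2 Rectangle ::= { length 7 }  -- the OPTIONAL field omitted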