Decoding parameters in Google webapps - html

I'm trying to better understand Google's web applications. The HTML source contains JSON that has been encoded in some unknown way, which I would like to decode. For example, the source below contains parameters such as DpimGf and EP1ykd, which make no sense to me:
view-source:https://contacts.google.com/
..window.WIZ_global_data = {"DpimGf":false,"EP1ykd":.....
So far I have tried the following:
1. Decoding with a Base64 decoder, but the results are unprintable/not usable.
2. Decoding with the polyline encoding used in Google Maps.
3. Building an app from scratch to perform base64 -> binary -> XOR -> ASCII char, and to shift the binary values by up to 7 places (inspired by the polyline algorithm).
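For the first attempt, a quick sanity check is whether a token even decodes as valid Base64 and yields printable bytes. A minimal sketch (the helper name is my own, and a "successful" decode of a short identifier is usually just coincidence):

```python
import base64
import string

def looks_like_base64(token: str) -> bool:
    """Check whether a token decodes as standard Base64 and
    yields printable characters."""
    try:
        # validate=True rejects characters outside the Base64 alphabet
        decoded = base64.b64decode(token + "=" * (-len(token) % 4), validate=True)
    except Exception:
        return False
    # A decodable token is not proof of a Base64 payload: short
    # identifiers often decode "successfully" to garbage bytes.
    return all(chr(b) in string.printable for b in decoded)

# Parameter names scraped from WIZ_global_data
for name in ["DpimGf", "EP1ykd"]:
    print(name, looks_like_base64(name))
```

Both parameter names decode to non-printable bytes, which is consistent with them being opaque server-side identifiers rather than Base64 data.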
Is there any documentation from Google, or a known encoding, for such formats?
Assumptions: I'm pretty sure this is an encoding of some sort and not encryption, because:
1) The lengths of the values don't match the output of common encryption algorithms.
2) There is some sort of pattern in the values of the parameters, so there's a good chance it's just encoded without any encryption, since common encryption produces a completely different string each time.
3) There is a good chance they may not decode at all, because there might be a server-side mapping to a meaningful parameter.
Thanks

Try this hex decoder: http://ddecode.com/hexdecoder/?results=cddb95fa500e7c1af427d005985260a7. Try running the whole thing through it; it might help.

Related

Binary in GraphQL

According to the docs about scalars in GraphQL, there is no support for binary data so far.
According to the same page, it is possible to define your own types.
How could you implement a binary scalar in GraphQL?
I came here looking for an answer and, after some reflection, reached the following conclusion. It is not a direct answer to the question, but I think it is related and important to consider: you shouldn't implement a binary scalar in GraphQL.
For small images, the encode-as-base64 solution works great, but for bigger files it goes against GraphQL's design philosophy of efficient data transfer. A better alternative is to store those files somewhere else, like a CDN, and just use the reference in GraphQL.
Size matters: for small queries it may make no difference, but for big, complex queries it can be a serious performance problem.
The documentation seems to hint that custom types would still somehow boil down to default types:
In most GraphQL service implementations, there is also a way to specify custom scalar types. For example, we could define a Date type.
Then it's up to our implementation to define how that type should be serialized, deserialized, and validated. For example, you could specify that the Date type should always be serialized into an integer timestamp, and your client should know to expect that format for any date fields
The first thing that comes to mind in this case is a base64-encoded string. Depending on your language of choice, SO likely has sample serialization/deserialization routines.
You can, but you have to build on a default data type to create a new one. For audio, video, or images you can easily convert the data to base64 and pass it as a string, but keep in mind that the length of the data grows, since it's no longer stored as a raw buffer.
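The serialize/parse pair such a custom scalar needs boils down to a Base64 round trip. A minimal sketch; the function names are illustrative and not tied to any specific GraphQL library:

```python
import base64

# What a custom "Binary" scalar's serialize/parseValue pair would do;
# names here are illustrative, not part of any GraphQL framework.

def serialize_binary(value: bytes) -> str:
    """Server -> client: emit raw bytes as a Base64 string."""
    return base64.b64encode(value).decode("ascii")

def parse_binary(value: str) -> bytes:
    """Client -> server: turn the Base64 string back into bytes."""
    return base64.b64decode(value)

payload = bytes(range(8))             # arbitrary binary data
wire = serialize_binary(payload)      # safe to embed in a JSON response
assert parse_binary(wire) == payload  # lossless round trip
```

A real scalar implementation would wrap these two functions in whatever custom-scalar hook your GraphQL library provides.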

What are the pros and cons of Base64 file upload through JSON, as opposed to AJAX or jQuery upload?

I was tasked to write image upload to remote server and save those images locally. It was quite easy to do it with Base64 transfer through JSON and storing with Node.js. However, is there a reason to not use this type of file upload, to use AJAX or other ways? (Other than the 30% bandwidth increase, which I know of. You can still include that in your answer in order for it to be full).
The idea of base64 encoding is to make binary data safe for protocols based on text. Outside that situation, I think it's always a bad idea.
Pros
Avoidance of binary data for protocols based on text, and independence from external files.
Avoidance of delimiter collision.
Cons
Time and space complexity increase; for space the overhead is 33–36% (33% from the encoding itself, up to 3% more from inserted line breaks).
API response payloads become larger, sometimes too large.
User experience is negatively impacted, unless one invokes some lazy loading.
By including all image data together in one API response, the app must receive all data before drawing anything on screen. This means users will see on-screen loading states for longer and the app will appear sluggish as users wait.
This is, however, mitigated with Axios and a lazy loader such as react-lazyload or lazyload.
CDN caching is harder. Contrary to image files, the Base64 strings inside an API response cannot be delivered via a CDN cache. The whole API response must be delivered by CDN. (cf., Don’t use Base64 encoded images on mobile and Why "optimizing" your images with Base64 is almost always a bad idea)
Image caching on the device is no longer possible.
Content management becomes harder on the server side. Most content-management tools handle images as binary files, and when managing them as Base64 there is the time overhead of encoding/decoding.
No security gain, and engineering overhead to mitigate risks (sanitizing, input validation, escaping). Example of an XSS attack: Preventing XSS with Base64 encoding: The False sense of web application security
The developers of that site might have opted to make the website appear more secure by having cryptic URLs and whatnot. However, that doesn't mean this is security by obscurity.
If their website is vulnerable to SQL injection and they try to hide that by encoding the URLs, then it's security by obscurity. If their website is well secured against SQL injection, XSS, CSRF, etc., and they decided to encode the URLs like that, then it's just plain stupidity.
It does not help with text-encoded images such as SVG (Probably Don't Base64 SVG).
Data URIs aren't supported on IE6 or IE7, nor on Opera before 7.2 (Which browsers support data URIs and since which version?)
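The ~33% size overhead from the first con is easy to verify; a small sketch with random bytes standing in for an image payload:

```python
import base64
import os

# Measure the Base64 size overhead on arbitrary binary data
# (os.urandom stands in for a real image file).
raw = os.urandom(30_000)
encoded = base64.b64encode(raw)

overhead = (len(encoded) - len(raw)) / len(raw)
print(f"raw: {len(raw)} bytes, base64: {len(encoded)} bytes, "
      f"overhead: {overhead:.1%}")
```

Every 3 input bytes become 4 output characters, so the overhead is exactly one third before any line breaks are inserted.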
References
https://en.wikipedia.org/wiki/Base64
https://en.wikipedia.org/wiki/Delimiter#Delimiter_collision
SO: What is base 64 encoding used for?
https://medium.com/snapp-mobile/dont-use-base64-encoded-images-on-mobile-13ddeac89d7c
https://css-tricks.com/probably-dont-base64-svg/
https://security.stackexchange.com/questions/46362/purpose-of-using-base64-encoded-urls
https://bunnycdn.com/blog/why-optimizing-your-images-with-base64-is-almost-always-a-bad-idea/
https://www.davidbcalhoun.com/2011/when-to-base64-encode-images-and-when-not-to/
Data Encoding
Every encoding and decoding scheme exists for a reason, and each comes with benefits and downsides. For example:
Error-detection encodings: can detect errors, but increase data usage.
Encryption: turns data into ciphertext that an intruder can't decipher.
There are many encoding algorithms that alter data in some useful way. Base64 encoding maps every 6 bits of input to one character (8 bits): 3 bytes become 4 bytes, using only alphanumeric characters (62 distinct symbols) plus 2 signs. Its benefit is that the output contains no special characters or control codes.
Base64 Purpose
It makes it possible to transfer any data over channels that prohibit:
special chars like ' " / \ ...
non-printable ASCII like \0 \n \r \t \a
8-bit ASCII codes (ASCII with the MSB set)
Binary files usually contain arbitrary bytes which, interpreted as ASCII, can be any 8-bit character. In some protocols and applications there are I/O interfaces that accept only a handful of characters (alphanumeric plus a few signs), due to:
preventing code injection (e.g. SQL injection, or any programming-language-syntax-like characters such as ;)
some characters already having a meaning in the protocol (e.g. in a URI query string the character & has a meaning and cannot appear in a query-string value)
the input not being intended to accept non-alphanumeric values (e.g. it should only accept human names)
But with Base64 encoding you can encode anything and transfer it over any channel you want.
Examples:
you can encode an image or application binary and save it in a DBMS via SQL
you can include binary data in a URI
you can send binary files over a protocol designed to accept only human chat as alphanumeric text, like an IRC channel
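The channel-safety property above can be sketched in a few lines: arbitrary bytes include control codes and delimiters, while Base64 output stays inside a small fixed alphabet:

```python
import base64

# Why Base64 survives text-only channels: the output alphabet is
# restricted to A-Z a-z 0-9 + / (plus = padding), no matter what
# bytes go in.
SAFE = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ"
           "abcdefghijklmnopqrstuvwxyz"
           "0123456789+/=")

raw = bytes([0, 10, 13, 34, 38, 92, 200, 255])   # \0 \n \r " & \ and high bytes
encoded = base64.b64encode(raw).decode("ascii")

print(encoded)
print(all(c in SAFE for c in encoded))           # every char is channel-safe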
Base64 is just a conversion format: an HTTP server cannot accept binary data in the body unless the Content-Type header declares a binary or otherwise acceptable format defined by the web server.
As you might know, JSON can contain various formats and information; thus, you can send something like:
{
  "IMG_FILENAME": "HELLO",
  "IMG_TYPE": "IMG/JPEG",
  "DATA": "~~~BASE64 ENCODED IMAGE~~~~"
}
You can send this JSON through AJAX or another method. But, as I said, HTTP servers have various limitations because they must follow RFC 2616 (https://www.rfc-editor.org/rfc/rfc2616).
In short, sending through JSON lets you bundle various data together, while AJAX is just one way of sending it, like the others.
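Building and unpacking such a payload is a few lines in any language; a sketch mirroring the field names from the example above (they are illustrative, not a real API):

```python
import base64
import json

# JSON-wrapped image upload: field names mirror the example above
# and are not a fixed schema.
image_bytes = b"\xff\xd8\xff\xe0fake-jpeg-bytes"   # placeholder image data

payload = json.dumps({
    "IMG_FILENAME": "HELLO",
    "IMG_TYPE": "IMG/JPEG",
    "DATA": base64.b64encode(image_bytes).decode("ascii"),
})

# The receiving side reverses the steps:
received = json.loads(payload)
assert base64.b64decode(received["DATA"]) == image_bytes
```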
I used the same solution in one of my projects.
The only concern is the request body size. If all your images are small, a few MB each, then you should be fine.
My server is ASP.NET Core; its maxAllowedContentLength value is 30000000, which is approximately 28.6 MB. When the image size exceeds this, the request fails with the error "request body too large".
I think Node.js has a similar setting; make sure to adjust it to meet your needs.
Please note that when the request is too big, the possibility of a request timeout increases accordingly due to the network traffic. This is an issue especially for requests from phones.
I think the use of base64 is valid.
The only doubt is the size of the request, but this can be worked around if you split the base64 on the frontend. For a 30 MB file you could send each request with a 5 MB slice and reassemble the parts on the backend; this is even useful for resuming the upload when a network problem corrupts some part.
Hugs
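The split-and-reassemble idea can be sketched like this (helper names and the chunk size are my own; keeping the chunk a multiple of 4 means each slice is itself valid Base64):

```python
import base64

# Split a Base64 string into slices on the client, rejoin them on
# the server. A chunk size divisible by 4 keeps each slice
# independently decodable.
CHUNK = 4 * 1024

def split_b64(data: bytes, chunk: int = CHUNK) -> list[str]:
    encoded = base64.b64encode(data).decode("ascii")
    return [encoded[i:i + chunk] for i in range(0, len(encoded), chunk)]

def join_b64(parts: list[str]) -> bytes:
    return base64.b64decode("".join(parts))

blob = bytes(range(256)) * 100        # ~25 KB of sample binary data
parts = split_b64(blob)
assert join_b64(parts) == blob        # reassembly is lossless
```

A real resumable upload would also tag each slice with an index and a total count so the backend can detect missing or corrupted parts.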
Base64 converts your data to an ASCII representation of the binary data. It allows you to embed your data in text streams such as JSON for example. Base64 increases the size of the data transferred by 33%.
multipart/form-data is the standard way of transferring binary data in HTTP requests. It allows you to use specific encodings / content types for each part you'd like to transfer. In my opinion, you should stick to multipart uploads unless you have specific requirements or device/SDK capabilities.
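What a multipart/form-data body looks like on the wire can be built by hand with the standard library; a rough sketch (real clients normally delegate this to an HTTP library, and the filename/field names here are placeholders):

```python
import uuid

# Hand-built multipart/form-data body; real clients delegate this
# to an HTTP library, but the wire format is just boundaries and
# per-part headers around the raw bytes.
boundary = uuid.uuid4().hex
file_bytes = b"\x89PNG\r\nfake-png-bytes"          # placeholder image data

body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="file"; filename="pic.png"\r\n'
    "Content-Type: image/png\r\n\r\n"
).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()

headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}

# The raw bytes travel untouched: no Base64, no 33% size penalty.
print(len(body), len(file_bytes))
```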
Check out these links:
What is difference Between Base64 and Multipart?
Base64 image upload VS Binary image upload?

What is the difference between MessagePack, Protobuf and JSON? Can anyone tell me which one to use when?

I need to understand the difference between
- message pack
- protocol buffers
- JSON
Without having dived deeply into the matter, I'd say the following:
All three are data formats that help you serialize information in a structured form so you can easily exchange it between software components (for example client and server).
While I'm not too familiar with the other two, JSON is currently a quasi-standard due to the fact that it is practically built into JavaScript - it's not a coincidence it is called JavaScript Object Notation. The other two seem to require additional libraries on both ends to create the required format.
So when to use which? Use JSON for REST services, for example if you want to publish your API or need different clients to access it. JSON seems to have the broadest acceptance.
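The compactness argument for the binary formats is easy to illustrate. A rough sketch where stdlib `struct` stands in for a real binary serializer like MessagePack or Protobuf (which would also carry field tags or schema information):

```python
import json
import struct

# The same record as JSON text versus a hand-packed binary layout;
# struct is a stand-in for a real binary serializer here.
record = {"id": 12345, "score": 98.5, "active": True}

as_json = json.dumps(record).encode("utf-8")
# <Id? = little-endian: uint32 (4 B) + double (8 B) + bool (1 B)
as_binary = struct.pack("<Id?", record["id"], record["score"], record["active"])

print(len(as_json), len(as_binary))   # the binary form is far smaller
```

The JSON version repeats every field name as text in each record, which is exactly what schema-based formats like Protobuf avoid.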

Protocol Buffers vs XML/JSON for data entry outside of programming effort

I would love to use protocol buffers, but I am not sure if they fit my use case. Here it is:
I have a Quiz app. This requires a bunch of data, like categories, questions, a list of answers (and which ones are correct). I do not want to be responsible for entering this data - I would prefer to pass it off to a non-programmer to serialize all this data for me, in either XML or JSON. Then my app would just read in the data file.
Does Google's Protocol Buffers fit my use case? Or should I stick to a more traditional format like XML or JSON?
I think not: Protobuf is a binary format, so you would need to support a text format like XML or JSON in addition to Protobuf.
Also, it does not seem you would benefit from Protobuf's better performance at all.
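For this use case, a plain JSON file is something a non-programmer can edit directly; a sketch of what the quiz data and its loading could look like (the field names are illustrative, not a fixed schema):

```python
import json

# A quiz-data file a non-programmer could edit by hand; field names
# are illustrative, not a fixed schema.
quiz_json = """
{
  "category": "Geography",
  "questions": [
    {
      "text": "What is the capital of France?",
      "answers": ["Paris", "Lyon", "Marseille"],
      "correct": 0
    }
  ]
}
"""

quiz = json.loads(quiz_json)
first = quiz["questions"][0]
print(first["answers"][first["correct"]])   # Paris
```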

How to generate automatically asn.1 encoded packets?

I want to test my application, and for that I need to generate different loads. The application is a SUPL RRLP protocol parser, and I have the ASN.1 specification for this protocol. Packets have a lot of optional fields, and the number of variants may be over a billion; I can't go through all the options manually, so I want to automate it.
One way is to generate packets automatically; the other is to create many different value-assignment sets and encode each into the binary format.
I found some tools, for example libtasn1 and Asn1Editor, but the first can't parse my existing ASN.1 spec file and the second can't encode packets from a specification.
I'm afraid to write yet another ASN.1 parser myself because I could introduce errors into the test process.
I hoped it would be easy to find something existing, but... I'm capitulating.
Maybe someone on Stack Overflow has faced the same problem and found a solution? Or can recommend something. Thank you.
Please try going to https://asn1.io/asn1playground/ and try your specification there. You can ask it to generate a sample value for a given ASN.1 type. You can encode it and edit either the encoded (hex) data, or decoded values to create additional values.
You can also download a free trial of the OSS ASN.1 Tools from http://www.oss.com/asn1/products/asn1-download.html which includes OSS ASN.1 Studio. This also allows you to generate (and modify) sample values for a given ASN.1 type.
Note that these don't generate thousands of different test values for you automatically, but will parse valid value notation and encode the values for you if you are able to generate valid ASN.1 value notation.