I'm trying to better understand Google web applications. The HTML source contains JSON that has been encoded in some way I don't recognize and would like to decode. For example, the source below contains parameters such as DpimGf and EP1ykd which make no sense to me:
view-source:https://contacts.google.com/
..window.WIZ_global_data = {"DpimGf":false,"EP1ykd":.....
So far I have tried the following:
1. Decoding with a base64 decoder, but the results are unprintable/not usable (sketched below).
2. Decoding with the polyline encoding used in Google Maps.
3. Building an app from scratch that performs base64 -> binary -> XOR -> ASCII and shifts the binary values up to 7 places [inspired by the polyline algorithm].
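For reference, a minimal sketch of attempt 1 in Node.js/TypeScript; the candidate string is a placeholder, not a real value from the page:

// Attempt 1: treat one WIZ_global_data value as base64 and check whether the
// decoded bytes look like printable text. "c29tZXZhbHVl" is a placeholder.
const candidate = "c29tZXZhbHVl";

const decoded = Buffer.from(candidate, "base64");
const printable = decoded.toString("latin1").split("")
  .filter((c) => c >= " " && c <= "~").length;

console.log(decoded.toString("utf8"));
console.log(`printable ratio: ${(printable / decoded.length).toFixed(2)}`);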
Is there any documentation from Google, or a known encoding, for such formats?
Assumptions: I'm pretty sure this is an encoding of some sort and not encryption, because
1) The length of the text doesn't match common encryption algorithms.
2) There is some sort of pattern in the values of the parameters, so there is a good chance it's just encoded without any encryption, since common encryption produces a completely different string each time.
3) There is a good chance the values may not decode at all, because they might be mapped at the server side to meaningful parameters.
Thanks
Try using this: http://ddecode.com/hexdecoder/?results=cddb95fa500e7c1af427d005985260a7. Try running the whole thing through it; it might help.
I was tasked with writing image upload to a remote server and saving those images locally. It was quite easy to do with a Base64 transfer through JSON and storage with Node.js. However, is there a reason not to use this type of file upload and to use AJAX or other ways instead? (Other than the roughly 30% bandwidth increase, which I know about. You can still include that in your answer in order for it to be complete.)
The point of Base64 encoding is to avoid binary data in protocols based on text. Outside that situation, I think it is always a bad idea.
Pros
Avoidance of binary data in protocols based on text, and independence from external files.
Avoidance of delimiter collision.
Cons
Increased time and space complexity; for space it's 33–36% (33% from the encoding itself, up to 3% more from the inserted line breaks). See the sketch after this list.
API response payloads are larger/too large.
User experience is negatively impacted, unless one invokes some lazy loading.
By including all image data together in one API response, the app must receive all data before drawing anything on screen. This means users will see on-screen loading states for longer and the app will appear sluggish as users wait.
This is, however, mitigated with Axios and a lazy loader such as react-lazyload or lazyload.
CDN caching is harder. Unlike image files, the Base64 strings inside an API response cannot be delivered via a CDN cache; the whole API response must be delivered by the CDN. (cf. Don’t use Base64 encoded images on mobile and Why "optimizing" your images with Base64 is almost always a bad idea)
Image caching on the device is no longer possible.
Content management becomes harder on the server side. Most content management tools handle images as binary files, so managing them as Base64 adds the time overhead of encoding/decoding.
No security gain, and engineering overhead to mitigate risks (sanitizing, input validation, escaping). Example of an XSS attack: Preventing XSS with Base64 encoding: The False sense of web application security
The developers of that site might have opted to make the website appear more secure by having cryptic URLs and whatnot. However, that doesn't mean this is security by obscurity.
If their website is vulnerable to SQL injection and they try to hide that by encoding the URLs, then it's security by obscurity. If their website is well secured against SQL injection, XSS, CSRF, etc., and they decided to encode the URLs like that, then it's just plain stupidity.
It does not help with text-encoded images such as SVG (Probably Don’t Base64 SVG).
Data URIs aren't supported on IE6 or IE7, nor on Opera before 7.2 (Which browsers support data URIs and since which version?)
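To put a number on the space overhead mentioned in the first con, a quick Node.js/TypeScript check (any buffer of binary data will do):

// Compare the raw byte length with the length of its base64 representation.
// Every 3 input bytes become 4 output characters, hence the ~33% growth.
import { randomBytes } from "crypto";

const raw = randomBytes(300000);            // ~300 KB of arbitrary binary data
const encoded = raw.toString("base64");

console.log(`raw:      ${raw.length} bytes`);
console.log(`base64:   ${encoded.length} bytes`);
console.log(`overhead: ${((encoded.length / raw.length - 1) * 100).toFixed(1)}%`);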
References
https://en.wikipedia.org/wiki/Base64
https://en.wikipedia.org/wiki/Delimiter#Delimiter_collision
SO: What is base 64 encoding used for?
https://medium.com/snapp-mobile/dont-use-base64-encoded-images-on-mobile-13ddeac89d7c
https://css-tricks.com/probably-dont-base64-svg/
https://security.stackexchange.com/questions/46362/purpose-of-using-base64-encoded-urls
https://bunnycdn.com/blog/why-optimizing-your-images-with-base64-is-almost-always-a-bad-idea/
https://www.davidbcalhoun.com/2011/when-to-base64-encode-images-and-when-not-to/
Data Encoding
Every data encoding or decoding scheme is used for various reasons, and each comes with benefits and downsides.
For example:
Error-detection encodings: can detect errors, but increase data usage.
Encryption: turns data into ciphertext that an intruder cannot decipher.
There are a lot of encoding algorithms that alter data in some way that is useful for a particular purpose.
Base64 encoding, however, encodes every 6 bits of data into one 8-bit character:
3 bytes become 4 bytes, and the output uses only alphanumeric characters (62 distinct symbols) plus 2 signs.
Its benefit is that the output avoids control codes and problematic special characters.
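A tiny sketch of that 3-byte-to-4-character mapping in Node.js/TypeScript:

// Three 8-bit bytes (24 bits) are regrouped into four 6-bit values, each
// mapped to one character of the 64-symbol alphabet (A-Z, a-z, 0-9, +, /).
const bytes = Buffer.from([0x4d, 0x61, 0x6e]);                 // the ASCII string "Man"

console.log(bytes.toString("base64"));                         // "TWFu": 4 characters for 3 bytes
console.log(Buffer.from("TWFu", "base64").toString("utf8"));   // back to "Man"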
Base64 Purpose
It makes it possible to transfer any data over channels that prohibit:
special chars like ' " / \ ...
non-printable ASCII like \0 \n \r \t \a
8-bit ASCII codes (ASCII with the MSB set)
Binary files usually contain bytes which, interpreted as ASCII, can be any 8-bit character, and in some protocols and applications the I/O interfaces only accept a handful of characters (alphanumerics plus a few signs).
This is usually due to:
preventing code injection (e.g. SQL injection, or any characters that look like programming-language syntax, such as ;)
some character already having a meaning in the protocol (e.g. in a URI query string the character & has a meaning and cannot appear in any query-string value)
the input simply not being intended to accept non-alphanumeric values (e.g. it should accept only human names)
But with Base64 encoding you can encode anything and transfer it over any channel you want.
Examples:
you can encode an image or an application binary and store it in a DBMS with SQL
you can include some binary data in a URI
you can send binary files over a protocol designed to accept only human chat as alphanumeric text, like an IRC channel
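A minimal sketch of the URI example, assuming Node.js: plain base64 still contains + / =, so it needs one more pass of URL encoding (or the base64url variant, available in Node 16+):

// Put a small binary payload into a URL query string via base64.
const payload = Buffer.from([0x00, 0xff, 0x10, 0x20]);   // arbitrary bytes, including non-printable ones

const b64 = payload.toString("base64");                  // may contain '+', '/' and '='
console.log(`https://example.com/upload?data=${encodeURIComponent(b64)}`);

// Node's "base64url" encoding avoids the characters that would need escaping:
console.log(payload.toString("base64url"));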
Base64 is just a conversion format: an HTTP server cannot accept binary data in the content unless the HTTP headers declare a binary (or other acceptable) format defined by the web server.
As you might know, JSON can contain various formats and information; thus, you can send something like:
{
  "IMG_FILENAME": "HELLO",
  "IMG_TYPE": "image/jpeg",
  "DATA": "~~~BASE64 ENCODED IMAGE~~~~"
}
You can send the JSON through AJAX or another method. But, as I said, the HTTP server has various limitations because it should comply with RFC 2616 (https://www.rfc-editor.org/rfc/rfc2616).
In short, sending through JSON lets you bundle various data together.
AJAX is just one way of sending it, like any other.
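For illustration, a minimal browser-side sketch of building and sending such a JSON body; the /upload endpoint and field names are placeholders, not a real API:

// Read a file picked by the user, base64-encode it and POST it as JSON.
async function uploadAsJson(file: File): Promise<void> {
  const dataUrl: string = await new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result as string);
    reader.onerror = () => reject(reader.error);
    reader.readAsDataURL(file);                 // yields "data:image/jpeg;base64,...."
  });

  const body = {
    IMG_FILENAME: file.name,
    IMG_TYPE: file.type,
    DATA: dataUrl.split(",")[1],                // strip the "data:...;base64," prefix
  };

  await fetch("/upload", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
}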
I used the same solution in one of my projects.
The only concern is the request body size. If all your images are small, say a few MB, then you should be fine.
My server is ASP.NET Core; its maxAllowedContentLength value is 30000000, which is approximately 28.6 MB. When the image size is over this, the request fails with the error "request body too large".
I think Node.js should have a similar setting; make sure to adjust it to meet your needs.
Please note that when the request size is too big, the possibility of a request timeout increases accordingly due to the network traffic. This will be an issue especially for requests from phones.
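In Node.js with Express, the equivalent knob is the limit option of the JSON body parser; a hedged sketch (the 30mb value and /upload route are just examples):

// Raise the JSON body size limit so large base64 payloads are not rejected
// with HTTP 413 "request entity too large" (Express defaults to 100kb).
import express from "express";

const app = express();
app.use(express.json({ limit: "30mb" }));   // adjust to your largest expected image

app.post("/upload", (req, res) => {
  // req.body.DATA would hold the base64 string here
  res.sendStatus(204);
});

app.listen(3000);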
I think the use of base64 is valid.
The only doubt is the size of the request, but this can be worked around by splitting the Base64 on the frontend: for a 30 MB file you could send each request as a 5 MB part and put the parts together on the backend. This is even useful for "keep downloading" behaviour when a network problem corrupts some part. A rough sketch is below.
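A rough browser-side sketch of that chunking idea; the chunk size, endpoint and field names are illustrative only, and the backend is assumed to reassemble parts by (uploadId, index):

// Split a base64 string into ~5 MB slices and send them one request at a time.
async function uploadInChunks(base64Data: string, uploadId: string): Promise<void> {
  const CHUNK_SIZE = 5 * 1024 * 1024;   // 5 MB of base64 text per request
  const total = Math.ceil(base64Data.length / CHUNK_SIZE);

  for (let index = 0; index < total; index++) {
    const part = base64Data.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);
    await fetch("/upload/chunk", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ uploadId, index, total, part }),
    });
  }
}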
Hugs
Base64 converts your data to an ASCII representation of the binary data. It allows you to embed your data in text streams such as JSON for example. Base64 increases the size of the data transferred by 33%.
multipart/form-data is the standard way of transferring binary data in HTTP requests. It allows you to use specific encodings / content types for each part you'd like to transfer. In my opinion, you should stick to multipart uploads unless you have specific requirements or device/SDK capabilities.
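For comparison, a minimal multipart upload from the browser (the /upload endpoint is a placeholder):

// multipart/form-data upload: the browser sets the Content-Type header
// (including the boundary) automatically when the body is a FormData instance.
async function uploadMultipart(file: File): Promise<void> {
  const form = new FormData();
  form.append("image", file, file.name);   // raw binary part, no base64 inflation

  await fetch("/upload", { method: "POST", body: form });
}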
Check out these links:
What is difference Between Base64 and Multipart?
Base64 image upload VS Binary image upload?
I would like to know whether there is some standard that specifies binary formats using JSON as the describing language, similar to google's protocol buffers.
Protocol buffers seem very powerful but they require parsing of yet another language and considerable overhead, especially for compiled languages such as C++.
So I am wondering whether there is some accepted standard that uses JSON to describe a binary format. (Parsing the binary data might then still require some manual steps, but at least a clear and unique description of the data can be made available.)
To be clear, I am not talking about encoding binary data in JSON, I am talking about describing binary data in JSON.
Head to the ultimate Wikipedia listing and evaluate for yourself. I don't know what the right argument is to overcome your programmer's inertia. I'd consider Apache Avro the best fit for your requirement - it has a JSON description.
For the least friction, you could try MessagePack or BSON, which are essentially JSON, just better packed. But, having no external declaration, they need to be self-describing and must transport the field names on the wire - so they are not as "binary" and compact as Protocol Buffers or Avro.
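To give a feel for the Avro route, a small sketch using the avsc npm package (assuming that dependency; the record layout is invented for illustration - the schema itself is plain JSON):

// Describe a binary record layout in JSON, then encode/decode values with it.
import * as avro from "avsc";

const type = avro.Type.forSchema({
  type: "record",
  name: "SensorReading",
  fields: [
    { name: "id", type: "long" },
    { name: "temperature", type: "float" },
    { name: "tag", type: ["null", "string"], default: null },   // optional field
  ],
});

const buf = type.toBuffer({ id: 42, temperature: 21.5, tag: "lab" });   // compact binary
console.log(buf.length, type.fromBuffer(buf));                          // decode it back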
I want to test my application and I need to generate different loads. The application is a SUPL RRLP protocol parser, and I have the ASN.1 specification for this protocol. Packets have a lot of optional fields and the number of variants may be over a billion - I can't go through all the options manually, so I want to automate it.
One way is to generate packets automatically; the other way is to create a lot of different value-assignment sets and encode each into the binary format.
I found some tools, for example libtasn and Asn1Editor, but the first one can't parse my existing ASN.1 spec file and the second one can't encode packets from a specification.
I'm reluctant to write the thousandth ASN.1 parser myself because I could introduce errors into the test process.
I hoped it would be easy to find something existing, but... I'm capitulating.
Maybe someone on Stack Overflow has faced the same problem and found a solution? Or knows something to recommend. Thank you.
Please try going to https://asn1.io/asn1playground/ and try your specification there. You can ask it to generate a sample value for a given ASN.1 type. You can encode it and edit either the encoded (hex) data, or decoded values to create additional values.
You can also download a free trial of the OSS ASN.1 Tools from http://www.oss.com/asn1/products/asn1-download.html which includes OSS ASN.1 Studio. This also allows you to generate (and modify) sample values for a given ASN.1 type.
Note that these don't generate thousands of different test values for you automatically, but they will parse valid value notation and encode the values for you if you are able to generate valid ASN.1 value notation.
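If one of those tools can consume value notation in batch, a small generator can cover many optional-field combinations; a hedged sketch in TypeScript, where the type name RRLP-Component and the field names are placeholders for the ones in your actual specification:

// Emit ASN.1 value notation for random combinations of optional fields.
function randomValueNotation(i: number): string {
  const fields: string[] = [];
  if (Math.random() < 0.5) fields.push(`  referenceNumber ${i % 8}`);
  if (Math.random() < 0.5) fields.push(`  accuracy ${Math.floor(Math.random() * 128)}`);

  return `value${i} RRLP-Component ::= {\n${fields.join(",\n")}\n}`;
}

for (let i = 0; i < 10; i++) {
  console.log(randomValueNotation(i));
}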
I am working on TinyOS using MicaZ sensors on the ZigBee platform. I am also using KillerBee to analyze the data packets. Can anyone suggest how to read those hexadecimal values?
The node IDs I am assigning while burning the nodes are not seen in the data packets at all.
The best tool I've found for analyzing ZigBee / 802.15.4 packets is Wireshark. You just need to figure out a way of getting the raw packets into pcap format. I've got a blog post on adapting the Microchip ZENA analyzer to output pcap format here:
http://jdesbonnet.blogspot.com/2011/02/using-microchip-zena-zigbee802154.html
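If your capture tool only hands you raw 802.15.4 frames, writing the pcap file yourself is not much work; a minimal Node.js/TypeScript sketch (link type 195 is IEEE 802.15.4 with FCS, which Wireshark understands):

// Write raw 802.15.4 frames into a classic pcap file that Wireshark can open.
import { writeFileSync } from "fs";

function toPcap(frames: Buffer[]): Buffer {
  const global = Buffer.alloc(24);
  global.writeUInt32LE(0xa1b2c3d4, 0);   // magic number
  global.writeUInt16LE(2, 4);            // version major
  global.writeUInt16LE(4, 6);            // version minor
  global.writeUInt32LE(0, 8);            // thiszone
  global.writeUInt32LE(0, 12);           // sigfigs
  global.writeUInt32LE(65535, 16);       // snaplen
  global.writeUInt32LE(195, 20);         // LINKTYPE_IEEE802_15_4_WITHFCS

  const records = frames.map((frame) => {
    const header = Buffer.alloc(16);
    const now = Date.now();
    header.writeUInt32LE(Math.floor(now / 1000), 0);   // ts_sec
    header.writeUInt32LE((now % 1000) * 1000, 4);      // ts_usec
    header.writeUInt32LE(frame.length, 8);             // incl_len
    header.writeUInt32LE(frame.length, 12);            // orig_len
    return Buffer.concat([header, frame]);
  });

  return Buffer.concat([global, ...records]);
}

// Usage: writeFileSync("capture.pcap", toPcap(capturedFrames));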