Convert binary numbers into hexadecimal numbers using shell or Perl

I have a file that contains a binary number on each line.
I have to convert each binary number in the file into hexadecimal.
The file looks like this:
10101010101010101111
11010101010111110011
11011111110000000000
10010101111110010010
How can I convert these numbers into hexadecimal numbers?
If there is a gvim command for this, that would also be useful.
Thank you.
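Not a gvim answer, but a minimal sketch of the conversion logic in Python, assuming one binary number per line in a file named input.txt (the filename is only an assumption for illustration):

# Sketch: convert one binary number per line to a hexadecimal number.
with open("input.txt") as f:
    for line in f:
        line = line.strip()
        if line:
            # int(..., 2) parses the binary string; format(..., "x") prints it in hex
            print(format(int(line, 2), "x"))   # e.g. 10101010101010101111 -> aaaaf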

Related

Is there such a thing as "non-binary" data?

When you get down to the bare metal, all data is stored in bits, which are binary (1 or 0). However, I sometimes see terms like "binary file", which implies the existence of files that aren't binary. The same goes for things like base64 encoding, which Wikipedia describes as a "binary-to-text encoding scheme". But if I'm not mistaken, text is also stored in a binary format on the hardware, so isn't base64 encoding ultimately converting binary to binary? Is there some other definition of "binary" I am unaware of?
You are right that deep down, everything is a binary file. However, at its base, a binary file is intended to be read as an array of bytes, where each byte has a value between 0 and 255. A text file is intended to be read as an array of characters.
When, in Python, I open a file with open("myfile", "r"), I am telling it that I expect the underlying file to contain characters, and asking Python to do the necessary processing to give me characters. It may convert multiple bytes into a single character. It may canonicalize all possible newline combinations into just a single newline character. Some characters have multiple byte representations, but all of them will give me the same character.
When I open a file with open("myfile", "rb"), I literally want the file read byte by byte, with no interpretation of what it sees.
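A small illustration of the difference (a minimal sketch; "myfile" is just the placeholder name from the answer above):

# Reading the same file in text mode vs. binary mode in Python.
with open("myfile", "r") as f:
    text = f.read()   # str: decoded characters, newlines normalized to "\n"
with open("myfile", "rb") as f:
    raw = f.read()    # bytes: raw byte values 0-255, no interpretation
print(type(text), type(raw))   # <class 'str'> <class 'bytes'>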

How can I convert a binary file to a base 2 representation in Linux?

I can convert a binary file to an ASCII base64 representation as follows:
base64 piglet_squid.jpg > piglet_squid.txt
A small segment of the resulting file could be something like the following:
LY61c8T1LeF+GsbJ2FhEjnNW0NoXFS0qi2MEKKyPErh13jSWFU+Xp8uRxu6Cqxzcn8cfCJxrYrwj
wx9PYNy+ZvXfUzHmsH7XKNBaTULLsXAHtXqbmtSze8knfeHKL0+a8R9qn13iniFuwilbK8x9K+9d
PMUXGvsxxX2sufZnxO9rrTK5221Bk9jWppTa8T8R3Ok6e3/hjox715M+KabSrb8M0jejC3bg6/Fe
How could I convert that same file to a base 2 representation? A small segment of the resulting file could be something like the following:
0101000111001011101010001010010110101001010010110111110101001000101010010100
0001010101001010101010010010011010101011101010110101001101110000000110011010
0100111010111111010100100010100001011010101010111010111010000101010010110101
My preference is to do this using very standard Linux utilities.
(echo obase=2; hexdump -v -e '/1 "%u\n"' piglet_squid.jpg) | bc | xargs printf %08i | fold -w 64
This pipeline converts the file piglet_squid.jpg to a base 2 representation on standard output: hexdump prints each byte as a decimal value on its own line, bc (primed with obase=2) converts each value to base 2, printf pads each result to 8 binary digits, and fold wraps the stream into lines. The line length can be altered by specifying another width with fold -w.
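For comparison, here is a rough sketch of the same idea in Python (not one of the standard utilities asked for, just to make the steps explicit; the filename is the one from the question):

# Sketch: print every byte of the file as 8 binary digits, 64 digits per line.
with open("piglet_squid.jpg", "rb") as f:
    data = f.read()
bits = "".join(format(byte, "08b") for byte in data)
for i in range(0, len(bits), 64):
    print(bits[i:i + 64])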

Check whether a file is in binary or ASCII format in AS3

I read a file in ActionScript; that kind of file may be in either ASCII or binary format.
How can I check which format is used when reading it?
Regards.
You could read the byte values from the file and make a guess based on the values you see. If most of the bytes fall in the printable ASCII range (roughly 32-126), plus common whitespace values such as 9, 10, and 13, it is very likely an "ASCII" (text) format. Check www.asciitable.com and pick the ASCII codes that you expect will and will not appear in a text file versus a binary file.
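The question is about ActionScript 3, but as a language-neutral sketch of that heuristic, something like the following Python could work (the sample size and the 0.95 threshold are arbitrary assumptions):

# Sketch: guess whether a file is text or binary by sampling byte values.
TEXT_BYTES = set(range(32, 127)) | {9, 10, 13}   # printable ASCII plus tab/LF/CR

def looks_like_text(path, sample_size=4096, threshold=0.95):
    with open(path, "rb") as f:
        sample = f.read(sample_size)
    if not sample:
        return True                      # empty file: call it text
    printable = sum(b in TEXT_BYTES for b in sample)
    return printable / len(sample) >= threshold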

How to treat a Tcl string as a hex number and convert it into binary?

I have a Tcl string set in a variable. I want to treat it as hex and convert it into binary. Can anybody help me achieve this?
Here is what I am doing:
$ /usr/bin/tclsh8.5
% set a a1a2a3a4a5a6
a1a2a3a4a5a6
% set b [ string range $a 0 3 ]
a1a2
Now I want the a1a2 value of variable "b" to be treated as 0xa1a2, so that I can convert it into binary. Please help me solve this.
If you are using Tcl 8.6, then binary decode hex is the best choice:
binary decode hex $b
If you are using an older version of Tcl, then you have to use binary format with the H format specifier:
binary format H* $b
You can write the resulting byte array to a file or send it through a socket, etc., but if you want to display it as text, I suggest converting it to a string first:
encoding convertfrom utf-8 [binary format H* $b]

CL-JSON encodes Unicode chars by outputting their Unicode escape string in ASCII format. How can I override this?

I am using CL-JSON to encode an object. It spits out the encoded string in ASCII format, and the non-ASCII chars are written out as sequences of ASCII chars in "\uXXXX" form. The result is that even if I open the output file stream with external format :utf-8, the file contains only ASCII chars. When I try to view it with, for example, Notepad++, I cannot convert it to Unicode, because now all the data is just ASCII (even the "\uXXXX" sequences). I would like either to know if there is an editor that will automatically convert the file to Unicode and recognize those escape sequences, or if there is a way to tell CL-JSON to keep the output characters in Unicode. Any ideas?
EDIT: here is some more info:
CL-USER> (with-open-file (out "dump.json"
                              :direction :output
                              :if-does-not-exist :create
                              :if-exists :overwrite
                              :external-format :utf-8)
           (json:encode-json '("abcd" "αβγδ") out)
           (format out "~%"))
CL-USER>(quit)
bash$ file dump.json
dump.json: ASCII text
bash$ cat dump.json
["abcd","\u03B1\u03B2\u03B3\u03B4"]
bash$ uname -a
Linux suse-server 3.0.38-0.5-default #1 SMP Fri Aug 3 09:02:17 UTC 2012 (358029e) x86_64 x86_64 x86_64 GNU/Linux
bash$ sbcl --version
SBCL 1.0.50
bash$
EDIT2:
YASON does what I need, outputting chars without escaping them in \uXXXX format, but unfortunately it lacks features that I need, so it is not an option.
I know this is a temporary solution, but I changed the CL-JSON source by redefining the appropriate function so that it does not Unicode-escape characters outside the ASCII range. The function is named write-json-chars and it resides in the file encoder.lisp in the sources.