I've made this small experimental program in Arduino to see how the functions lowByte() and highByte() work. What exactly are they supposed to return when passed a value?
On entering the character '9' in the serial monitor it prints the following:
9
0
218
255
How does that happen? Also, the last two lines are printed for every value I enter. Why is this happening?
int i = 12;

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  if (Serial.available())
  {
    i = Serial.read() - '0';   // conversion of character to number, e.g. '9' becomes 9
    Serial.print(lowByte(i));  // send the low byte
    Serial.print(highByte(i)); // send the high byte
  }
}
If you have this data:
10101011 11001101 // original
// highByte() gets:
10101011
// lowByte() gets:
11001101
An int is a 16-bit integer on AVR-based Arduino boards (such as the Uno), so you are reading the high half and the low half of that value, each as a byte.
The actual serial buffer contains "9\n", so on the next pass loop() reads the newline character (ASCII 10); 10 - '0' is -38, which is 0xFFDA as a 16-bit two's-complement int. That is why the second pair of lines shows the 'funny' numbers 218 and 255 for every input.
If you want to send the raw byte value rather than its decimal text, use Serial.write():
Serial.write(lowByte(i));
(The older Serial.print(value, BYTE) form was removed in Arduino 1.0.)
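To make the numbers concrete, here is a small sketch (not part of the original code; the values follow from the 16-bit int explanation above):

int i = '\n' - '0';           // 10 - 48 = -38, stored as 0xFFDA in a 16-bit int
Serial.println(lowByte(i));   // prints 218, same as (i & 0xFF)
Serial.println(highByte(i));  // prints 255, same as ((i >> 8) & 0xFF)
Serial.write(lowByte(i));     // sends the raw byte 0xDA instead of the text "218"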
In addition to Rafalenfs' answer: should you provide a larger data type:
00000100 10101011 11001101 // original
// highByte() will NOT return 00000100, but will return:
10101011
// lowByte() will still return:
11001101
highByte() returns the second-lowest byte, not the highest one (as specified by the documentation: https://www.arduino.cc/reference/en/language/functions/bits-and-bytes/highbyte/).
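For example, with a 32-bit long holding the bit pattern above (a small sketch, not from the original answer):

long big = 0x0004ABCD;          // 00000100 10101011 11001101
Serial.println(highByte(big));  // 171 (0xAB): the second-lowest byte, not 00000100
Serial.println(lowByte(big));   // 205 (0xCD): the lowest byte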
I am trying to figure out how to print a certain value in a driver. In my case it is a ULONG value. At https://www.osronline.com/showthread.cfm?link=187470, it states that one should use the %U format specifier. So I have the following code (only the relevant parts):
ULONG value;
value = 5;
DbgPrint("The value is: %U", value);
Compiling and loading works fine. But the "DbgView" output is not what I expected as you can see below:
The value is U
I hope someone can help. Thanks in advance.
Best regards
That advice is wrong; %U is not a valid format specifier.
Per https://learn.microsoft.com/en-us/cpp/c-runtime-library/format-specification-syntax-printf-and-wprintf-functions?view=vs-2017#type-field-characters
To format a ULONG, please use:
%u for an unsigned decimal integer.
%x for an unsigned hexadecimal integer (uses "abcdef").
%X for an unsigned hexadecimal integer (uses "ABCDEF").
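A minimal corrected sketch of the DbgPrint call (only the format string changes; ULONG is a 32-bit unsigned long on Windows, so %u works, and %lu would also be correct):

ULONG value = 5;
DbgPrint("The value is: %u\n", value);    // prints: The value is: 5
DbgPrint("The value is: 0x%X\n", value);  // prints: The value is: 0x5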
0x5537f99e000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000072268656c6c6f2200000000000000000000000000000000000000000000000000
5537f99e identifies the function, which is 'setstring'.
2268656c6c6f22 is the argument to the function, which is 'hello'.
Please explain how this raw data sent to an Ethereum contract is constructed. I'm confused by those offsets.
You can find the reference here https://solidity.readthedocs.io/en/develop/abi-spec.html
If your function is
function setstring(string string_value) {
}
First 4 bytes: 0x5537f99e
These are derived as the first 4 bytes of the Keccak hash of the ASCII form of the signature setstring(string).
Next 32 bytes: 0x0000000000000000000000000000000000000000000000000000000000000020
This is the location of the data part of your string_value, measured in bytes from the start of the arguments block; here it points to the very next 32-byte block.
Next 32 bytes:
0000000000000000000000000000000000000000000000000000000000000007
This is the length of your string in bytes: 7 (the bytes sent are "hello" including the surrounding quote characters, which is why it is 7 rather than 5).
Next 32 bytes:
2268656c6c6f2200000000000000000000000000000000000000000000000000
These are the contents of the string, encoded in UTF-8 (0x22 0x68 0x65 0x6c 0x6c 0x6f 0x22 is '"hello"') and right-padded with zeros to a full 32-byte word.
How does this denary to binary program work? I am finding it hard to comprehend what is happening behind the code.
Can someone explain line 6 onwards?
Number = int(input("Hello. \n\nPlease enter a number to convert: "))
if Number < 0:
    print("Can't be less than 0")
else:
    Remainder = 0
    String = ""
    while Number > 0:
        Remainder = Number % 2
        Number = Number // 2
        String = str(Remainder) + String
    print(String)
The idea is to separate out the last part of the binary number, stick it in a buffer, and then remove it from "Number". The method is general and can be used for other bases as well.
Start by looking at it as a dec -> dec "conversion" to understand the principle.
Let's say you have the number 174 (base 10). If you want to parse out each individual piece (read: "digit") of it, you can calculate the number modulo the base (10), then do an integer division to "remove" that digit from the number. That is, 174 % 10 and 174 // 10 => (Number) 17 | 4 (Remainder). On the next iteration you have 17 from the division, and the same procedure splits it into 1 | 7. On the iteration after that you get 0 | 1, and then "Number" is 0, which is the exit condition for the loop (while Number > 0).
In each iteration of the loop you take the remainder (which will be a single digit for the specific base you use (it's a basic property of how bases work)), convert it to a string and concatenate it with the string you had from previous iterations (note the order in the code!), and you'll get the converted number once you've divided your way down to zero.
As mentioned before, this works for any base; you can use base 16 to convert to hex (though you'll need to do some translations for digits above 9), octal (base 8), etc.
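To make the generalisation concrete, here is a small sketch of the same divide-and-take-remainder idea for any base up to 16, written in C++ (the helper name toBase is mine, not from the original code):

#include <iostream>
#include <string>

// Convert an unsigned number to a string of digits in the given base (2..16).
std::string toBase(unsigned int number, unsigned int base) {
    const char digits[] = "0123456789abcdef";
    if (number == 0) return "0";
    std::string result;
    while (number > 0) {
        result = digits[number % base] + result;  // prepend the remainder, as in the Python code
        number /= base;
    }
    return result;
}

int main() {
    std::cout << toBase(174, 2) << "\n";   // 10101110
    std::cout << toBase(174, 16) << "\n";  // ae
}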
Python code for converting denary into binary
denary = int(input('Denary: '))
binary = [0, 0, 0, 0]  # four bit positions, so this only handles values 0-15

while denary > 0:
    for n, i in enumerate(binary):
        if denary // (2 ** (3 - n)) >= 1:
            binary[n] = 1
            denary -= 2 ** (3 - n)

print(denary)
print(binary)
I am working with FastCgi, trying to generate a dynamic html webpage.
I am able to get the QUERY_STRING easily enough, but I am having trouble trying to copy it into a char array.
If there is an even shorter way of just getting the value from QUERY_STRING, please advise, because I am a little in over my head.
char *queryString = getenv(ENV_VARS[7]);
char newDeviceName[64];
strncpy( newDeviceName, *queryString, sizeof(*queryString) -1);
printf("------- %c ------------", newDeviceName);
This compiles with only warnings, but once I try to load the webpage, the output is some weird Chinese-looking characters: �ፙ�
Thank you in advance.
EDIT: More of my code
const char *ENV_VARS[] = {
"DOCUMENT_ROOT",
"HTTP_COOKIE",
"HTTP_HOST",
"HTTP_REFERER",
"HTTP_USER_AGENT",
"HTTPS",
"PATH",
"QUERY_STRING",
"REMOTE_ADDR",
"REMOTE_HOST",
"REMOTE_PORT",
"REMOTE_USER",
"REQUEST_METHOD",
"REQUEST_URI",
"SCRIPT_FILENAME",
"SCRIPT_NAME",
"SERVER_ADMIN",
"SERVER_NAME",
"SERVER_PORT",
"SERVER_SOFTWARE"
};
int main(void)
{
    char deviceName[] = ADAPTERNAME;
    time_t t;

    /* Initializes random number generator */
    srand((unsigned) time(&t));

    while (FCGI_Accept() >= 0) {
        printf("Content-type: text/html \r\n\r\n");
        printf("");
        printf("<html>\n");
        printf("<script src=\"/js/scripts.js\"></script>");

        /* CODE CODE CODE */

        printf("<p> hi </p>");
        printf("<p> hi </p>");

        char *queryString = getenv(ENV_VARS[7]);
        char newDeviceName[64];

        if (queryString == NULL)
            printf("<p> +++++ERROR++++++ </p>");
        else {
            strcpy(newDeviceName, queryString);
            newDeviceName[sizeof(newDeviceName) - 1] = 0;
            printf("<p> ------- %s ------------ </p> ", newDeviceName);
        }
SOLVED: amateur mistake; for some reason none of my edits took effect until I restarted my lighttpd server.
Your program has undefined behavior. Read those warnings issued by the compiler. They're important.
Don't dereference the pointer when you're passing the string to strncpy(). When you do that, you're now passing a single char. That's converted to a pointer when it's given to strncpy() (which is where you probably get your warning, i.e. passing a char to a function that expects a char*).
You also can't get the size of an array that has decayed to a pointer using sizeof. You're just getting the size of the pointer (which is probably either 8 or 4 bytes depending on your system). Since you don't know the length of the string anyway, it might even be better to just use strcpy() instead of strncpy().
Here's what your code probably should look like:
char *queryString = getenv(ENV_VARS[7]);
char newDeviceName[64];
strcpy( newDeviceName, queryString);
printf("------- %s ------------", newDeviceName); /* use %s to print strings */
The length on your strncpy is wrong [too short], the second argument is wrong, and the format string is incorrect.
Try this:
strncpy( newDeviceName, queryString, sizeof(newDeviceName) - 1);
newDeviceName[sizeof(newDeviceName) - 1] = 0;
printf("------- %s ------------", newDeviceName);
In the call to strncpy, it expects a char * for the second argument, but you pass it a char.
Also, the size is not correct. *queryString is a char and has size 1. Using sizeof(queryString) is not correct either because it will return the size of a pointer. What you actually want is the size of the destination buffer.
In the printf call the %c format specifier expects a char but you pass it a char *. You should instead use %s which expects a char * pointing to a null terminated string.
So what you want to do is this:
strncpy( newDeviceName, queryString, sizeof(newDeviceName) -1);
newDeviceName[sizeof(newDeviceName) - 1] = 0;
printf("------- %s ------------", newDeviceName);
What you want is
strncpy(newDeviceName, queryString, sizeof(newDeviceName)-1);
newDeviceName[63] = '\0'; // Guarantee NUL terminator
printf("----- %s -----", newDeviceName);
So multiple problems:
*queryString just gets you the first character, which strncpy tries to treat as a pointer.
sizeof(*queryString) is the size of a char (i.e. 1)
%c prints a single character, not the string
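As a side note, a sketch of an alternative that bounds and terminates the copy in one call, using snprintf (not suggested by any of the answers above; it assumes queryString has already been checked for NULL):

char newDeviceName[64];
snprintf(newDeviceName, sizeof(newDeviceName), "%s", queryString); /* copies at most 63 chars and always NUL-terminates */
printf("------- %s ------------", newDeviceName);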
I am writing a sketch for Arduino that aims to convert a text string into binary 7-bit or 8-bit ASCII. For example, "Hello World" would become this 8-bit ASCII binary stream:
0100100001100101011011000110110001101111001000000111011101101111011100100110110001100100
As you can see, this is standard 7-bit ASCII padded with zeros to make it 8-bit ASCII. I don't mind which bit length I use as long as it's consistent once I've started. I've spent a couple of hours trying to work out a method to achieve that to no avail. The closest I have is something like this:
char text[] = "Hello world";
which when printed to the monitor like this:
Serial.println(text[0], BIN);
gives me 1001000. However, this isn't padded at all (so "0" would simply be 0, not 0000000), and obviously this doesn't provide me with anything to work with, just something to look at! Does anyone have any advice for me?
You can use this as a starting point:
char inputChar = 'H';
// This will 'output' the binary representation of 'inputChar' as 8 characters of '1's and '0's, MSB first.
// (output() is a placeholder for however you want to emit each character, e.g. Serial.print.)
for (uint8_t bitMask = 128; bitMask != 0; bitMask = bitMask >> 1) {
    if (inputChar & bitMask) {
        output('1');
    } else {
        output('0');
    }
}
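Building on that, a minimal complete sketch that walks a whole string and pads every character to 8 bits (the helper name printPaddedBinary is mine, not part of the answer above):

char text[] = "Hello world";

// Print one character as exactly 8 '0'/'1' characters, MSB first.
void printPaddedBinary(char c) {
  for (uint8_t mask = 0x80; mask != 0; mask >>= 1) {
    Serial.print((c & mask) ? '1' : '0');
  }
}

void setup() {
  Serial.begin(9600);
  for (size_t i = 0; text[i] != '\0'; i++) {
    printPaddedBinary(text[i]);
  }
  Serial.println();
}

void loop() {
}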