Byte in ActionScript 3? (from C++ to AS3) - actionscript-3

How do I convert this C++ code to AS3?
void myFunc(BYTE type)
{
    // send type to the network server...
}

There is no such type as BYTE in ActionScript 3, but you can use int instead. It will look something like this:
var socket:flash.net.Socket;
// ...
function myFunc(type:int):void {
    socket.writeByte(type);
}
As the Socket docs say: "The low 8 bits of the value are used; the high 24 bits are ignored." So only 8 bits will be written to the socket, just as expected with BYTE in C++.
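For illustration only (this snippet is not part of the original answer): here is what that truncation means for the value the C++ side ends up seeing.
#include <cstdint>
#include <cassert>

int main() {
    // writeByte(value) keeps only the low 8 bits, so the byte that reaches the
    // C++ receiver is value & 0xFF.
    int value = 300;  // what the AS3 side passes to writeByte()
    std::uint8_t received = static_cast<std::uint8_t>(value & 0xFF);
    assert(received == 44);  // 300 & 0xFF == 44
    return 0;
}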

Related

Can you help me make sense of this class constructor? (Adafruit_ATParser)

I am building a device for my research team. To briefly describe it, this device uses a motor and load sensor connected to an Arduino to apply a rotational force to a corn stalk and record the resistance of the stalk. We are in the process of building Bluetooth into the device. We are using this BT module.
We have a BLE GATT service with 2 characteristics for storing data and 1 for holding the command, which is an integer that will be read by the device and acted on. Reading the command characteristic is where we encounter our problem.
void get_input() {
    uint16_t bufSize = 15;
    char inputBuffer[bufSize];
    bleParse = Adafruit_ATParser(); // Throws error: 'bleParse' was not declared in this scope
    bleParse.atcommandStrReply("AT+GATTCHAR=3", &inputBuffer, bufSize, 1000);
    Serial.print("input:");
    Serial.println(inputBuffer);
}
The functions I am trying to use are found in the library for the module, in Adafruit_ATParser.cpp:
/******************************************************************************/
/*!
    @brief Constructor
*/
/******************************************************************************/
Adafruit_ATParser::Adafruit_ATParser(void)
{
  _mode    = BLUEFRUIT_MODE_COMMAND;
  _verbose = false;
}

/******************************************************************************/
/*!
    @brief  Send an AT command and get multiline string response into
            user-provided buffer.
    @param[in] cmd      Command
    @param[in] buf      Provided buffer
    @param[in] bufsize  Buffer size
    @param[in] timeout  Timeout in milliseconds
*/
/******************************************************************************/
uint16_t Adafruit_ATParser::atcommandStrReply(const char cmd[], char* buf, uint16_t bufsize, uint16_t timeout)
{
  uint16_t result_bytes;
  uint8_t  current_mode = _mode;

  // switch mode if necessary to execute command
  if ( current_mode == BLUEFRUIT_MODE_DATA ) setMode(BLUEFRUIT_MODE_COMMAND);

  // Execute command with parameter and get response
  println(cmd);
  result_bytes = this->readline(buf, bufsize, timeout, true);

  // switch back if necessary
  if ( current_mode == BLUEFRUIT_MODE_DATA ) setMode(BLUEFRUIT_MODE_DATA);

  return result_bytes;
}
None of the examples in the library use this; they all create their own parsers. For example, the neopixel_picker example sketch has a file called packetParser.cpp which I believe retrieves data from the BT module for that specific sketch, but it never includes or uses Adafruit_ATParser. There are no examples of this constructor anywhere and I cannot figure out how to use it. I have tried these ways:
bleParse = Adafruit_ATParser();
Adafruit_ATParser bleParse();
Adafruit_ATParser();
ble.Adafruit_ATParser bleParse();
Note: ble is an object representing the serial connection between the Arduino and the BT module, created with:
SoftwareSerial bluefruitSS = SoftwareSerial(BLUEFRUIT_SWUART_TXD_PIN, BLUEFRUIT_SWUART_RXD_PIN);
Adafruit_BluefruitLE_UART ble(bluefruitSS, BLUEFRUIT_UART_MODE_PIN,BLUEFRUIT_UART_CTS_PIN, BLUEFRUIT_UART_RTS_PIN);
Can anyone give me a clue on how to use the Adafruit_ATParser() constructor? Also, if the constructor has no reference to the ble object, how does it pass AT commands to the BT module?
I know this is a big ask, I appreciate any input you can give me.
Like this
Adafruit_ATParser bleParse;
You were closest with this one: Adafruit_ATParser bleParse();. This is a common beginner mistake because it looks right. Unfortunately it declares a function bleParse which takes no arguments and returns an Adafruit_ATParser object (the so-called "most vexing parse").
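To illustrate the difference (a minimal sketch; the names parserObject and parserFunc are made up for this example):
Adafruit_ATParser parserObject;   // defines a default-constructed Adafruit_ATParser object
Adafruit_ATParser parserFunc();   // "most vexing parse": declares a function returning an Adafruit_ATParser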
I can't answer the second question.
EDIT
I've taken the time to have a look at the code. This is what I found
class Adafruit_BluefruitLE_UART : public Adafruit_BLE
{
and
class Adafruit_BLE : public Adafruit_ATParser
{
what this means is that the Adafruit_BluefruitLE_UART class is derived from the Adafruit_BLE class, which in turn is derived from the Adafruit_ATParser class. Derivation means that any public methods of Adafruit_BLE (and of Adafruit_ATParser) can also be used on an Adafruit_BluefruitLE_UART object. You already have an Adafruit_BluefruitLE_UART object (you called it ble), so you can just use the method you want on that object.
SoftwareSerial bluefruitSS = SoftwareSerial(BLUEFRUIT_SWUART_TXD_PIN, BLUEFRUIT_SWUART_RXD_PIN);
Adafruit_BluefruitLE_UART ble(bluefruitSS, BLUEFRUIT_UART_MODE_PIN,BLUEFRUIT_UART_CTS_PIN, BLUEFRUIT_UART_RTS_PIN);
ble.atcommandStrReply( ... );
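Applied to the get_input() function from the question, that would look roughly like this (a sketch only; it reuses the ble object, characteristic index, and buffer size from the question, and passes inputBuffer rather than &inputBuffer so the argument matches the char* parameter of atcommandStrReply):
void get_input() {
    uint16_t bufSize = 15;
    char inputBuffer[bufSize];
    // ble is (indirectly) an Adafruit_ATParser, so call the method on it directly
    ble.atcommandStrReply("AT+GATTCHAR=3", inputBuffer, bufSize, 1000);
    Serial.print("input:");
    Serial.println(inputBuffer);
}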

Why does vector.push_back(System::Byte) not compile any more in VC++ 14.29 (C++/CLI)

I have the following code that used to compile and work fine:
std::vector<unsigned char> marshal_as(cli::array<System::Byte>^ const& from)
{
    std::vector<unsigned char> result;
    result.reserve(from->Length);
    for (int i = 0; i < from->Length; i++)
    {
        result.push_back(from[i]);
    }
    return result;
}
After updating Visual Studio to version 16.10 - which updates the C++ compiler to version 14.29 - the code produces an error:
error C2664: 'void std::vector<unsigned char,std::allocator<_Ty>>::push_back(const _Ty &)':
    cannot convert argument 1 from 'unsigned char' to 'const _Ty &'
    with [ _Ty=unsigned char ]
message: An object from the gc heap (element of a managed array) cannot be converted to a native reference
message: see declaration of 'std::vector<unsigned char,std::allocator<_Ty>>::push_back'
    with [ _Ty=unsigned char ]
Changing the code in the loop body to
unsigned char b = from[i];
result.push_back(b);
fixes the problem.
I would like to understand the cause of this error. Is this somehow related to a change due to the C++ 20 standard?
Is this somehow related to a change due to the C++ 20 standard?
No. While std::vector<>::push_back() has subtly changed in C++20, it's not a change that materially affects what's going on here; the issue is definitely CLR-specific.
I would like to understand the cause of this error.
This is almost certainly (see below) an error that was always present in your code, but was not being reported by previous versions of the C++/CLI compiler.
Consider the following function:
void foo(const int& v) {
    const int* ptr = &v;
    // store ptr somewhere, long-term.
}
It's obvious that invoking foo() with a reference to a gc-backed int would be a recipe for disaster. Yet that's exactly what result.push_back(from[i]); does.
Your code "works" only because push_back() happens not to do anything with its parameter that would cause a problem. However, the compiler is not supposed to know that.
N.B. I say almost certainly because I'm having a heck of a time tracking down the call signature for cli::array<T>::operator[](std::size_t) const. It's not impossible that it used to return a T and now returns const T%.
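For reference, here is the workaround already described in the question, applied to the whole function (a sketch; the local unsigned char is a native object, so push_back's const reference parameter binds to native memory rather than to the gc heap):
#include <vector>

std::vector<unsigned char> marshal_as(cli::array<System::Byte>^ const& from)
{
    std::vector<unsigned char> result;
    result.reserve(from->Length);
    for (int i = 0; i < from->Length; i++)
    {
        unsigned char b = from[i];  // copy the managed element to a native local...
        result.push_back(b);        // ...so the const& parameter binds to native memory
    }
    return result;
}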

Solidity Assembly, the mstore function, and the width of a word in bytes

I'm learning Solidity Assembly and I'm confused about something. I'm looking at this library called Seriality. Specifically, this function: https://github.com/pouladzade/Seriality/blob/master/src/TypesToBytes.sol#L21
function bytes32ToBytes(uint _offst, bytes32 _input, bytes memory _output) internal pure {
    assembly {
        mstore(add(_output, _offst), _input)
        mstore(add(add(_output, _offst), 32), add(_input, 32))
    }
}
That function bytes32ToBytes takes a bytes32 variable and stores it in a dynamically sized bytes array, starting at the offset passed in.
The thing that confuses me is that it uses the mstore function twice. But the mstore function stores a word, which is 32 bytes, right? So why is it called twice, given that the input is 32 bytes? Wouldn't calling it twice store 2 words, which is 64 bytes?
Thanks!
Solidity arrays are laid out by writing the size of the array to the first slot and then writing the data to the subsequent slots.
Knowing that mstore has the following parameters: mstore(START_LOCATION, ITEM_TO_STORE), the first mstore statement is written as follows:
mstore(add(_output, _offst), _input)
Since the first slot of the array holds the size of the array, this statement is setting the size of _output. You should be able to get the same result by replacing it with mstore(add(_output, _offst), 32) (since the size of _input is static).
The second statement (mstore(add(add(_output, _offst),32), add(_input,32))) is the one that writes the data itself. Here, we are shifting the position of both pointers by 32 bytes (as the first 32 bytes for both arrays are pointing to the size) and storing the value of _input to where the data is stored for _output.
Chances are, _output will already be initialized before calling this method (so the length will already be set), so the first mstore will usually be unnecessary. But it doesn't hurt. Note that a similar implementation making this assumption would look like this:
function test() public pure returns (bytes) {
    bytes32 i = "some message";
    bytes memory o = new bytes(32); // Initializing this way sets the length at the location "o" points to. This replaces mstore(add(_output, _offst), _input).
    bytes32ToBytes(0, i, o);
    return o;
}

function bytes32ToBytes(uint _offst, bytes32 _input, bytes memory _output) internal pure {
    assembly {
        mstore(add(add(_output, _offst), 32), add(_input, 32))
    }
}
I'm not sure about the intention of the function bytes32ToBytes. If it is meant to turn a bytes32 into a bytes, I think the right implementation should be:
pragma solidity ^0.7.0;

contract DecodeEncode {
    function test() public pure returns (bytes memory) {
        bytes32 i = "some message";
        bytes memory o = new bytes(32); // Initializing this way sets the length at the location "o" points to. This replaces mstore(add(_output, _offst), _input).
        bytes32ToBytes(0, i, o);
        return o;
    }

    function bytes32ToBytes(uint _offst, bytes32 _input, bytes memory _output) internal pure {
        assembly {
            mstore(add(_output, _offst), 32)               // lineA
            mstore(add(add(_output, _offst), 32), _input)  // lineB
        }
    }
}
lineA sets the length of the bytes array to 32 bytes
lineB stores _input as the content of the first data slot of the bytes array

How do I get a reference to an Alchemy asm-declared variable into Flash?

I have a variable declared in Alchemy asm:
asm("var buffer:Vector.<Number> = new Vector.<Number>(100, true);");
I can populate it with data like so:
asm("buffer[%0] = %1;" : : "r"(index), "r"(value));
What I can't figure out is how to get a reference to that asm "buffer" variable into ActionScript.
(I did think of one way: throw the "buffer" from Alchemy asm and then catch it in ActionScript, but unfortunately that seems to leak a lot of memory.)
Is there a better alternative to doing this?
Please note that performance is critical, and the default Alchemy marshaling is way too slow.
asm is only for passing numbers back and forth, which means we'll have to use Alchemy's internal int-to-object mappings. Digging through the intermediate AS3 code (to see it, set the ACHACKS_TMPS environment variable to '1'), it seems that CTypemap.AS3ValType does the mapping. So you can return an asm-created object like this:
static AS3_Val alc_return_obj(void *self, AS3_Val args) {
    int len = 100;
    // create custom data in AS3
    asm("var as3Buffer:Vector.<Number> = new Vector.<Number>(%0, true);" : : "r"(len));
    // populate the vector with multiples of pi (just for fun)
    for (int idx = 0; idx < len; idx++) {
        double value = 3.14159265 * idx;
        asm("as3Buffer[%0] = %1;" : : "r"(idx), "r"(value));
    }
    // get a C reference to the AS3 object
    AS3_Val alcBuffer;
    asm("%0 = CTypemap.AS3ValType.createC(as3Buffer)[0];" : "=r"(alcBuffer));
    return alcBuffer;
}
Note: While this is fun hackery, it might not be the best way to solve this problem. This is probably not the fastest way to get data out of Alchemy and into Flash. For that, I suggest using a ByteArray to copy data into and out of Alchemy's RAM. See this SO question for some techniques in that area.

How to define an enum in AS3?

Is there a way to define an enum in AS3 the way we do it in other languages? I can define constants with explicit values like this:
private const CONST_1:int = 0;
private const CONST_2:int = 1;
private const CONST_3:int = 2;
and so on. If I want to insert another constant between these, I need to shift all the following values, like this:
private const CONST_1:int = 0;
private const CONST_2:int = 1;
private const CONST_2A:int = 2;
private const CONST_3:int = 3;
while in another language I would only need to add a new member to the enum, like this:
enum {
    CONST_1 = 0,
    CONST_2,
    CONST_2A,
    CONST_3
} MyConstEnum;
Does AS3 have something similar?
Thanks
No, AS3 doesn't have enums; you have to code them yourself. You can simulate them, for example with a class, if you want safer type checking.
public static var NUM_ENUM_VALUES:int = 0;
public static const EV_MONDAY:int = NUM_ENUM_VALUES++;
public static const EV_TUESDAY:int = NUM_ENUM_VALUES++;
public static const EV_WEDNESDAY:int = NUM_ENUM_VALUES++;
public static const EV_THURSDAY:int = NUM_ENUM_VALUES++;
You can take a look at the variety of variable types supported by the ActionScript Virtual Machine. Variable types are annotated by traits, the variety of which can be found in the specification, table 4.8.1:
4.8.1 Summary of trait types
The following table summarizes the trait types.
Type             Value
Trait_Slot       0
Trait_Method     1
Trait_Getter     2
Trait_Setter     3
Trait_Class      4
Trait_Function   5
Trait_Const      6
There is no Trait_Enum, and note that under the Trait_Const description only constants from the constant pool are allowed, so that would be:
signed integers
unsigned integers
doubles
strings
type names and vector types
Enums could be made of signed or unsigned integers, for example, but the virtual machine would not perform any type-safety checking of the operations which used those types. (E.g., the getlocal or coerce opcodes used would be getlocal_i and coerce_i, respectively.)
The ABC format doesn't have any built-in provision for enum types that I know of.
Using an object type for each enum value could work, especially if the compiler emits coerce instructions for that type prior to uses of getlocal and otherwise doesn't use the object other than in istype and astype variants. For example, calling setproperty or getproperty on the object would be slower than using an integer -- especially if that property is bound to a getter or setter method.
There are replacement styles which have been linked in other answers. To evaluate the runtime performance impact of these styles, you can use swfdump -D from the swftools open-source Flash tools collection.