How to display VMs sorted by memory when using xm top in SUSE 11 - suse

Just as df -Ph can show the used size of a mount in human-readable form, how can I get human-readable output when using xm top in SUSE 11?
Also, does anyone know how to display the VMs sorted by memory when using xm top in SUSE 11 or SUSE 12?
I'd appreciate any advice.

While xm top is running, press 's'; this cycles through the sort order. The currently sorted column has a bold header.
See https://linux.die.net/man/1/xentop
I don't think there is a built-in way to make the output human readable.
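What you can do is post-process the batch output: xm top just runs xentop, and xentop has a batch mode (xentop -b -i 1 prints one snapshot and exits), so a few lines of C (or awk) can reprint the memory column human-readable and sorted. Here's a rough sketch; it assumes MEM(k) is the fifth whitespace-separated column, so check that against your xentop version before trusting the numbers.

/* Rough sketch: pipe "xentop -b -i 1 | ./memsort".
 * Assumes the fifth whitespace-separated column is MEM(k) in KiB;
 * verify the column order against your xentop version. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct dom { char name[64]; unsigned long long mem_kib; };

static void human(unsigned long long kib, char *buf, size_t len)
{
    const char *unit[] = { "KiB", "MiB", "GiB", "TiB" };
    double v = (double)kib;
    int i = 0;
    while (v >= 1024.0 && i < 3) { v /= 1024.0; i++; }
    snprintf(buf, len, "%.1f %s", v, unit[i]);
}

static int by_mem_desc(const void *a, const void *b)
{
    const struct dom *x = a, *y = b;
    if (x->mem_kib < y->mem_kib) return 1;
    if (x->mem_kib > y->mem_kib) return -1;
    return 0;
}

int main(void)
{
    struct dom doms[256];
    size_t n = 0;
    char line[512];

    while (fgets(line, sizeof line, stdin) && n < 256) {
        char name[64], state[16];
        double cpu_sec, cpu_pct;
        unsigned long long mem_kib;

        /* Header lines and anything else that doesn't parse are skipped. */
        if (sscanf(line, "%63s %15s %lf %lf %llu",
                   name, state, &cpu_sec, &cpu_pct, &mem_kib) != 5)
            continue;
        strcpy(doms[n].name, name);
        doms[n].mem_kib = mem_kib;
        n++;
    }

    qsort(doms, n, sizeof doms[0], by_mem_desc);

    for (size_t i = 0; i < n; i++) {
        char pretty[32];
        human(doms[i].mem_kib, pretty, sizeof pretty);
        printf("%-20s %10s\n", doms[i].name, pretty);
    }
    return 0;
}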

Related

Trouble framing Modbus data captured from analyzer? (Saleae Logic 2)

I’ve tapped into an RS-485 bus and dumped some data. I’d like to implement a strawman/MITM device to intercept the traffic and send commands independent of the existing master/slave.
I captured some logic samples in Saleae Logic 2 at three different zoom levels (screenshots omitted here).
My problem is it doesn’t look like it’s framed properly. I’ve got the analyzer set to Modbus RTU master, 9600 baud, no parity bit, one stop bit.
Could anyone tell me if this looks correct or incorrect based on these screenshots? Is this Modbus protocol, or something different? Trying to see if I’m on the right path here. Thanks. Any additional info I’m happy to supply if requested.
I've tried several different ways of applying analyzers, trying to get a solid stream of info.
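One framing check you can run yourself on the raw bytes: every Modbus RTU frame ends with a CRC-16 (init 0xFFFF, reflected polynomial 0xA001) transmitted low byte first, so if you pull a candidate frame out of your capture and the last two bytes match the CRC of everything before them, you've almost certainly found a real frame boundary. Here's a rough C sketch; the bytes in main() are just a made-up read-holding-registers request, not from your capture.

/* Modbus RTU CRC-16: init 0xFFFF, reflected poly 0xA001,
 * CRC appended to the frame low byte first. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

static uint16_t modbus_crc16(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : crc >> 1;
    }
    return crc;
}

/* Returns 1 if the last two bytes are a valid CRC over everything before them. */
static int looks_like_rtu_frame(const uint8_t *frame, size_t len)
{
    if (len < 4)                       /* address + function + 2 CRC bytes minimum */
        return 0;
    uint16_t crc = modbus_crc16(frame, len - 2);
    return frame[len - 2] == (crc & 0xFF) && frame[len - 1] == (crc >> 8);
}

int main(void)
{
    /* Hypothetical example: read 2 holding registers at 0x0000 from slave 1.
     * Substitute bytes taken from your own capture. */
    uint8_t frame[] = { 0x01, 0x03, 0x00, 0x00, 0x00, 0x02, 0xC4, 0x0B };
    printf("CRC %s\n", looks_like_rtu_frame(frame, sizeof frame) ? "OK" : "bad");
    return 0;
}

If chunks of your capture never pass this check at any length, the bytes probably aren't being sampled with the right baud/parity settings, or it isn't Modbus RTU at all.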

Tesseract-ocr not accurate enough

I am running tesseract-ocr on my Raspberry Pi 2. I have been trying to create a book reader, and I have found that the output is completely garbled. Photo: link
I cannot get the output text file currently, but I can probably get it off my Pi later if it is required. It is basically just random characters at the start, then it works until about 2/3rds of the way down the page, then it is a vertical row of random characters, followed by a few more words that make a little bit of sense.
Does anybody have a fix for this? Or any recommendations for a better OCR that can read this type of image with 100% accuracy?
Thanks,
Connor.

PIC 18F and PORTB

Trying to get a 4x4 keypad working with a PIC 18f4685.
I've turned on the weak pull-ups and set the appropriate pins to input or output, but when I send a signal out I'm not getting it back on bits 6 and 7. I just get zeros...
I've tried to debug using the PICkit 3, but it seems that it uses RB7 and crashes things when a button for that row is pushed. Of course, that tells me the signals must be getting through, to a point.
Is there anything else in particular that I need to set up in configuration for PORT B?
As always...your help is greatly appreciated.
Since the EE site so rudely shut you down before you could get an answer, I figured I would come here to answer your question.
Check Table 10-3 on page 135; it lists all the capabilities of the PORTB pins. Note that RB6 and RB7 are also the debugging pins, so I wouldn't use those.
Also, are you writing to LATx and reading from PORTx? It's important to do this when reading and writing to the same port. If you read and write to PORTx, you can accidentally read a stale value from an output that has not had enough time to change yet, and your next write will obliterate your intended value. This is particularly pernicious on PICs that don't have a LATx register; any operation, even bit-wise operations like BSF/BCF, will do a read-modify-write of the ENTIRE port register, affecting more than the bit that you intended to modify. See the answer to this EE question: https://electronics.stackexchange.com/questions/28744/interfacing-a-keypad-with-a-microcontroller
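To make the LATx/PORTx point concrete, here's roughly what a 4x4 scan looks like in XC8 on a PIC18. The wiring is only an assumption (rows driven on RB0-RB3, columns read on RB4-RB7, keys pulling a column low against the weak pull-ups), and note that it reads columns on RB6/RB7, which you'd want to move off of if the debugger stays attached.

/* Rough 4x4 keypad scan sketch for a PIC18 in XC8.
 * Assumed wiring: rows on RB0-RB3 (outputs), columns on RB4-RB7 (inputs
 * with weak pull-ups), a pressed key pulls its column low. Adjust the
 * masks to your schematic; config pragmas (oscillator, WDT, LVP) omitted. */
#define _XTAL_FREQ 8000000UL   /* adjust to your clock */
#include <xc.h>

void keypad_init(void)
{
    ADCON1 = 0x0F;             /* PCFG3:0 = 1111 -> all A/D-capable pins digital */
    INTCON2bits.RBPU = 0;      /* enable PORTB weak pull-ups (/RBPU is active low) */
    TRISB = 0xF0;              /* RB0-RB3 outputs (rows), RB4-RB7 inputs (columns) */
    LATB = 0x0F;               /* idle: all rows high */
}

/* Returns a bitmask of pressed keys: bit index = row*4 + column. */
unsigned int keypad_scan(void)
{
    unsigned int pressed = 0;
    unsigned char row, col, cols;

    for (row = 0; row < 4; row++) {
        LATB = (unsigned char)(0x0F & ~(1u << row));  /* write via LATB: one row low */
        __delay_us(5);                                /* let the lines settle */

        cols = (unsigned char)(PORTB >> 4);           /* read via PORTB: column pins */
        for (col = 0; col < 4; col++)
            if (!(cols & (1u << col)))                /* low column = key down */
                pressed |= 1u << (row * 4 + col);
    }
    LATB = 0x0F;                                      /* back to idle */
    return pressed;
}

void main(void)
{
    keypad_init();
    while (1) {
        unsigned int keys = keypad_scan();
        if (keys) {
            /* act on the pressed key(s) here */
        }
    }
}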
I'm not sure of your exact schematic (a sketch might help), but a common error with PIC GPIO is not setting the ADC registers so the pins are digital inputs. They come out of reset as analog inputs.
Look at register description 19-2 in the PIC18F4685 Datasheet.
ADCON1 comes out of reset as 0x00. To set all the analog pins to digital I/O, PCFG3:PCFG0 all need to be set to 1 (0b1111):
ADCON1bits.PCFG = 0x0F;
Can you show us your code for setting the tri-state register (TRISB) and how you are reading the port? Have you checked the voltages at the input pins with a digital multimeter (DMM) before and during the button press? They are $10 and worth it.
Finally, did you disable the analog pins? On PIC24 chips you have to do:
AD1PCFG = 0xFFFF;
before digital input reads will work. Might be the same on your chip.
Can you give us the EXACT model number of your chip?
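For what it's worth, here is the kind of minimal bring-up test I'd flash to separate a configuration problem from a wiring problem on an 18F4685. With only the weak pull-ups acting on RB4-RB7, every input should read 1, and grounding a pin with a jumper should make it read 0; if RB6/RB7 never change, suspect the ICSP/debugger connection. The register, bit and config names below come from the XC8 device header, so double-check them against the datasheet.

/* Minimal PORTB bring-up sketch for a PIC18F4685 (XC8).
 * Other config pragmas (oscillator, WDT, LVP) are omitted for brevity;
 * set them for your board. */
#include <xc.h>

#pragma config PBADEN = OFF    /* PORTB<4:0> digital on reset (otherwise analog) */

#define _XTAL_FREQ 8000000UL   /* adjust to your oscillator */

void main(void)
{
    unsigned char rb;

    ADCON1 = 0x0F;             /* PCFG3:0 = 1111 -> all A/D pins digital   */
    INTCON2bits.RBPU = 0;      /* /RBPU = 0 -> PORTB weak pull-ups enabled */
    TRISB  = 0xF0;             /* RB0-RB3 outputs, RB4-RB7 inputs          */
    LATB   = 0x00;

    while (1) {
        rb = PORTB;            /* bits 4-7: pin states; bits 0-3: follow LATB */
        if ((rb & 0xC0) != 0xC0) {
            /* RB6 or RB7 reads low with nothing attached: pull-ups or
             * digital config not taking effect, or the debugger owns the pins. */
        }
        LATB ^= 0x0F;          /* toggle the outputs so you can probe them */
        __delay_ms(250);
    }
}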

Tesseract OCR - Handwritten font

I'm trying to use Tesseract-OCR to detect the text in images that contain nothing but text, but the text uses a handwritten font called Journal.
Example: (sample image not shown here)
The result is not the best:
Maxima! size` W (35)
Is there any way to improve the result, or even to get the exact result?
I am surprised Tesseract is doing so well. With a little bit of training you should be able to train the lower case 'l' to be recognised correctly.
The main problem you have is the top of the large T character. The horizontal line extends across 2 (possibly 3) other character cells and this would cause a problem for any OCR engine when it tries to segment the characters for recognition. Training may be able to help in this case.
The next problem is the . and : which are very light/thin and are possibly being removed with image pre-processing before the OCR even starts.
Overall the only chance to improve the results with Tesseract would be to investigate training. Here are some links which may help.
Alternative to Tesseract OCR Training?
Tesseract OCR Library learning font
Tesseract confuses two numbers
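If you do stay with Tesseract, it is also worth ruling out the cheap wins before training: feed it a larger image (thin strokes like the . and : survive the internal thresholding better after upscaling) and, if the text is as constrained as this label looks, restrict the character set. Here's a rough sketch with the Tesseract C API and Leptonica; the scale factor and the whitelist string are just examples, not tuned values, and whitelist handling varies between Tesseract versions.

/* Rough sketch: upscale the input and restrict the character set before
 * recognition, using the Tesseract C API plus Leptonica.
 * Build (paths vary): cc ocr.c -o ocr -ltesseract -llept */
#include <stdio.h>
#include <tesseract/capi.h>
#include <leptonica/allheaders.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s image.png\n", argv[0]);
        return 1;
    }

    PIX *img = pixRead(argv[1]);
    if (!img) {
        fprintf(stderr, "could not read %s\n", argv[1]);
        return 1;
    }

    /* Example preprocessing: a 3x upscale so thin glyphs keep more pixels. */
    PIX *big = pixScale(img, 3.0f, 3.0f);

    TessBaseAPI *api = TessBaseAPICreate();
    if (TessBaseAPIInit3(api, NULL, "eng") != 0) {
        fprintf(stderr, "could not initialise tesseract\n");
        return 1;
    }

    /* Example whitelist: only the characters you expect to appear. */
    TessBaseAPISetVariable(api, "tessedit_char_whitelist",
                           "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                           "abcdefghijklmnopqrstuvwxyz"
                           "0123456789:().- ");

    TessBaseAPISetImage2(api, big);
    char *text = TessBaseAPIGetUTF8Text(api);
    printf("%s", text ? text : "");

    TessDeleteText(text);
    TessBaseAPIEnd(api);
    TessBaseAPIDelete(api);
    pixDestroy(&big);
    pixDestroy(&img);
    return 0;
}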
As Andrew Cash mentioned, it'll be very hard to perform OCR on that T because it intersects several of the following characters.
To improve the results you may want to try a more accurate SDK. Have a look at ABBYY Cloud OCR SDK, a cloud-based OCR SDK recently launched by ABBYY. It's in beta, so for now it's totally free to use. I work at ABBYY and can provide you with additional info on our products if necessary. I've sent the image you attached to our SDK and got this response:
Maximal size: lall (35)

How would you go about reverse engineering a set of binary data pulled from a device?

A friend of mine brought up this question the other day. He recently bought a Garmin heart rate monitor which keeps track of his heart rate and allows him to upload his stats for a day to his computer.
The only problem is that there are no Linux drivers for the Garmin USB device. He's managed to interpret some of the data, such as the model number and his user details, and has identified what are essentially some binary data tables, which we assume represent a series of recordings of his heart rate and the time each recording was taken.
Where does one start when reverse engineering data when you know nothing about the structure?
I had the same problem and initially found this project at Google Code, which aims to complete a cross-platform version of tools for the Garmin devices ... see: http://code.google.com/p/garmintools/. There's a link on the front page of that project to the protocols you need, which Garmin was thoughtful enough to release publicly.
And here's a direct link to the Garmin I/O specification: http://www.garmin.com/support/pdf/IOSDK.zip
I'd start by looking at the data in a hexadecimal editor, hopefully a good one which knows the most common encodings (ASCII, Unicode, etc.), and then try to make sense of it using the data you know it has stored.
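If you'd rather script it than eyeball it in an editor, a bare-bones dump like the sketch below (offset, hex bytes, ASCII column) is enough to start spotting repeating record sizes and embedded strings.

/* Minimal hex dump: offset, 16 hex bytes, ASCII column.
 * Usage: ./hexdump capture.bin | less */
#include <stdio.h>
#include <ctype.h>

int main(int argc, char **argv)
{
    FILE *f;
    unsigned char buf[16];
    size_t n;
    unsigned long offset = 0;

    if (argc < 2 || !(f = fopen(argv[1], "rb"))) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }

    while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
        printf("%08lx  ", offset);
        for (size_t i = 0; i < sizeof buf; i++) {
            if (i < n)
                printf("%02x ", buf[i]);
            else
                printf("   ");               /* pad the last, short line */
        }
        printf(" |");
        for (size_t i = 0; i < n; i++)
            putchar(isprint(buf[i]) ? buf[i] : '.');
        printf("|\n");
        offset += n;
    }
    fclose(f);
    return 0;
}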
As another poster mentioned, reverse engineering can be hairy, not in practice but in legality.
That being said, you may be able to find everything related to your root question by checking out this project and its code... and they do handle the runner's heart rate/GPS combo data as well:
http://www.gpsbabel.org/
I'd suggest you start with checking the legality of reverse engineering in your country of origin. Most countries have very strict laws about what is allowed and what isn't regarding reverse engineering devices and code.
I would start by seeing what data is being sent by the device, then consider how such data could be represented and packed.
I would first capture many samples, and see if any pattern presents itself, since heart beat is something which is regular and that would suggest it is measurement related to the heart itself. I would also look for bit fields which are monotonically increasing, as that would suggest some sort of time stamp.
Having formed a hypothesis for what is where, I would write a program to test it and graph the results and see if it makes sense. If it does but not quite, then closer inspection would probably reveal you need some scaling factors here or there. It is also entirely possible that the data needs some processing before it looks anything like what their program is showing, e.g. I might need to integrate the data points. If I get garbage, then it is back to the drawing board :-)
I would also check the manufacturer's website, or maybe run strings on their binaries. Finding someone who works in the field of biomedical engineering would also be on my list, as they would probably know what protocols are typically used, if any. I would also look for these protocols and see if any could be applied to the data I am seeing.
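The hunt for monotonically increasing fields is easy to automate: pick a guessed record size, then for every byte offset inside the record check whether the 32-bit value there keeps climbing from record to record. A rough sketch is below; it assumes little-endian values and fixed-size records, both of which are just guesses you would vary between runs.

/* Scan a dump for fields that increase monotonically across fixed-size
 * records -- likely candidates for timestamps or sample counters.
 * Usage: ./monoscan capture.bin 16   (record size is your guess) */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

static uint32_t read_u32le(const unsigned char *p)
{
    return (uint32_t)p[0] | ((uint32_t)p[1] << 8) |
           ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24);
}

int main(int argc, char **argv)
{
    if (argc < 3) {
        fprintf(stderr, "usage: %s file record_size\n", argv[0]);
        return 1;
    }
    size_t rec = (size_t)strtoul(argv[2], NULL, 0);

    FILE *f = fopen(argv[1], "rb");
    if (!f || rec < 4)
        return 1;

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);

    unsigned char *data = malloc((size_t)size);
    if (!data || fread(data, 1, (size_t)size, f) != (size_t)size)
        return 1;
    fclose(f);

    size_t nrec = (size_t)size / rec;

    /* For each offset within the record, count how often the 32-bit value
     * there is >= the value in the previous record. */
    for (size_t off = 0; off + 4 <= rec; off++) {
        size_t rising = 0;
        for (size_t r = 1; r < nrec; r++) {
            uint32_t prev = read_u32le(data + (r - 1) * rec + off);
            uint32_t cur  = read_u32le(data + r * rec + off);
            if (cur >= prev)
                rising++;
        }
        if (nrec > 1 && rising == nrec - 1)
            printf("offset %zu looks monotonic across %zu records\n", off, nrec);
    }
    free(data);
    return 0;
}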
I'd start by creating a hex dump of the data. Figure it's probably blocked in some power-of-two-sized chunks. Start looking for repeating patterns. Think about what kind of data they're probably sending. Either they're recording each heart beat individually, or they're recording whatever the sensor is sending at fixed intervals. If it's individual beats, then there's going to be a time delta (since the last beat), a duration, and a max or avg strength of some sort. If it's fixed intervals, then it'll probably be a simple vector of readings. There'll probably be a preamble of some sort, with a start timestamp and the sampling rate. You can try decoding the timestamp yourself, or you might try simply feeding it to ctime() and see if they're using standard absolute time format.
Keep in mind that lots of cheap A/D converters only produce 12-bit outputs, so your readings are unlikely to be larger than 16 bits (and the high-order 4 bits may be used for flags). I'd recommend resetting the device so that it's "blank", dumping and storing the contents, then taking a set of readings, recording the results (whatever the device normally reports), then dumping the contents again and trying to correlate the recorded results with whatever data appeared after the "blank" dump.
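To try the ctime() idea directly, you can slide a 4-byte window over the dump, treat each value as a Unix time_t, and print only the ones that land in a plausible date range; real timestamps tend to stand out because they cluster around the day of the recording. The sketch below assumes little-endian 32-bit Unix seconds, which is only a guess (some device formats count from a different epoch or use the other byte order).

/* Slide a 4-byte window over a dump and print values that decode to a
 * plausible Unix timestamp (here 2005-2015; adjust to taste).
 * Usage: ./tscan capture.bin */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f)
        return 1;

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);

    unsigned char *data = malloc((size_t)size);
    if (!data || fread(data, 1, (size_t)size, f) != (size_t)size)
        return 1;
    fclose(f);

    const uint32_t lo = 1104537600u;   /* 2005-01-01 00:00 UTC */
    const uint32_t hi = 1420070400u;   /* 2015-01-01 00:00 UTC */

    for (long off = 0; off + 4 <= size; off++) {
        uint32_t v = (uint32_t)data[off]             |
                     ((uint32_t)data[off + 1] << 8)  |
                     ((uint32_t)data[off + 2] << 16) |
                     ((uint32_t)data[off + 3] << 24);
        if (v >= lo && v <= hi) {
            time_t t = (time_t)v;
            printf("offset 0x%lx: %s", (unsigned long)off, ctime(&t)); /* ctime appends \n */
        }
    }
    free(data);
    return 0;
}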
Unsure if this is what you're looking for but Garmin has created an API that runs with your browser. It seems OSX is supported, as well as Windows browsers... I would try it from Google Chromium to see if it can be used instead of this reverse engineering...
http://developer.garmin.com/web-device/garmin-communicator-plugin/
API Features

- Auto-detection of devices connected to a computer
- Access to device product information like product name and software version
- Read tracks, routes and waypoints from supported recreational, fitness and navigation devices
- Write tracks, routes and waypoints to supported recreational, fitness and navigation devices
- Read fitness data from supported fitness devices
- Geo-code address and save to a device as a waypoint or favorite
- Read and write Garmin XML files (GPX and TCX) as well as binary files
- Support for most Garmin devices (USB, USB mass-storage, most serial devices)
- Support for Internet Explorer, Firefox and Chrome on Microsoft Windows
- Support for Safari, Firefox and Chrome on Mac OS X
Can you synthesize a heart beat using something like a computer speaker? (I have no idea how such devices actually work). Watch how the binary results change based on different inputs.
Ripping apart the device and checking out what's inside would probably help too.