I want to install and communicate with my own USB device on a Raspberry Pi running Windows IoT.
For that, I created an INF file for ARM to install the WinUSB driver on Windows 10 IoT. The device is recognized by the Raspberry Pi (it appears in the list of connected devices at startup). Now I want to communicate with the device.
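For reference, a WinUSB INF for an ARM target generally follows the standard template, roughly like the sketch below (the VID/PID, the interface GUID and the names are placeholders, not the real device's values):

[Version]
Signature   = "$Windows NT$"
Class       = USBDevice
ClassGUID   = {88BAE032-5A81-49f0-BC3D-A4FF138216D6}
Provider    = %ProviderName%
DriverVer   = 01/01/2016,1.0.0.0
CatalogFile = MyUsbDevice.cat

[Manufacturer]
%ProviderName% = MyUsbDevice, NTarm

[MyUsbDevice.NTarm]
%DeviceName% = USB_Install, USB\VID_XXXX&PID_XXXX

[USB_Install]
Include = winusb.inf
Needs   = WINUSB.NT

[USB_Install.Services]
Include = winusb.inf
Needs   = WINUSB.NT.Services

[USB_Install.HW]
AddReg = Dev_AddReg

[Dev_AddReg]
; Placeholder interface GUID; client code opens the device through this GUID.
HKR,,DeviceInterfaceGUIDs,0x10000,"{00000000-0000-0000-0000-000000000000}"

[Strings]
ProviderName = "MyProvider"
DeviceName   = "My WinUSB Device"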
First, I tried with "winusb.dll", but that requires the "SetupApi" library, which does not compile for ARM. (I used this solution on a Windows PC and it communicates correctly with the device.)
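The desktop solution in question is roughly the usual SetupAPI enumeration followed by WinUsb_Initialize, something like the C++ sketch below (placeholder interface GUID, most error handling trimmed, links against setupapi.lib and winusb.lib):

#include <windows.h>
#include <initguid.h>
#include <setupapi.h>
#include <winusb.h>

// Placeholder: must match the DeviceInterfaceGUIDs value from the INF.
DEFINE_GUID(GUID_DEVINTERFACE_MY_DEVICE,
    0x00000000, 0x0000, 0x0000, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00);

bool OpenWinUsbDevice(HANDLE &file, WINUSB_INTERFACE_HANDLE &usb)
{
    // Find the device interface exposed by the WinUSB driver.
    HDEVINFO set = SetupDiGetClassDevs(&GUID_DEVINTERFACE_MY_DEVICE, nullptr, nullptr,
                                       DIGCF_PRESENT | DIGCF_DEVICEINTERFACE);
    SP_DEVICE_INTERFACE_DATA iface = { sizeof(iface) };
    if (!SetupDiEnumDeviceInterfaces(set, nullptr, &GUID_DEVINTERFACE_MY_DEVICE, 0, &iface))
        return false;

    // Get the device path, then open it for WinUSB (overlapped I/O is required).
    DWORD size = 0;
    SetupDiGetDeviceInterfaceDetail(set, &iface, nullptr, 0, &size, nullptr);
    auto detail = (PSP_DEVICE_INTERFACE_DETAIL_DATA)malloc(size);
    detail->cbSize = sizeof(SP_DEVICE_INTERFACE_DETAIL_DATA);
    SetupDiGetDeviceInterfaceDetail(set, &iface, detail, size, nullptr, nullptr);

    file = CreateFile(detail->DevicePath, GENERIC_READ | GENERIC_WRITE,
                      FILE_SHARE_READ | FILE_SHARE_WRITE, nullptr,
                      OPEN_EXISTING, FILE_FLAG_OVERLAPPED, nullptr);
    free(detail);
    SetupDiDestroyDeviceInfoList(set);
    return file != INVALID_HANDLE_VALUE && WinUsb_Initialize(file, &usb);
}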
Do you have an idea on how to communicate correctly with a WinUSB device on Windows IoT?
Thanks in advance for your answers.
So I tried to migrate to Windows.Devices.Usb, but I don't succeed in connecting to the device. There is an exception when I call the FromIdAsync() function. It's exactly the same issue as: Can't access USB device in Universal App.
There are some fantastic samples that can be found here.
https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/CustomUsbDeviceAccess
Download all the samples. Check where it creates watchers based on specific devices (there are two in there; you'd need to add your own). The page that displays USB descriptors checks on the device type -- add in a check so that it returns DeviceType.all, and see if it can query the descriptors.
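Outside of the sample's watcher plumbing, the basic open-device call with Windows.Devices.Usb looks roughly like this C++/WinRT sketch (the VID/PID are placeholders, and the device also has to be declared under a usb DeviceCapability in the package manifest, otherwise FromIdAsync typically fails with an access error):

#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Foundation.Collections.h>
#include <winrt/Windows.Devices.Enumeration.h>
#include <winrt/Windows.Devices.Usb.h>

using namespace winrt;
using namespace Windows::Devices::Enumeration;
using namespace Windows::Devices::Usb;

fire_and_forget OpenMyDeviceAsync()
{
    // Placeholder VID/PID; builds an AQS selector for the WinUSB-exposed device.
    hstring selector = UsbDevice::GetDeviceSelector(0x1234, 0x5678);

    DeviceInformationCollection found =
        co_await DeviceInformation::FindAllAsync(selector);
    if (found.Size() == 0)
        co_return; // nothing enumerated: likely a driver/INF problem, not app code

    // Fails if the package manifest lacks the matching DeviceCapability entry.
    UsbDevice device = co_await UsbDevice::FromIdAsync(found.GetAt(0).Id());
}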
I've managed to get it working on Windows 10 desktop, but have failed to craft an INF file that I can use on my IoT device. Once I get that working, I may return.
I'm trying to use a USB Barcode Scanner on Windows 10 in Chrome v73.0.3683.86 via WebUSB.
The scanner is a Honeywell Voyager 1250g.
I can see the device via the device dialog - I can also open it and select a configuration.
However, when I try to claim interface 1 (there are 3 interfaces, but interface 1 is the bulk transfer one), I get the error "Failed to claim interface: Operation not supported or unimplemented on this platform" in chrome://device-log/.
Is there a way around this, or is this scanner just not usable via WebUSB? Thanks!
Have you tried connecting to this device using WebUSB on other platforms? Windows has a particular additional requirement for applications (like Chrome) to access USB devices, which is that the WinUSB.sys driver must be loaded for the interface.
I've written an article explaining the particular requirements on Windows here: https://developers.google.com/web/fundamentals/native-hardware/build-for-webusb/#windows
If you use the Windows Device Manager you can check which drivers are loaded for your device. If there is no driver loaded then you may be able to write a custom INF file as described in that article to instruct Windows to load the driver you want.
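One detail worth noting for a composite device like this scanner: WinUSB has to be bound to the specific interface you want to claim, so the INF's models section matches the interface hardware ID rather than the whole device, for example (placeholder VID/PID):

%DeviceName% = USB_Install, USB\VID_XXXX&PID_XXXX&MI_01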
I sort of makeshift-followed this guide on how to set up remote debugging. Since I am using Adobe Animate to compile my app, I assume it has done the majority of the build steps already, as I get a screen similar to the one described.
I don't understand, though. I have port forwarding set up on my router so that it goes to my PC, and TCP port 7935 is open. Windows Firewall on or off doesn't seem to make a difference; it even prompted me to allow or deny fdb after I ran it. I still can't get my phone to connect via remote debugging. I want to be able to send this to my client, who is having an issue with the app, so I can see what's going on under the hood instead of relying on a giant pile of try/catch statements and screenshots. Any help?
I tried a dummy domain and it seems to know that it can't connect to it. When I try my domain or my IPv4 address, it doesn't let me connect; it just freezes up the app.
I don't know whether it works in Animate CC or not, but it works via Flash Builder. I'm using a real Android device and I have the Android SDK tools installed on my PC.
Yes, I have followed that tutorial from the official Adobe docs, but it doesn't work.
First: Simply connect your device to your PC
Actually, you can debug your app remotely as long as your device is connected to your PC. This step doesn't necessarily require FDB.
In my case, all I needed was something like:
adb connect 192.168.xx.xx:port
This will connect your Android device to your PC over your local network.
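For example, with the phone first attached over USB (5555 is adb's usual TCP port; use your device's actual IP address):

adb tcpip 5555
adb connect 192.168.xx.xx:5555
adb devices

The last command should list the device by its IP address once the network connection is established.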
Second, set the debug setting to debug over the network
You've done that in Animate CC; in addition, you might want to check "install application on the connected device".
Third, just debug as usual
You can get all the debugging output, including traces.
I am using the ShareMediaTask in my Windows Phone 8 application, and I am trying to determine the specific capabilities and requirements it places on a user's device. Does a user's Data Connection have to be turned on for the ShareMediaTask to successfully send a picture? Also, is it required to have ID_CAP_NETWORKING checked in my application's WMAppManifest for ShareMediaTask to work? Must both of these be on?
What I have is ID_CAP_NETWORKING off in my WMAppManifest, which I do not believe is required for ShareMediaTask, but my device's Data Connection switched to On in the phone settings. Is this correct?
ShareMediaTask does not have any capability requirements for inclusion within your app.
However, for the user to actually share the media, they will need to be able to connect to the internet and to the selected services. This happens outside of your app, though, so your app does not need to declare the networking capability directly.
For reference, the rules of capability requirements are defined in
"C:\Program Files (x86)\Microsoft SDKs\Windows Phone\v8.0\Tools\Marketplace\Rules.xml"
I have a set of CUDA apps that all write to the console via cout. I have a host machine with Visual Studio and the Nsight plug-in, and a target machine running the Nsight Monitor. However, when I execute the console app, it actually runs on the target machine (it literally pops up a console there).
So here's the question: how can I get the console to show up on the host and only the GPU stuff to execute on the target? Is this even possible?
Thanks!
The short answer is that it is currently not possible. The application on the target is executed by the Nsight Monitor process, but Nsight Monitor currently does not forward the output back to the host.
Currently, your only option is to take care of it yourself by capturing the output of your application on the target and somehow displaying it on the host.
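One low-tech way to do that capture, for example, is to mirror cout into a file on the target that you can then read from the host (over a network share, for instance); a minimal C++ sketch, with a placeholder log path:

#include <fstream>
#include <iostream>

int main()
{
    // Placeholder path on the target machine.
    std::ofstream log("C:\\Temp\\cuda_app_output.log");

    // Redirect std::cout into the log file for the duration of the run.
    std::streambuf *previous = std::cout.rdbuf(log.rdbuf());

    std::cout << "launching kernels..." << std::endl;
    // ... launch CUDA kernels and print results as usual ...

    std::cout.rdbuf(previous); // restore std::cout before exiting
    return 0;
}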
If this feature is important to you, I suggest you file a feature request via your NVIDIA developer account.
The CUDA application runs entirely on the target machine, so the console or UI for the application will be seen on the target machine only. You can set breakpoints in the GPU code on the VS side (your host machine), and it should break there.
If you feel the application quits too quickly and is not launching the kernels as expected (and you are not hitting the breakpoints), it may be that you have not deployed all the required DLLs on the target machine (e.g. CUDART).
Does QEMU provide emulation for any target with a USB device controller? I am developing an embedded Linux based device and was thinking about testing it on QEMU.
BR,
Mooni
You can find this information in the QEMU Manual, section "3.9 USB emulation":
QEMU emulates a PCI UHCI USB controller. You can virtually plug virtual USB devices or real host USB devices (experimental, works only on Linux hosts). Qemu will automatically create and connect virtual USB hubs as necessary to connect multiple USB devices.
There you also find all relevant configuration parameters.
The USB controller is provided for the following targets:
PC system
MIPS/Malta (has a PIIX4 PCI/USB/SMBus controller)
ARM (has various options)
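On a target that has one of those controllers, attaching emulated or pass-through USB devices looks roughly like this with a reasonably recent QEMU (the vendor/product IDs below are placeholders):

qemu-system-x86_64 -usb -device usb-mouse
qemu-system-x86_64 -usb -device usb-host,vendorid=0x1234,productid=0x5678

The -usb switch enables the on-board USB host controller on machine types where it is not enabled by default; older QEMU versions use the -usbdevice option instead of -device.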
I'm looking for this as well. There is a bit of code for virtual USB devices in the repo, but I'm specifically looking for a way to write a DCD (device controller driver) which can be accessed as a virtual device from the host running the QEMU simulation. I'm looking for a way to implement this for the STM32 family.