I converted my Caffe model to IR successfully, but an error happened when I tried to convert the IR to PyTorch:
Pytorch Emitter has not supported operator [PRelu]
How should I deal with that please?
Yes, MMdnn supports LeakyRelu. Check the link below for the pytorch_emitter.py implementation from MMdnn.
pytorch_emitter.py
If you check the implementation, you will find all the supported operations, and PRelu is not among them.
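Since the emitter covers LeakyRelu but not PRelu, one possible workaround (my own suggestion, not something MMdnn provides) is to swap the PReLU layers for LeakyReLU before converting, at the cost of losing the learnable slope. A minimal PyTorch sketch of why that approximation holds, assuming a slope of 0.1:

import torch
import torch.nn as nn

# A PReLU whose (single) learned slope is 0.1 behaves exactly like LeakyReLU(0.1);
# replacing PRelu with LeakyRelu before conversion only gives up the learnable,
# per-channel slope.
x = torch.randn(4)

prelu = nn.PReLU(num_parameters=1, init=0.1)  # one learnable slope, initialised to 0.1
leaky = nn.LeakyReLU(negative_slope=0.1)      # fixed slope of 0.1

print(prelu(x))
print(leaky(x))  # same output as long as the PReLU slope stays at 0.1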
Apologies in advance for a perhaps stupid question. Is it possible to integrate into the Chisel flow a Scala script that generates timing constraint specifications (SDC) for a given design? E.g. press a button and you get your Chisel design converted to Verilog along with an SDC file, ready for synthesis.
I currently have such a toolflow in place for VHDL (using Python to generate the constraint files). But in VHDL the naming conventions are quite clear; I'm not so sure about the Chisel backend (and I couldn't find any reference on the web to anyone doing this).
Is this possible, or is this just not how Chisel was intended to be used?
Thanks in advance!
Chisel has an annotation system for tracking and linking against signals in the emitted Verilog. I've described this system in a previous answer here on Stack Overflow: Chisel: getting signal name in final Verilog
There is existing work that leverages this support to build physical design flows; see Hammer, which is used by Chipyard.
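As a very rough illustration (the module and signal names below are invented, and this is nowhere near a full SDC flow), marking a signal with dontTouch emits one such annotation, so the signal survives into the emitted Verilog under a predictable name that a constraint-generating script can target:

import chisel3._

// Sketch only: dontTouch produces a DontTouchAnnotation, which keeps the signal
// from being optimised away so an SDC generator can refer to it by name.
class Divider extends Module {
  val io = IO(new Bundle {
    val slowTick = Output(Bool())
  })

  val counter = RegInit(0.U(8.W))
  counter := counter + 1.U
  io.slowTick := counter === 255.U

  dontTouch(counter) // keep the counter register visible to the constraint generator
}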
I need to run a repo that contains deprecated cuDNN functions (cudnnGetConvolutionForwardAlgorithm, cudnnGetConvolutionBackwardFilterAlgorithm, cudnnGetConvolutionBackwardDataAlgorithm). I am on cuDNN 8.0 at the moment.
I know there are working versions of these functions: cudnnGetConvolutionForwardAlgorithm_v7, cudnnGetConvolutionBackwardFilterAlgorithm_v7, etc., but their parameters and return types are different.
Do you have any advice on how I could convert from the deprecated version to the working version?
I wouldn't convert to the _v7 functions, because you'll have to convert again when support for them is removed altogether in a future version of the cuDNN library. I'd be inclined to use the cudnnFind... functions, details of which are in the cuDNN API Reference. Since they actually test the possible algorithms and tell you the fastest one, they may give your network better performance than the heuristic used by cudnnGet..., which only probably picks the fastest one. There is an additional computational cost from calling cudnnFind... when you create the network, but none when you run it. I haven't tested how much extra time it takes, but I can't imagine it's anything noticeable.
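For the forward pass, a replacement might look roughly like the sketch below (the descriptors are assumed to be set up elsewhere, and pickForwardAlgo is just an illustrative name; the backward-filter and backward-data cases follow the same pattern with cudnnFindConvolutionBackwardFilterAlgorithm and cudnnFindConvolutionBackwardDataAlgorithm):

#include <cudnn.h>
#include <stdio.h>

cudnnConvolutionFwdAlgo_t pickForwardAlgo(cudnnHandle_t handle,
                                          cudnnTensorDescriptor_t xDesc,
                                          cudnnFilterDescriptor_t wDesc,
                                          cudnnConvolutionDescriptor_t convDesc,
                                          cudnnTensorDescriptor_t yDesc)
{
    const int requested = CUDNN_CONVOLUTION_FWD_ALGO_COUNT;  // ask about every algorithm
    int returned = 0;
    cudnnConvolutionFwdAlgoPerf_t perf[CUDNN_CONVOLUTION_FWD_ALGO_COUNT];

    // Benchmarks each algorithm against the real descriptors; the results come
    // back sorted by execution time, fastest first.
    cudnnStatus_t status = cudnnFindConvolutionForwardAlgorithm(
        handle, xDesc, wDesc, convDesc, yDesc, requested, &returned, perf);

    if (status != CUDNN_STATUS_SUCCESS || returned == 0) {
        fprintf(stderr, "cudnnFindConvolutionForwardAlgorithm failed\n");
        return CUDNN_CONVOLUTION_FWD_ALGO_IMPLICIT_GEMM;  // conservative fallback
    }
    return perf[0].algo;  // fastest algorithm that ran successfully
}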
I am trying to use the TensorBoard callback in Keras. When I run the pretrained InceptionV3 model with the TensorBoard callback, I get the following warning:
INFO:tensorflow:Summary name conv2d_95/kernel:0 is illegal; using conv2d_95/kernel_0 instead.
I saw a comment on GitHub addressing this issue. SeaFX pointed out in his comment that he solved it by replacing variable.name with variable.name.replace(':','_'). I am unsure how to do that. Can anyone please help me? Thanks in advance :)
I'm not sure about getting the name replacement to work; however, a workaround that may be sufficient for your needs is:
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.WARN)  # show warnings and errors, but suppress INFO messages
import keras
This will turn off all INFO level logging but keep warnings, errors etc.
See this question for a discussion of the various log levels and how to change them. Personally, I found that setting the TF_CPP_MIN_LOG_LEVEL environment variable didn't work under a Jupyter notebook, but I haven't tested it in base Python.
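For reference, the environment-variable approach mentioned above looks like the following; it has to be set before TensorFlow is imported:

import os
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "1"  # 1 hides INFO from the C++ backend, keeps warnings and errors

import tensorflow as tf
import keras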
How do I resolve this deprecation warning in Scala?
I get a warning from JSON usage in my project:
Object JSON in package json is deprecated. This object will be
removed.
import scala.util.parsing.json._
JSON.parseRaw("""[{"a":"b"},{"c":"d"}]""")
Usually, this means a piece of functionality has been superseded by another implementation whose use is preferred over the old one, and a question like this simply means the OP is too lazy to google the docs. This is especially true for libraries in the Java world, which treats backward compatibility very seriously (to the point that it becomes a pain for some). The Scala ecosystem is not so strict in this regard, and upgrading to a newer version of the language can mean a different API or even binary incompatibilities. See also Scala: binary incompatibility between releases. This is not a dig at Scala; there are good reasons these incompatibilities exist.
However, I must admit that the documentation for scala.util.parsing.json does not contain any information regarding the recommended replacement for this functionality whatsoever. It took me quite a while to dig up something that just barely resembles a clear statement of what the recommended replacement is.
There seems to have been a lot of discussion in the community about the point and repercussions of this deprecation. I recommend reading this thread in the scala-users group if you're interested.
The most quoted reasons for this deprecation seem to be around poor performance and thread safety.
The deprecation was done as part of this Jira issue and the use of different parsers is recommended in the closing comment of this related task that was not completed due to the deprecation.
Alternatives include:
play-json
spray-json
argonaut
jackson
rapture-json (which allows you to choose between different implementations)
To answer your question: this is a warning, so your code should not break until this object is actually removed. However, if new bugs are found in this functionality, they most likely aren't going to be fixed. Your code can also break if you upgrade to a newer version of Scala that actually has those packages removed (version 2.11.0 and above, according to the documentation).
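As a concrete illustration of moving to one of the alternatives listed above, here is a rough sketch with play-json (assuming its dependency is on the classpath; the choice of library and the conversion to plain Scala collections are just for illustration):

import play.api.libs.json._

// Parse the same array of single-pair objects with play-json.
val parsed: JsValue = Json.parse("""[{"a":"b"},{"c":"d"}]""")

// Built-in Reads instances turn it into ordinary Scala collections.
val asMaps: Seq[Map[String, String]] = parsed.as[Seq[Map[String, String]]]
// asMaps == List(Map("a" -> "b"), Map("c" -> "d"))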
The answer previously provided by @toniedzwiedz is very complete and describes the whole story around the question.
I just had the same issue using Scala 2.11 and solved it by adding the dependency found in this repository.
In particular, for Scala 2.11 it is:
<dependency>
    <groupId>org.scala-lang.modules</groupId>
    <artifactId>scala-parser-combinators_2.11</artifactId>
    <version>1.1.0</version>
</dependency>
Then you will not have the warning.
Also consider using Lift JSON as an alternative:
https://github.com/lift/lift/tree/master/framework/lift-base/lift-json/
The JSON parser in the Scala standard library is deprecated. You should pick one of the more robust third-party libraries, like Jackson, Play-Json, json4s, etc.
Has anyone started to work with the CUDA5 SDK?
I have an old project that uses some cutil functions, but they've been abandoned in the new one.
The solution was that most functions can be translated from cutil*/cut* to a similarly named sdk* equivalent from the helper*.h headers...
As an example:
cutCreateTimer becomes sdkCreateTimer, cutStartTimer becomes sdkStartTimer, and so on.
Just that simple...
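For the timer calls specifically, a rough before/after sketch (the cutil names in the comments are from memory of the old SDK) might look like this:

#include <cstdio>
#include <helper_timer.h>  // from CUDA Samples\v5.0\C\common\inc

void timeKernelLaunches()
{
    StopWatchInterface *timer = NULL;  // cutil used an unsigned int handle here
    sdkCreateTimer(&timer);            // was: cutCreateTimer(&timer)
    sdkStartTimer(&timer);             // was: cutStartTimer(timer)

    // ... launch the kernels being measured ...

    sdkStopTimer(&timer);              // was: cutStopTimer(timer)
    printf("elapsed: %.3f ms\n", sdkGetTimerValue(&timer));  // was: cutGetTimerValue(timer)
    sdkDeleteTimer(&timer);            // was: cutDeleteTimer(timer)
}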
Has anyone started to work with the CUDA5 SDK?
Probably.
Has anyone translated some cutil definitions to CUDA5?
Maybe. But why not just use the new header files intended to replace it? Quoted from the Beta release notes:
Prior to CUDA 5.0, CUDA Sample projects referenced a utility library
with header and source files called cutil. This has been removed with
the CUDA Samples in CUDA 5.0, and replaced with header files found
in CUDA Samples\v5.0\C\common\inc
helper_cuda.h, helper_cuda_gl.h, helper_cuda_drvapi.h, helper_functions.h,
helper_image.h, helper_math.h, helper_string.h, and helper_timer.h
These files provide utility functions for CUDA device initialization,
CUDA error checking, string parsing, image file loading and saving, and
timing functions. The CUDA Samples projects no longer have references
and dependencies to cutil, and will now use these helper functions
going forward.