Use parsnip to call glm.nb - tidymodels

I'm looking for a wrapper to call negative binomial regression (glm.nb from the MASS package), similar to poissonreg, available in the tidymodels environment. Is there a way to call this?

Related

How to write nested fsm in Chisel like SpinalHDL does?

I've found that SpinalHDL has some powerful and elegant ways to describe a rather complicated FSM. I wonder if it's possible to construct an FSM library (like SpinalHDL has done) in Chisel?
P.S. I'm not very familiar with Scala.
Chisel does not currently have an FSM library. We normally write state machines explicitly, using a ChiselEnum for the state register.
If you like the API that Spinal has but need/want to use Chisel, it should not be too hard to build a similar API on top of Chisel. Chisel is designed to let you add your own favorite abstractions by writing your own Scala library code.

Extract intermediate representation of MiDaS neural network in pytorch?

The PyTorch documentation provides a concise way to apply the MiDaS monocular depth estimation network for depth extraction. But how should I modify their code to extract the network's representation at some intermediate layer? I know that I could download the model from GitHub and modify the forward function to return what I want, but I am interested in the simplest solution that leaves the outer code as is.
I'm aware of subclassing the model class and writing my own forward function, like here, but I don't know how to access the class in the code. The model instance is created straight away with midas = torch.hub.load("intel-isl/MiDaS", model_type). Maybe an example of using a forward hook would be easier.
As you said, using a forward hook on a nn.Module is the easiest way to go about it. Consider the documentation: https://pytorch.org/docs/stable/generated/torch.nn.Module.html#torch.nn.Module.register_forward_hook
Basically you just have to define a function that takes three inputs (module, input, output) and then does whatever you want with that data. To find the module at which you want to place that hook, you obviously need to be familiar with the structure of the model. You can just print(midas) to get a pretty-printed representation of all the modules available. I just chose a random one and used the print() function as a hook:
midas.pretrained.model.blocks[3].mlp.fc2.register_forward_hook(print)
This means that whenever we call midas(some_input), the hook (print in this case) will be called with the corresponding arguments. Of course, instead of print you can write a function that saves those outputs to, e.g., a list that you can access from the outside, or writes them to a file, etc.
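For instance, here is a minimal sketch along those lines, with a hook that stores the layer outputs in a list; the model type and the dummy input size are assumptions made for illustration.

import torch

# Load MiDaS from PyTorch Hub, as in the hub example
model_type = "DPT_Large"  # assumed model type for this sketch
midas = torch.hub.load("intel-isl/MiDaS", model_type)
midas.eval()

# Intermediate activations end up here
features = []

def save_output(module, inputs, output):
    # Runs every time the hooked submodule finishes a forward pass
    features.append(output.detach())

# Same submodule path as used above
handle = midas.pretrained.model.blocks[3].mlp.fc2.register_forward_hook(save_output)

# Any forward pass now triggers the hook; a random tensor is enough to demonstrate
dummy = torch.randn(1, 3, 384, 384)  # assumed input size for this model type
with torch.no_grad():
    midas(dummy)

print(len(features), features[0].shape)

handle.remove()  # detach the hook once you no longer need it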

Extending embedded Python in C++ - Design to interact with C++ instances

There are several packages out there that help automate the task of writing bindings between C/C++ and other languages.
In my case, I'd like to bind Python; some options for such packages are SWIG, Boost.Python, and Robin.
It seems that the straightforward process is to use these packages to create C/C++ linkable libraries (with mostly static functions) and have the higher-level language be extended with them.
However, my situation is that I already have a working system developed in C++, and I therefore plan to embed Python into it so that future development will be done in Python.
It's not clear to me how, if it is possible at all, to use these packages to extend embedded Python in such a way that the Python code would be able to interact with the various singleton instances already running in the system, and to instantiate C++ classes and interact with them.
What I'm looking for is an insight regarding the design best fitted for this situation.
Boost.python lets you do a lot of those things right out of the box, especially if you use smart pointers. You can even inherit from C++ classes in Python, then pass instances of those back to your C++ code and have everything still work. My favorite resource on how to do various stuff is this (especially check out the "How To" section): http://wiki.python.org/moin/boost.python/ .
Boost.python is especially good if you're using smart pointers or intrusive pointers, as those translate transparently into PyObject reference counting. Also, it's very good at making factory functions look like Python constructors, which makes for very clean Python APIs.
If you're not using smart pointers, it's still possible to do all the things you want, but you have to mess with various return and lifetime policies, which can give you a headache.
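To give a feel for what the Python side of that can look like, here is a hypothetical sketch. It assumes the C++ application exposes a Boost.Python module named mysystem with a Task base class (whose virtual run() the C++ side calls back into) and a Scheduler singleton accessor; these names are invented for illustration, not taken from the question.

# Hypothetical script executed by the embedded interpreter.
# Assumes a Boost.Python module "mysystem" exposing:
#   - Task: a C++ base class with a virtual run() method
#   - Scheduler.instance(): returns the already-running C++ singleton
#   - Scheduler.submit(task): stores the task and later calls task.run() from C++
import mysystem

class PyTask(mysystem.Task):
    # A Python subclass of the C++ base class
    def run(self):
        print("running from Python")

scheduler = mysystem.Scheduler.instance()  # talk to the existing singleton
scheduler.submit(PyTask())                 # hand a Python-defined instance back to C++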
To make it short: there is the modern alternative pybind11.
Long version: I also had to embed Python. My C++/Python interface is small, so I decided to use the C API. That turned out to be a nightmare. Exposing classes makes you write tons of complicated boilerplate code. Boost.Python largely avoids this by using readable interface definitions. However, I found that Boost lacks thorough documentation, and for some things you still have to call the Python C API. Furthermore, its build system seems to give people trouble; I can't tell, since I use the packages provided by my system. Finally, I tried the Boost.Python fork pybind11 and have to say that it is really convenient. It fixes several shortcomings of Boost, such as the need to fall back to the Python C API, the inability to use lambdas, the lack of easily comprehensible documentation, and the missing automatic exception translation. Moreover, it is header-only and does not pull in the huge Boost dependency at deployment, so I can definitely recommend it.

Loading Elixir/SQLAlchemy models in .NET?

A new requirement has come down from the top: implement 'proprietary business tech' with the awesome, resilient Elixir database I have set up. I've tried a lot of different things, such as creating an implib from the provided interop DLL (which apparently doesn't work the way COM DLLs do), but that didn't work at all. CPython doesn't like the MFC stuff either, so all attempts to create a Python lib have failed (using C, anyway; I'm not sure you can create a Python library from .NET directly).
The only saving grace is that the developer saw fit to provide VBA, .NET, and MFC Interop C++ hooks into his library, so there are "some" choices, though they all ultimately lead back to the same framework. What would be the best method to:
A) Keep my model definitions in one place, in one language (Python/Elixir/SQLAlchemy)
B) Have this new .NET code access the models without resorting to brittle, hard-coded SQL.
Any and all suggestions are welcome.
After a day or so of deliberation, I'm attempting to load the new business module in IronPython. Although I don't really want to introduce two Python interpreters into my environment, I think this will be the glue I need to get this done efficiently.
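For reference, loading a .NET assembly from IronPython looks roughly like the sketch below; the assembly and class names are invented for illustration.

# IronPython: the assembly and class names below are hypothetical.
import clr

# Make the vendor's .NET interop assembly visible to the interpreter
clr.AddReference("VendorBusinessTech")

from VendorBusinessTech import BusinessModule

module = BusinessModule()
print(module.DoSomething())  # call the .NET API as if it were ordinary Python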

How do they write different language wrappers for same library?

Generally a library will be released in a single language (for example, C). If the library turns out to be useful, then many language wrappers for that library will be written. How exactly do they do that?
Could someone kindly shed a little light on this topic? If it is too language-dependent, pick a language of your choice and explain it.
There are a few options that come to mind:
Port the original C library to the language/platform of your choice
Compile the C library into something (like a DLL) that can be invoked from other components
Put the library on the web, expose an API over HTTP and wrap that on the client
If I wanted to wrap a C library with a managed (.NET) layer, I'd compile the library into a DLL, exposing the APIs I wanted. Then, I'd use P/Invoke to call those APIs from my C# code.
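The same "compile it into a shared library and call it" approach works from other languages, too. As an illustration, here is how Python's ctypes module can load such a library; the library name and the exported add() function are made up for this sketch.

# Python/ctypes side of wrapping a C library compiled as a shared library.
# Assumes the library was built as libmylib.so (mylib.dll on Windows) and exports:
#     int add(int a, int b);
import ctypes

lib = ctypes.CDLL("./libmylib.so")

# Declare argument and return types so values are marshalled correctly
lib.add.argtypes = (ctypes.c_int, ctypes.c_int)
lib.add.restype = ctypes.c_int

print(lib.add(2, 3))  # prints 5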