Getting standard errors for predicted values from model-averaged generalized linear mixed models (GLMM) - lme4

I am using model.avg to make multi-model inferences on a suite of generalized linear mixed models (GLMMs) fitted with lmer. I can use predict to get predicted values, but I cannot get standard errors for those predicted values because of the random effects:
Error in predict.averaging(Mod.avg, newdata, se.fit = TRUE) :
'predict' for models '' caused errors
In addition: Warning messages:
1: In predict.merMod(object = <S4 object of class "lmerMod">, newdata = newdata, : cannot calculate predictions with both standard errors and random effects
I've tried using predictInterval and bootMer (merTools) but they can't handle the model.avg object:
Error in UseMethod("getME") :
no applicable method for 'getME' applied to an object of class "averaging"
Does anyone have a work-around to derive confidence intervals and standard errors in this situation?

Does sim_slopes take into account random intercept in model?

I am using sim_slopes to test whether the slopes are different for different levels of a variable:
sim_slopes(model6, pred = Score, modx = Age, mod2 = Status, johnson_neyman = TRUE)
The model I am basing this on is a linear mixed-effects model fit with lmer from package lme4. I was wondering whether the above code takes into account the fact that the data are longitudinal and that the model uses random intercepts for each participant.

Writing a dropout layer using nn.Sequential() method + Pytorch

I am trying to create a Dropout Layer for my neural network using nn.Sequential() like this:
class DropoutLayer(nn.Module):
    def __init__(self, p):
        super().__init__()
        self.p = p

    def forward(self, input):
        if self.training:
            u1 = (np.random.rand(*input.shape) < self.p)
            u1 *= input
            return u1
        else:
            input *= self.p

model = nn.Sequential(Flatten(), DropoutLayer(p=0.7), nn.LogSoftmax(dim=-1))
opt = torch.optim.Adam(modelDp.parameters(), lr=0.005)
train(modelDp, opt, 5)
But I get this error:
ValueError: optimizer got an empty parameter list
First, there is what I assume to be a small typo: you declare model = nn.Sequential(...) but then use modelDp.parameters(). I assume you just made a small copy-paste mistake and these are actually the same thing.
This error is raised because no layer in your model has trainable parameters, i.e. parameters that will be updated by the gradient backpropagation step. As written, this "network" cannot learn anything at all.
To get rid of the error and get an actual working neural network, you need to include the learnable layers, which, according to the previous error you reported, are linear layers. That would be something like:
model = nn.Sequential(Flatten(), DropoutLayer(0.7), nn.Linear(784, 10), nn.LogSoftmax(dim=-1))
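As a quick check (a hypothetical snippet, reusing Flatten and DropoutLayer from the question), you can confirm that the original model exposes no trainable parameters, while the version with a linear layer does:

# No learnable layers: the parameter list the optimizer sees is empty.
model_no_params = nn.Sequential(Flatten(), DropoutLayer(p=0.7), nn.LogSoftmax(dim=-1))
print(sum(p.numel() for p in model_no_params.parameters()))  # 0

# With nn.Linear, the optimizer has weights and biases to update.
model_ok = nn.Sequential(Flatten(), DropoutLayer(p=0.7), nn.Linear(784, 10), nn.LogSoftmax(dim=-1))
print(sum(p.numel() for p in model_ok.parameters()))  # 7850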
Now a couple of additional remarks:
You may want to use PyTorch random tensors instead of NumPy's. It will be easier to deal with the device when you eventually want to move your network to the GPU.
This code is going to yield another error as soon as you try it in eval mode, because the second conditional branch in the forward method does not return anything. You may want to replace that statement with return input * self.p
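Putting both remarks together, a corrected layer might look like this (a sketch, keeping the question's convention that p is the keep probability):

class DropoutLayer(nn.Module):
    def __init__(self, p):
        super().__init__()
        self.p = p

    def forward(self, input):
        if self.training:
            # torch.rand_like builds the mask on the same device and dtype as input
            mask = (torch.rand_like(input) < self.p).float()
            return mask * input
        else:
            # scale at eval time so expected activations match training
            return input * self.p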

mxnet failed to infer type

My mxnet code, which consists of a series of complex connections and slicing, raises the following error:
Error in operator concat0: [03:03:51] src/operator/./concat-inl.h:211: Not enough information to infer type in Concat.
I'm not sure how to interpret that or what information to provide to help debug it. concat0 is part of this operation:
# Define take_column(x, i) as transpose(take(transpose(x), i))
for i in range(47):
    y_hat_lt = take_column(y_hat,
                           mx.sym.concat(mx.sym.slice(some_indices, begin=i, end=i+1),
                                         self.label_dim + mx.sym.slice(some_indices, begin=i, end=i+1),
                                         dim=0))
Here, some_indices is a variable which I fix to be a list. Do let me know what other information would help!
It looks like MXNet is not able to infer the shape of the output. Did you specify the shape for the variable some_indices?
e.g. some_indices = mx.sym.var('indices', shape=(1,1))
It would be nice if you could paste a minimal reproducible example :)
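For illustration, here is a minimal sketch (with made-up variable names) of how declaring shapes lets inference succeed on a concat:

import mxnet as mx

# Once both inputs have declared shapes, concat can infer its output.
a = mx.sym.var('a', shape=(1, 3))
b = mx.sym.var('b', shape=(1, 3))
c = mx.sym.concat(a, b, dim=0)
print(c.infer_shape())  # ([(1, 3), (1, 3)], [(2, 3)], [])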
Instead of taking the transpose, swapping the axes resolved the issue.
def ttake(x, i):
    """Take from axis 1 instead of axis 0."""
    a = mx.sym.swapaxes(x, dim1=0, dim2=1)
    return mx.sym.flatten(mx.sym.transpose(mx.sym.take(a, i)))

LSTM with rnn cuda()?

I have the following model:
model = nn.Sequential()
model:add(nn.Sequencer(nn.LookupTable(nIndex, hiddenSize)))
model:add(nn.Sequencer(nn.FastLSTM(hiddenSize, hiddenSize, rho)))
model:add(nn.Sequencer(nn.Linear(hiddenSize, nIndex)))
model:add(nn.Sequencer(nn.LogSoftMax()))
Then I put the model on CUDA with:
model:cuda()
and I try to forward an input (a CudaTensor) and it breaks.
Is FastLSTM incompatible with CUDA?
The message:
[string "local f = function() return targets:cuda() en..."]:1: attempt to call method 'cuda' (a nil value)
I managed to introduce a few computations on CUDA with the following changes:
First, put the model and the criterion on CUDA:
model = model:cuda()
criterion = criterion:cuda()
Second, build a table of CUDA tensors to provide as targets:
local targetscudatable = {}
for i = 1, #targets do
    table.insert(targetscudatable, targets[i]:cuda())
end
Then it works, but I wonder if I can send more data to CUDA, like the inputs. Anyway, I already got a speed increase of 500%, which is not too bad.
You forgot to require the cunn package:
require 'cunn'

Django REST Errors when serializing model with ManyToManyField

I've created a class modeling a group of files within a software product build, on a Django server using the Django-REST package. The design is that the group of files (a Depot instance) should be able to be assigned to multiple Build instances (e.g. both the "alpha" and "beta" builds using the same exact audio file depot). However, at the time that the depot is created, it is being created as part of the creation of single Build on the client; it is only later that a utility script will allow an existing Depot to be added to other Builds.
It seemed natural to me that the Depot class should represent this relationship with a ManyToManyField. The problem is that the serializer does not seem to know what to do with this ManyToManyField. I've tried several workarounds, but each has its own error. I've tried having my DepotSerializer be either a rest_framework.serializers.Serializer or a rest_framework.serializers.ModelSerializer, but that seems largely unrelated to this problem.
Models.py:
class Depot(models.Model):
    name = models.CharField(max_length=64)
    builds = models.ManyToManyField(Build)

    TYPE_EXECUTABLE = 0
    TYPE_CORE = 1
    TYPE_STREAMING = 2
    depot_type = models.IntegerField(choices=(
        (TYPE_EXECUTABLE, 'Executable'),
        (TYPE_CORE, 'Core'),
        (TYPE_STREAMING, 'Streaming'),
    ))

    def __str__(self):
        return self.name
Views.py:
class DepotCreate(mixins.CreateModelMixin, generics.GenericAPIView):
    serializer_class = DepotSerializer
    queryset = Depot.objects.all()

    def post(self, request, *args, **kwargs):
        return self.create(request, *args, **kwargs)
Serializers.py version 1:
class DepotSerializer(serializers.ModelSerializer):
    builds = serializers.PrimaryKeyRelatedField()

    class Meta:
        model = Depot
        fields = ('id', 'name', 'builds', 'depot_type')
        read_only_fields = ('id',)

    def validate(self, attrs):
        build = attrs['builds']
        if build is None:
            raise serializers.ValidationError("Build could not be found")
        for depot in build.depot_set.all():
            if depot.name == attrs['name']:
                raise serializers.ValidationError("Build already contains a depot \"{}\"".format(depot.name))
        return attrs

    def restore_object(self, attrs, instance=None):
        # existence of the build has already been validated
        return Depot(**attrs)
This version results in the following error during the Depot __init__ call:
Exception Type: TypeError
Exception Value:
'builds' is an invalid keyword argument for this function
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/django/db/models/base.py in __init__, line 417
That appears to indicate that the Depot model cannot handle the 'builds' parameter despite the fact that it has a 'builds' ManyToManyField member.
Serializers.py restore_object, version 2:
def restore_object(self, attrs, instance=None):
    # existence of the build has already been validated
    build = attrs['builds']
    depotObj = Depot(name=attrs['name'], depot_type=attrs['depot_type'])
    depotObj.builds.add(build)
    return depotObj
This gave me the error:
Exception Type: ValueError
Exception Value:
"<Depot: depot_test4>" needs to have a value for field "depot" before this many-to-many relationship can be used.
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/django/db/models/fields/related.py in __init__, line 524
After quite a bit of investigation, I found that ManyToMany relationships can give you trouble if you don't save the MySQL entry before attempting to manipulate that field. Hence, restore_object version 3:
def restore_object(self, attrs, instance=None):
    # existence of the build has already been validated
    build = attrs['builds']
    depotObj = Depot(name=attrs['name'], depot_type=attrs['depot_type'])
    depotObj.save()
    depotObj.builds.add(build)
    return depotObj
This does successfully create the table entry for this instance, but ends up throwing the following error:
Exception Type: IntegrityError
Exception Value:
(1062, "Duplicate entry '5' for key 'PRIMARY'")
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/MySQLdb/connections.py in defaulterrorhandler, line 38
This error takes place during the rest_framework/mixins.py call to serializer.save(force_insert=True), which looks like it is supposed to force the creation of a new table entry, presumably conflicting with my earlier call to Model.save.
Does anyone know the correct approach for a design like this? I feel like this can't be that unusual of a table structure.
EDIT 10/20/2014:
After the suggestion below, I experimented with writing a new ModelSerializer for one of my models; for the most part because of these types of order-of-operations problems, I'd backed off from using ModelSerializer and did all of my data-to-object field processing in views.py by reading serializer.data.
Having a PrimaryKeyRelatedField(many=True) in the ModelSerializer DID help. Notably, I was able to create a serializer instance with existent models and get the correct serializer.data. However, I still have the problem where restore_object can do everything except create a new model instance and pass down the ManyToManyField value. I still get "TypeError: '[PrimaryKeyRelatedField name]' is an invalid keyword argument for this function" if I pass the field to the model's init func. I still cannot save the model before the REST library does it itself. In addition, in this mode, the serializer populates serializer.data with the values of the Model, not the values provided in the data input. So if you do not use the PrimaryKeyRelatedField's attrs value in restore_object, it is discarded.
It appears that I need to split ModelSerializer.save into some kind of pre-save, an apply-ManyToMany-input step, and a post-save, but I would need the attrs values so I can apply and modify the ManyToManyField at that time. I realize that the serializer does have the init_data field for seeing the original inputs, but in the case where the serializer is being used to deserialize a list of data into a list of new objects, I don't think there's a way to trace which serializer.init_data corresponds to which serializer.object.
In your serializer version 1, you do not have to add
builds = serializers.PrimaryKeyRelatedField()
as the model serializer will create this for you. In fact, if you look at the example in the documentation (http://www.django-rest-framework.org/api-guide/relations/) you'll see that the PrimaryKeyRelatedField is applied when there is a FK 'to' the current model (not a M2M relation).
I would remove this from the serializer and then see what's going on.
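In other words, a minimal version of the serializer might look like this (a sketch; ModelSerializer generates a suitable relational field from the model's ManyToManyField on its own):

class DepotSerializer(serializers.ModelSerializer):
    class Meta:
        model = Depot
        # 'builds' is generated automatically from the model's ManyToManyField
        fields = ('id', 'name', 'builds', 'depot_type')
        read_only_fields = ('id',)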