I was reading the Sequelize documentation (Docs on Associations) and am trying to figure out how to get the associationType of a model. It seems like you should be able to import a model (e.g. Posts) and call Posts.associationType or Posts.association.associationType.
I also found an old Stack Overflow question which mentioned that calling something like Posts instanceof sequelize.Association.BelongsTo should work as well. When I call Posts.associations it only gives me the association as a key and value: {'Comments': 'Comments'}
But neither method works. I can access the rest of the model's attributes perfectly fine.
You are only seeing the toString() version of the output. It is clearer if you try:
console.log(Object.keys(models.Posts.associations));
This will give you an array of the keys of the associations, so you can use them to access more details:
console.log(Object.keys(models.Posts.associations.Comments));
You will then see that it has an associationType property, which you can use to get the string value you are looking for. Comments is the name/key for the association, which can be overridden with the as attribute in the definition.
// type = "BelongsTo"
var type = models.Posts.associations.Comments.associationType;
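A small follow-up sketch, assuming the same models object as above and nothing beyond the associations collection shown there: you can walk every association defined on Posts and read its alias and type off each association object.

Object.keys(models.Posts.associations).forEach(function (name) {
    var association = models.Posts.associations[name];
    // association.as is the alias ("Comments"); associationType is the string you want
    console.log(association.as, association.associationType);
});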
I'm retrieving data from a MySQL database, and this is done by mapping my entity properties to database fields using Doctrine with annotations, as shown in this image. Retrieving the data is not the problem, but my property names are used as the keys for the returned data, which is not what I want.
Example:
I have a property gridId (camelCase), which is mapped to a database field called grid_id (snake_case) as shown in the image above. The key in the returned rows for this property will be gridId, and not grid_id, which is what I want in my case.
An example of the current situation is shown in this image.
The desired situation:
$grid = [
    'grid_block_id' => 18,
    'width' => 50,
    'width_pixels' => 50,
];
Changing either the database fields or the property names is not an option in my case.
EDIT
I'm using JMS as my serializer. I guess the 4th option of @Jakumi's answer is probably the best option for my use case: somehow use the serializer to return the property's column name as the key for the returned values.
My struggle when retrieving data from Doctrine is that I have to manually adjust the keys fetched by Doctrine, as shown in this image, because this is how the front end expects them.
So what you're saying is ... you want to actively ignore the object-relational mapping (ORM) that the Doctrine ORM you're using provides, and which actually provides the keys exactly the way your entity names them?
I assume there is some good (or not so good) reason for this. Depending on your use case, different solutions may be best. Options include (but this list is probably very incomplete):
option 1: add a function to the entity
class Grid {
    /** all your fields **/

    public function toArray() {
        return [
            'grid_id' => $this->gridId,
            // ...
        ];
    }
}
option 2: don't use the ORM
add a function in your GridRepository like:
function getRawAll() {
    $conn = $this->getEntityManager()->getConnection();
    $_result = $conn->query('SELECT * FROM grid');

    $results = [];
    while ($row = $_result->fetch()) {
        $results[] = $row;
    }
    return $results;
}
and similar. These obviously circumvent the ORM; the ORM is for retrieving objects, and those have the fields you specified.
option 3: return the given array back to the unmapped state
use the EntityManager's getClassMetadata() (ClassMetadataInfo->fieldMappings) to fetch the mapping for your fields, and "revert" the renaming.
This is probably somewhat better than option 1, since it doesn't have to be updated whenever the entity changes.
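For illustration, here is a minimal sketch of option 3, assuming you have the EntityManager at hand; the function name toColumnKeyedArray and the $em/$entity variables are placeholders of mine, not Doctrine API:

use Doctrine\ORM\EntityManagerInterface;

function toColumnKeyedArray(EntityManagerInterface $em, $entity): array
{
    $meta = $em->getClassMetadata(get_class($entity));

    $result = [];
    foreach ($meta->getFieldNames() as $fieldName) {
        // getColumnName() gives the mapped column, e.g. "grid_id" for "gridId"
        $result[$meta->getColumnName($fieldName)] = $meta->getFieldValue($entity, $fieldName);
    }

    return $result;
}

// usage: $data = toColumnKeyedArray($em, $grid);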
option 4: use a serializer that does your property name translation.
Since you're apparently already using the JMS serializer, the SerializedName annotation can probably help, although the default naming strategy already appears to be snake case, so maybe it's overridden somewhere? You might want to check your configuration ...
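As a sketch of what that could look like with JMS annotations (this assumes the annotation-driven setup from the question, and the property list is abbreviated):

use JMS\Serializer\Annotation as Serializer;

class Grid
{
    /**
     * @Serializer\SerializedName("grid_id")
     */
    private $gridId;

    // ... same idea for the other properties
}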
I have created a DB in which I have multiple tables with relationships between them.
When I try to get data from my web app I get this error:
"'Self referencing loop detected with type 'System.Data.Entity.DynamicProxies.PrescriptionMaster_2C4C63F6E22DFF8E29DCAC8D06EBAE038831B58747056064834E80E41B5C4E4A'. Path '[0].Patient.PrescriptionMasters"
I couldn't figure out why I am getting this error; when I remove the relationships between the tables I get proper data from it.
I have tried other solutions, like adding
config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
in Webconfig.cs, but nothing has worked for me.
Please help me, what should I do?
The only proper way to prevent this from happening is by not sending Entity Framework objects (which may contain such loops) into the JSON Serializer (which is not too good at knowing when to stop serializing).
Instead, create ViewModels that mimic the parts of the EF Objects that your Front End actually needs, then fill those ViewModels using the EF Objects.
A quick-and-dirty way is to just use anonymous objects, for example:
return new
{
    Product = new
    {
        Id = EF_Product.Id,
        Name = EF_Product.Name
    }
};
A good rule-of-thumb is to only assign simple properties (number, bool, string, datetime) from the EF Objects to the ViewModel items. As soon as you encounter an EF Object property that is yet another EF Object (or a collection of EF Objects), then you need to translate those as well to 'simple' objects that are not linked to EF.
On the other end of the spectrum there are libraries such as AutoMapper. If you decide that you need actual ViewModel classes, then AutoMapper will help mapping the EF Objects to those ViewModels in a very structured way.
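A rough sketch of that AutoMapper route, assuming a Product EF entity and a hand-written ProductViewModel (both names are placeholders here, not from the question):

using AutoMapper;

public class ProductViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductMapping
{
    // one-time configuration: tell AutoMapper how to map the EF entity to the ViewModel
    private static readonly IMapper Mapper =
        new MapperConfiguration(cfg => cfg.CreateMap<Product, ProductViewModel>())
            .CreateMapper();

    public static ProductViewModel ToViewModel(Product efProduct) =>
        Mapper.Map<ProductViewModel>(efProduct);
}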
Just add this to the Application_Start in Global.asax:
HttpConfiguration config = GlobalConfiguration.Configuration;
config.Formatters.JsonFormatter
      .SerializerSettings
      .ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
It will ignore the reference pointing back to the object.
In my PostgreSQL database I have:
CREATE TABLE category (
    -- ...
    category_name_localization JSON NOT NULL
);
In Java, I have a JDO class like so:
@javax.jdo.annotations.PersistenceCapable(table = "category")
public class Category extends _BlueEntity implements Serializable {
    //...

    private org.json.simple.JSONObject category_name_localization;

    @javax.jdo.annotations.Column(name = "category_name_localization")
    public org.json.simple.JSONObject getCategoryNameLocalization() {
        return category_name_localization;
    }
}
When I use this class, DataNucleus gives the following exception:
org.datanucleus.exceptions.NucleusUserException: Field "com.advantagegroup.blue.ui.entity.Category.category_name_localization" is a map that has been specified without a join table and neither the key nor the value has a mapped-by specified. This is invalid!
at org.datanucleus.store.rdbms.RDBMSStoreManager.newJoinTable(RDBMSStoreManager.java:2720)
at org.datanucleus.store.rdbms.mapping.java.AbstractContainerMapping.initialize(AbstractContainerMapping.java:82)
at org.datanucleus.store.rdbms.mapping.MappingManagerImpl.getMapping(MappingManagerImpl.java:680)
at org.datanucleus.store.rdbms.table.ClassTable.manageMembers(ClassTable.java:518)
at org.datanucleus.store.rdbms.table.ClassTable.manageClass(ClassTable.java:424)
at org.datanucleus.store.rdbms.table.ClassTable.initializeForClass(ClassTable.java:1250)
at org.datanucleus.store.rdbms.table.ClassTable.initialize(ClassTable.java:271)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.initializeClassTables(RDBMSStoreManager.java:3288)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2897)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:118)
at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1637)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:665)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2098)
at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1278)
at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3668)
at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2276)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:482)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:122)
at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:1986)
at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1830)
at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1685)
at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:712)
at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:738)
at com.advantagegroup.blue.ui.jdo._BlueJdo.insert(_BlueJdo.java:40)
at ...
This error makes sense in a way, because org.json.simple.JSONObject extends Map. However, this field is not part of any relationship; the column is of type JSON, and therefore it is natural to back it with a JSONObject.
How do I tell JDO / DataNucleus to chill and treat org.json.simple.JSONObject the same way it would a String or a Date?
Thanks!
DC
My understanding of this is that your default attempt is trying to persist a normal Map (since while it doesn't know what a JSONObject is, it does know what a Map is), and it will need a join table for that on an RDBMS.
Since you presumably want the JSONObject persisted into a single column, you need to create a JDO AttributeConverter. I've done similar things with my own types and it works fine (I'm on v5.0.5 IIRC).
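A minimal sketch of such a converter, assuming JDO 3.2's javax.jdo.AttributeConverter and json-simple's parser (the class name is mine, and error handling is up to you):

import javax.jdo.AttributeConverter;

import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

public class JsonObjectStringConverter implements AttributeConverter<JSONObject, String> {

    @Override
    public String convertToDatastore(JSONObject attributeValue) {
        // store the JSON document in its string form
        return attributeValue != null ? attributeValue.toJSONString() : null;
    }

    @Override
    public JSONObject convertToAttribute(String datastoreValue) {
        if (datastoreValue == null) {
            return null;
        }
        try {
            return (JSONObject) new JSONParser().parse(datastoreValue);
        } catch (ParseException e) {
            throw new IllegalStateException("Invalid JSON in datastore column", e);
        }
    }
}

You would then point the field (or getter) at it with @javax.jdo.annotations.Convert(JsonObjectStringConverter.class), or the equivalent in XML metadata.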
I also found this in their docs, for when you have your own Map class that it doesn't know how to handle by default in terms of replacing it with a proxy (to intercept the calls to put, putAll etc.). If you add that line, it will not try to wrap this field with a proxy (which it doesn't know how to do for that type unless you tell it). If you wanted to auto-detect the JSONObject becoming "dirty", you would need to write a proxy wrapper, as per this page.
This doesn't answer how to map the column for that converter to use a "json" type in PostgreSQL, but I'd guess that if you set the sqlType you may get success in that respect.
I have an SQLAlchemy model class model, a string attr denoting an attribute/column that corresponds to an ORM relationship with another model class othermodel, and a primary key or id string of such an othermodel.
I would like to find the object othermodel.get(id) in order to store it in a newly constructed instance, using something like setattr(model(), attr, ???), but I don't have that othermodel accessible in a variable. How do I get it?
I assume I can use some kind of introspection on model or its new instance, but how?
Digging through model's attributes, I found a copy of othermodel in
getattr(model, attr).comparator.property.argument
turning my assignment into
setattr(model(), attr, getattr(model, attr).comparator.property.argument.get(id))
Quite a mouthful!
This looks reasonable for the moment, but I am quite sure I am missing a more obvious way to access it.
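For reference, here is the same introspection wrapped into a small self-contained sketch; it goes through the relationship's mapper instead of .argument (which can also be a string or a callable, depending on how the relationship was declared). The names model, attr and id are the placeholders from the question, and .get() is whatever lookup your othermodel classes expose:

from sqlalchemy import inspect

def related_class(model, attr):
    # RelationshipProperty behind the attribute named `attr`, then its target class
    return inspect(model).relationships[attr].mapper.class_

# instance = model()
# setattr(instance, attr, related_class(model, attr).get(id))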
I'm learning Dart and was reading the article Using Dart with JSON Web Services, which told me that I could get help with type checking when converting my objects to and from JSON. I used their code snippet but ended up with compiler warnings. I found another Stack Overflow question which discussed the same problem, and the answer was to use the @proxy annotation and implement noSuchMethod. Here's my attempt:
abstract class Language {
  String language;
  List targets;
  Map website;
}

@proxy
class LanguageImpl extends JsonObject implements Language {
  LanguageImpl();

  factory LanguageImpl.fromJsonString(string) {
    return new JsonObject.fromJsonString(string, new LanguageImpl());
  }

  noSuchMethod(i) => super.noSuchMethod(i);
}
I don't know if the noSuchMethod implementation is correct, and @proxy seems redundant now. Regardless, the code doesn't do what I want. If I run
var lang1 = new LanguageImpl.fromJsonString('{"language":"Dart"}');
print(JSON.encode(lang1));
print(lang1.language);
print(lang1.language + "!");
var lang2 = new LanguageImpl.fromJsonString('{"language":13.37000}');
print(JSON.encode(lang2));
print(lang2.language);
print(lang2.language + "!");
I get the output
{"language":"Dart"}
Dart
Dart!
{"language":13.37}
13.37
type 'String' is not a subtype of type 'num' of 'other'.
and then a stacktrace. Hence, although the readability is a little bit better (one of the goals of the article), the strong typing promised by the article doesn't work and the code might or might not crash, depending on the input.
What am I doing wrong?
The article mentions static types in one paragraph but JsonObject has nothing to do with static types.
What you get from JsonObject is that you don't need Map access syntax.
Instead of someMap['language'] = value; you can write someObj.language = value; and you get the fields in the autocomplete list, but Dart is not able to do any type checking, either when you assign a value to a field of the object (someObj.language = value;) or when you use fromJsonString() (as mentioned, because of noSuchMethod/@proxy).
I assume that you want an exception to be thrown on this line:
var lang2 = new LanguageImpl.fromJsonString('{"language":13.37000}');
because 13.37 is not a String. In order for JsonObject to do this it would need to use mirrors to determine the type of the field and manually do a type check. This is possible, but it would add to the dart2js output size.
So barring that, I think that throwing a type error when reading the field is reasonable, and you might have just found a bug-worthy issue here. Since noSuchMethod is being used to implement an abstract method, the runtime can actually do a type check on the arguments and return values. It appears from your example that it's not doing so. Care to file a bug?
If this were addressed, then JsonObject could immediately read a field after setting it to cause a type check when decoding without mirrors, and it could do that check in an assert() so that it's only done in checked mode. I think that would be a nice solution.