I am trying to work with JSON objects with DynamoDB and am having difficulty.
I'm trying to follow the tutorial:
http://aws.amazon.com/blogs/aws/dynamodb-update-json-and-more/
I wanted to use toJSONPretty() on my object, but the method is not recognized. I don't think I have the right Gradle dependencies. I'm currently using:
compile 'com.amazonaws:aws-android-sdk-core:2.2.0'
compile 'com.amazonaws:aws-android-sdk-ddb:2.2.0'
compile 'com.amazonaws:aws-android-sdk-ddb-mapper:2.2.0'
Previously, my dynamo client was set up using the imports:
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
But looking at code from Dynamo/JSON tutorials, I see the import:
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
This seems to be required if you want to use the DynamoDB type, as in:
DynamoDB dynamo = new DynamoDB(new AmazonDynamoDBClient(...));
I don't understand the difference between these libraries or how they relate to each other. Help!
The DynamoDB class is a higher-level abstraction over the AmazonDynamoDB API. You create it with an instance of the AmazonDynamoDB interface or a Regions object; AmazonDynamoDBClient implements the AmazonDynamoDB interface. Even if you pass a Regions object, an instance of a class that implements AmazonDynamoDB is created behind the scenes. From the DynamoDB object you get a Table so that you can perform data-plane operations like GetItem and PutItem. The Item class is the one that has a toJSONPretty method. To sum up, the DynamoDB class uses an implementation of the AmazonDynamoDB interface behind the scenes to provide you with Items that you can call toJSONPretty() on.
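For example, here is a minimal sketch of how these pieces fit together (credentials and region setup are omitted, and the table name and key used here are hypothetical placeholders):
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.Table;

public class DynamoJsonExample {
    public static void main(String[] args) {
        // The document API wraps the low-level client.
        DynamoDB dynamo = new DynamoDB(new AmazonDynamoDBClient());

        // "MyTable" and the "Id" key are hypothetical placeholders.
        Table table = dynamo.getTable("MyTable");
        Item item = table.getItem("Id", "123");

        // Item is the class that exposes the JSON helpers, including toJSONPretty().
        System.out.println(item.toJSONPretty());
    }
}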
Related
I have an implementation class that fetches items from DynamoDB, and the DynamoDB table has an S3 URL as one of its column values. The implementation class calls a helper to get the content from S3.
Should we write JUnit tests for the individual classes, or would a JUnit test for the implementation class be sufficient in this case, letting the implementation class call the helper class while mocking the call to S3?
Depends on what you're going to cover with your tests.
If the helper class is also your own implementation, cover it with its own test.
If it is provided by some external library, testing should happen within that library's own project.
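If covering the implementation class is enough for you, one option is a single JUnit test that exercises it with the S3 helper mocked out. Here is a minimal sketch with Mockito (the ItemService and S3Helper types and their methods are hypothetical stand-ins for your classes):
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class ItemServiceTest {

    // Hypothetical helper that downloads content from S3 given a URL.
    interface S3Helper {
        String fetchContent(String s3Url);
    }

    // Hypothetical implementation class that reads the S3 URL from the
    // DynamoDB item and delegates the download to the helper.
    static class ItemService {
        private final S3Helper s3Helper;

        ItemService(S3Helper s3Helper) {
            this.s3Helper = s3Helper;
        }

        String loadContent(String s3Url) {
            return s3Helper.fetchContent(s3Url);
        }
    }

    @Test
    public void loadsContentThroughMockedS3Helper() {
        S3Helper s3Helper = mock(S3Helper.class);
        when(s3Helper.fetchContent("s3://bucket/key")).thenReturn("payload");

        ItemService service = new ItemService(s3Helper);

        assertEquals("payload", service.loadContent("s3://bucket/key"));
        verify(s3Helper).fetchContent("s3://bucket/key");
    }
}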
const TelegramBot = require('./telegram-bot') // It's currently only local.
var bot = new TelegramBot()
console.log(bot)
// This does not print a complete JSON of the object. It misses stuff like
// constructor, method, prototype and super_.
Is there some way or npm module that prints a JSON compatible output of the object?
My only workaround so far is console-logging it like this, repeatedly checking the log, and then printing out the next one, but I think it would be a lot easier to have a JSON export and use an online JSON viewer that displays it like a directory tree.
console.log(`
TelegramBot:
> ${Object.getOwnPropertyNames(TelegramBot)}
TelegramBot.prototype:
> ${Object.getOwnPropertyNames(TelegramBot.prototype)}
TelegramBot.prototype.constructor:
> ${Object.getOwnPropertyNames(TelegramBot.prototype.constructor)}
TelegramBot.prototype.constructor.super_:
> ${Object.getOwnPropertyNames(TelegramBot.prototype.constructor.super_)}
`)
I'm aware functions can't be serialized with JSON.stringify(). I don't mind if they appear as a string like "Anonymous Function()" or "FunctionWithAName()", or something like that.
I'm doing this because I'm having another go at learning prototypes, and I've used util.inherits(TelegramBot, EventEmitter) in the TelegramBot object.
To avoid name clashes between the TelegramBot methods I've written and the names on the EventEmitter superclass, I'd like to keep a clear view of the whole object structure. Or do I not have to worry because of property shadowing? If I'm correct, lookup checks the object's own instance first, then its prototype; I'm just not sure whether EventEmitter's prototype or TelegramBot's is checked first.
I am using Grails 2.2.4 and have a controller endpoint which converts a domain object list to JSON. Under load (as little as 5 concurrent requests) the marshaling performance is very poor. Taking thread dumps the threads are blocked on:
java.lang.ClassLoader.loadClass(ClassLoader.java:291)
There is a single marshaller registered to marshal all domain objects using reflection and introspection. I realize that reflection and introspection are slower than direct method calls, but I am still seeing unexpected behavior in that the class loader is called every time, and in turn blocking occurs. An example stack trace is as follows:
java.lang.Thread.State: BLOCKED (on object monitor)
at java.lang.ClassLoader.loadClass(ClassLoader.java:291)
- waiting to lock <785e31830> (a org.grails.plugins.tomcat.ParentDelegatingClassLoader)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.beans.Introspector.instantiate(Introspector.java:1470)
at java.beans.Introspector.findExplicitBeanInfo(Introspector.java:431)
at java.beans.Introspector.<init>(Introspector.java:380)
at java.beans.Introspector.getBeanInfo(Introspector.java:167)
at java.beans.Introspector.getBeanInfo(Introspector.java:230)
at java.beans.Introspector.<init>(Introspector.java:389)
at java.beans.Introspector.getBeanInfo(Introspector.java:167)
at java.beans.Introspector.getBeanInfo(Introspector.java:230)
at java.beans.Introspector.<init>(Introspector.java:389)
at java.beans.Introspector.getBeanInfo(Introspector.java:167)
at java.beans.Introspector.getBeanInfo(Introspector.java:230)
at java.beans.Introspector.<init>(Introspector.java:389)
at java.beans.Introspector.getBeanInfo(Introspector.java:167)
at org.springframework.beans.CachedIntrospectionResults.<init>(CachedIntrospectionResults.java:217)
at org.springframework.beans.CachedIntrospectionResults.forClass(CachedIntrospectionResults.java:149)
at org.springframework.beans.BeanWrapperImpl.getCachedIntrospectionResults(BeanWrapperImpl.java:324)
at org.springframework.beans.BeanWrapperImpl.getPropertyValue(BeanWrapperImpl.java:727)
at org.springframework.beans.BeanWrapperImpl.getPropertyValue(BeanWrapperImpl.java:721)
at org.springframework.beans.PropertyAccessor$getPropertyValue.call(Unknown Source)
at com.ngs.id.RestDomainClassMarshaller.extractValue(RestDomainClassMarshaller.groovy:203)
...
...
A simple benchmark loading the same endpoint with the same parameters still results in the loadClass call on every request.
I was under the impression the classes would be at least cached by the class loader and not loaded on every method call to get the property to be marshaled.
The code to retrieve the property value is as follows:
BeanWrapper beanWrapper = PropertyAccessorFactory.forBeanPropertyAccess(domainObject);
return beanWrapper.getPropertyValue(property.getName());
Is there a configuration setting needed to ensure the classes are only loaded once? Or perhaps a different way to get the property that doesn't result in class loading every time? Or a more performant way to achieve this altogether?
Writing a custom marshaller per domain class would avoid the reflection and introspection, but it is going to be a lot of repeated code.
Appreciate any input.
So after much digging this is what I found out.
Using BeanUtils.getPropertyDescriptors and getValue will always try to find a BeanInfo class describing the bean using the class loader. In this case we don't provide BeanInfo classes for our Grails domain classes, so this call is redundant. I found some information saying you can provide a custom BeanInfoFactory to bypass this and exclude your packages, but I couldn't find out how to configure it with Grails.
Also, searching the Spring Framework documentation, there is a configuration option, Introspector.IGNORE_ALL_BEANINFO, that tells CachedIntrospectionResults never to look up the BeanInfo classes. However, this was not available in version 3.1.4 of the Spring Framework, which was current for Grails 2.2.4; the newer versions do appear to have this option.
So, if using BeanUtils, you can't bypass this initial lookup on the class loader. However, subsequent lookups should be cached by CachedIntrospectionResults. Unfortunately this doesn't happen in our scenario; there looks to be a bug in the test that decides whether the lookup is cacheable. See more info on this below.
The fix was ultimately to fall back to use pure reflection. Rather than use:
beanWrapper.getPropertyValue(property.getName());
To use:
PropertyDescriptor pd = BeanUtils.getPropertyDescriptor(domainObject.getClass(), property.getName())
pd.readMethod.invoke(domainObject)
Where the pd is cached.
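A minimal sketch of what that caching might look like (the class and method names here are hypothetical, not the actual marshaller code):
import java.beans.PropertyDescriptor;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.springframework.beans.BeanUtils;

public class PropertyDescriptorCache {

    // One PropertyDescriptor per (class, property) pair, so BeanUtils is only
    // consulted on the first access.
    private static final ConcurrentMap<String, PropertyDescriptor> CACHE =
            new ConcurrentHashMap<String, PropertyDescriptor>();

    public static Object read(Object target, String propertyName) throws Exception {
        String key = target.getClass().getName() + "#" + propertyName;
        PropertyDescriptor pd = CACHE.get(key);
        if (pd == null) {
            pd = BeanUtils.getPropertyDescriptor(target.getClass(), propertyName);
            if (pd == null) {
                throw new IllegalArgumentException("No property '" + propertyName
                        + "' on " + target.getClass().getName());
            }
            CACHE.putIfAbsent(key, pd);
        }
        // Invoke the getter directly, bypassing BeanWrapper and the Introspector lookup.
        return pd.getReadMethod().invoke(target);
    }
}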
After fixing this, the profiler still showed a lack of caching in CachedIntrospectionResults for the out-of-the-box Grails marshaller. This was due to the bad caching implementation in CachedIntrospectionResults. The workaround for this was to add the correct class loader to the acceptedClassLoaders in CachedIntrospectionResults.
import java.beans.Introspector;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import org.springframework.beans.CachedIntrospectionResults;

public class EnhanceCachedIntrospectionResultsAcceptedClassLoadersListener implements ServletContextListener {

    public void contextInitialized(ServletContextEvent event) {
        // Accept the parent class loader so its introspection results are cached
        CachedIntrospectionResults.acceptClassLoader(Thread.currentThread().getContextClassLoader().getParent());
    }

    public void contextDestroyed(ServletContextEvent event) {
        CachedIntrospectionResults.clearClassLoader(Thread.currentThread().getContextClassLoader().getParent());
        Introspector.flushCaches();
    }
}
Note that it was required to add the parent class loader to the accepted class loader list rather than the current one. I'm not sure whether this is specific to Grails, but it fixed the issue; I'm also not sure whether there may be side effects to this fix.
In summary we went from 10 requests/sec in the original setup to 120 requests/sec after using direct reflection and fixing the CachedIntrospectionResults cache.
However, the real eye-opener was that if we use a 1-1 marshaller per domain class, we see another 2x improvement in performance over the generic marshaller, where we test objects for whether they're instances of a class, etc. We're saving a lot of code with the generic marshaller, but there's a lot more work to do to get performance comparable to writing a 1-1 marshaller.
Hopefully this will be useful to someone else who runs into this ...
I'm looking for a way to realize the following use-case:
1. I have many modules, and each one of them has a wire-spec that exposes its components.
2. To assemble an application, I select the modules and use their wire-specs.
3. The wire-spec of the application is the merge of the wire-specs of the used modules:
   3.1. I start by 'requiring' the wire-spec of each module as an object.
   3.2. Then, I merge the objects.
   3.3. Finally, I return the result as the object defining the wire-spec of the application.
Here is a sample of an application context-spec:
define(["jquery", "module1-wire-spec", "module2-wire-spec"], function(jquery, module1WireSpec, module2WireSpec) {
return jquery.extend(true, module1WireSpec, module2WireSpec);
});
I have read the wire documentation several times hoping to find a 'native' way to do the above, but so far I have failed to find one.
A 'native' way would be a factory like the 'wire' factory, but instead of creating a child context for each module, I'm looking to have the components of each module appear as direct components of the application context.
Spring, for instance, allows importing a context definition into another one and the result is as if the content of the imported context has been inlined with the importing context.
A new feature has been added to cujojs/wire to allow import of contexts.
As of version 0.10.8, the keyword imports accepts:
a string for a single context import,
or an array for a list of context imports.
Check here for more details.
I am looking for some help with designing some functionality in my application. I already have something similar designed but this problem is a little different.
Background:
In my application we have different Modules. Data in each module can be associated with other modules. Each Module is represented by an Object in our application.
Module 1 can be associated with Module 2 and Module 3. Currently I use a factory to provide the proper DAO for getting and saving this data.
It looks something like this:
class Module1Factory {
    public static Module1BridgeDAO createModule1BridgeDAO(int moduleId) {
        switch (moduleId) {
            case Module.Module2Id: return new Module1_Module2DAO();
            case Module.Module3Id: return new Module1_Module3DAO();
            default: return null;
        }
    }
}
Module1_Module2 and Module1_Module3 implement the same BridgeModule interface. In the database I have a Table for every module (Module1, Module2, Module3). I also have a bridge table for each module (they are many to many) Module1_Module2, Module1_Module3 etc.
The DAO basically handles all code needed to manage the association and retrieve its own instance data for the calling module. Now when we add new modules that associate with Module1 we simply implement the ModuleBridge interface and provide the common functionality.
New Development
We are adding a new module that can be associated with other Modules as well as with specific properties of those modules. The module basically gives users the ability to add their own custom forms to our other modules, so that they can collect additional information along with what we provide.
I want to start associating my Form module with other modules and their properties. I.e., if Module1 has a property Category, I want to associate Form instance data with that property.
There are many Forms. If a user creates an instance of Module2, they may always want to have certain form(s) attached to that Module2 instance. If they create an instance of Module2 and select Category 1, then I may want additional Form(s) created.
I prototyped something like this:
Form
FormLayout (contains the labels and GUI controls)
FormModule (associates a form with all instances of a module)
Form Instance (create an instance of a form to be filled out)
As I thought about it I was thinking about making a new FormModule table/class/dao for each Module and Property that I add. So I might have:
FormModule1
FormModule1Property1
FormModule1Property2
FormModule1Property3
FormModule1Property4
FormModule2
FormModule3
FormModule3Property1
Then, as I did previously, I would use a factory to get the proper DAOs for dealing with all of these. I would hand it an array of ids representing different modules and properties, and it would return all of the DAOs I need to call getForms() on, which in turn would return all of the forms for that particular bridge.
Some points
This will be for a new module, so I don't need to expand on the factory code I provided; I just wanted to show an example of what I have done in the past.
The new module can be associated with: other Modules (i.e., globally, for any instance of that module's data), or other module properties (i.e., only if the Module instance has a certain value in one of its properties).
I want to make it easy for developers to add associations with other modules and properties.
Can anyone suggest any design patterns or strategies for achieving this?
If anything is unclear please let me know.
Thank you,
Al
You can use Spring's Dependency Injection feature. This would give you the flexibility of instantiating the objects using an XML configuration file.
So, my suggestion would be to go with Spring.
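For example, here is a minimal sketch of what that wiring could look like (the bean id, the beans.xml file, and the FormBridgeDAO interface are hypothetical placeholders for your own types):
import java.util.List;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class FormModuleWiring {

    // Hypothetical bridge interface, standing in for the bridge DAO interface from the question.
    public interface FormBridgeDAO {
        List<?> getForms();
    }

    public static void main(String[] args) {
        // beans.xml (hypothetical) maps ids to concrete DAO implementations, e.g.
        //   <bean id="formModule1BridgeDAO" class="com.example.FormModule1DAO"/>
        ApplicationContext context = new ClassPathXmlApplicationContext("beans.xml");

        // Swapping the concrete DAO only requires editing the XML, not the calling code.
        FormBridgeDAO dao = context.getBean("formModule1BridgeDAO", FormBridgeDAO.class);
        System.out.println(dao.getForms());
    }
}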