I'm developing a BizTalk application to query a number of web services that have been written and maintained by a third party, and I'm having some trouble getting the namespaces right on the Schemas.
Basically, I can't consume the WSDL to generate the schemas automatically because the namespaces and element names are all wrong within the generated schemas (due to lazy C# WSDL generation), so I'm having to write them from scratch. This would be fine, but the web service endpoints require that the elements within the schema all be qualified with specific namespaces, and none of them match the namespace of the overall schema.
I have figured out how to import other namespaces/schemas into my schema, but I can't figure out how to change the namespace of the elements to anything but the default. Does anyone know how to do this?
For example, the schema root has to have a namespace of "http://tempuri.org/", but one of the elements requires the namespace "http://schemas.datacontract.org/2004/07/ReadService.DTO.Inbound.Supplier", and within BizTalk I can't edit the namespace of that element to change it.
The body of one of the requests looks like this:
<tem:GetSupplierIdWithExternalId>
   <tem:request>
      <com:Header>
         <com1:Username></com1:Username>
         <com1:Locale></com1:Locale>
      </com:Header>
      <read:ExternalSupplierId></read:ExternalSupplierId>
   </tem:request>
</tem:GetSupplierIdWithExternalId>
"tem" in this case is http://tempuri.org/". "com", "com1" and "read" are all different namespaces, which, as Gruff has pointed out, are all default namespaces for WCF projects.
Generating from the WSDL in BizTalk creates two issues:
The default namespace applied to the root node is not tempuri.org (as it recognises this as a default); it's the standard BizTalk http://..Folder.SchemaName namespace. Changing this to tempuri.org causes a cascade of errors that have to be fixed, and it doesn't resolve the more major issue, which is:
Because of the way the WCF functions the WSDL has been generated from are written, the top-level element names (GetSupplierIdWithExternalId above) are all wrong; in most cases they come out as something like "GetSupplierIdWithExternalIdRequest", because that's the name of the function the schema is generated from. Again, it's down to lazy programming on the endpoints: the name of the element isn't being properly defined, it's just assumed by the generation process.
If I try to create a single, flat schema, I can only define one namespace for the whole file, and if I set that to tempuri.org I get:
<ns0:GetSupplierWithExternalId xmlns:ns0="http://tempuri.org/">
   <Header>
      <Username>Username_0</Username>
      <Locale>Locale_0</Locale>
   </Header>
   <ExternalSupplierId>ExternalSupplierId_0</ExternalSupplierId>
</ns0:GetSupplierWithExternalId>
...which fails as a SOAP request because the namespaces on the internal elements aren't correct.
Thanks in advance!
You will need to define the element with the namespace "http://schemas.datacontract.org/2004/07/ReadService.DTO.Inbound.Supplier" in its own schema file, import that into the root schema, and compose the root that way. The element will keep the namespace it was defined with.
Looking at the namespace "http://schemas.datacontract.org/2004/07/ReadService.DTO.Inbound.Supplier", it seems to be the default namespace that WCF gives the data contract when one is not explicitly defined (the CLR namespace of the class is ReadService.DTO.Inbound.Supplier). When the DataContractSerializer serializes the outgoing request, it will use that namespace for those elements. You should not try to change it in the BizTalk schema, otherwise there will be a schema mismatch.
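For illustration, here is a guess at what the third-party contract might look like on the WCF side and why the namespace comes out that way (the class and member names below are assumptions, not their actual code):

// Hypothetical reconstruction of the third-party data contract.
using System.Runtime.Serialization;

namespace ReadService.DTO.Inbound.Supplier
{
    // No Namespace is specified on [DataContract], so the DataContractSerializer
    // defaults it to "http://schemas.datacontract.org/2004/07/" plus the CLR
    // namespace, i.e. ".../ReadService.DTO.Inbound.Supplier".
    [DataContract]
    public class SupplierRequest
    {
        [DataMember]
        public string ExternalSupplierId { get; set; }
    }

    // Only the service author could change that default, e.g. with
    // [DataContract(Namespace = "http://tempuri.org/")]; from the BizTalk side
    // you simply have to match whatever namespace the service emits.
}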
UPDATE:
In your update you mention 2 issues when generating the schema from the WSDL.
Can you paste a screenshot of this?
Are you sure GetSupplierIdWithExternalIdRequest is incorrect? If you search in the WSDL for that term, can you find it?
The operation's request and response wrappers typically get suffixed with -Request and -Response, so this might be perfectly correct.
For better or worse, our codebase relies heavily on Newtonsoft.Json. There are various type converters and similar pieces built on this framework, and it is simply not worth the effort to rewrite them using some other JSON framework (if that is even possible).
We use appsettings.json (+ other custom files) to load settings into our applications.
Typically we used to configure this like:
configurationBuilder.AddJsonFile(pathToFile, ...);
Now we need some of the converters we usually rely on when parsing regular JSON to be applied to the settings JSON as well. These are normally picked up automatically by Newtonsoft.Json, so we thought the solution would simply be to reference
Microsoft.Extensions.Configuration.NewtonsoftJson and change to:
configurationBuilder.AddNewtonsoftJsonFile(pathToFile, ...);
However, this does not appear to be the case. The converters are not invoked when we call:
var someSettings = configurationSection.Get<SomeSettings>();
If we paste the same settings into a string and parse it manually the usual Newtonsoft.Json way, it works just fine.
Our conclusion is that either we are doing something wrong (hopefully), or the "binding" step, where a configuration section is transferred into actual properties on the object, is not part of the Newtonsoft.Json configuration extension.
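One workaround we have sketched (SettingsLoader and the section name below are just placeholders, and it obviously bypasses the configuration pipeline) is to deserialize the relevant part of the file directly with Newtonsoft.Json so the converters run:

// Rough sketch of a possible workaround: parse the settings file with
// Newtonsoft.Json instead of calling configurationSection.Get<T>(), so that
// registered converters are invoked. Names here are placeholders.
using System.IO;
using Newtonsoft.Json.Linq;

public static class SettingsLoader
{
    public static T LoadSection<T>(string pathToFile, string sectionName)
    {
        JObject root = JObject.Parse(File.ReadAllText(pathToFile));
        JToken section = root[sectionName];
        return section == null ? default(T) : section.ToObject<T>();
    }
}

// var someSettings = SettingsLoader.LoadSection<SomeSettings>(pathToFile, "SomeSettings");

We would prefer to stay within the IConfiguration abstraction, though.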
Any suggestions?
From what I've read, one would generally use a global variable so that all controllers have access to some data.
Is there an "best-practice" way of accessing global data in the view templates? The use case would be for storing semi-static data like the website's brandname or location address. If in the future that data changes (ie, rebranding), it would be trivial to update the view to reflect those changes.
This thread suggests that using $rootScope is bad and that a better way would be to use a service. However, in my case this gets messy because I have to remember to include the service and create a scope variable in each controller whose template references the static data.
I've seen suggestions to store this data in a database and query for it when needed, but that advice tends to be for server-side frameworks, and I would rather not make a GET request to the server just to grab static data in Angular.
I could leave it hardcoded as I have it now, and just grep to find and update whichever templates need changing.
Is there a way to assign static data to variables once and then access it in the templates without jumping through hoops, all while following Angular best practices? Or is hardcoding perhaps the easiest/cleanest approach?
Services and factories behave like singletons: when injected into different controllers or modules, you access the same data, so they work perfectly for communication between controllers.
Each component dependent on a service gets a reference to the single instance generated by the service factory.
If you want to access that data in your templates, just put the object on your scope and display it. This gives you two-way binding automatically and is good practice for the MVC pattern.
For the difference between a service and a factory, see: angular.service vs angular.factory
But try as much as possible to avoid using global variables :D
BUT
This applies in a perfect world with perfect developers ... In practice I love using a global variable like SETTINGS (uppercase to make it sound like a constant) which includes, for example, some data required before Angular initialization.
That would work well for data like the title and the other things you have. However, you still need to add it to your scope manually (which for a title would be ... once? Yeah, seems OK).
I want to set an arbitrary attribute for rendering to JSON.
I had followed the answer in this question: how to append data to json in ruby/rails? to do
model = Model.find
model[:extra_info] = "More detail."
model.to_json
It works perfectly, but in my tests I'm getting a deprecation warning that setting arbitrary attributes is no longer supported, use attr_writer.
I tried using
model.write_attribute(:extra_info, "More detail.")
which works in unit testing, but on the server, raises an exception:
private method `write_attribute' called for Model
What's the non-deprecated, clean way to do this?
I'm aware I could set it in the JSON call with methods, as in Add virtual attribute to json output, but in this case the variable to be added is not part of the model's concern, so the model doesn't have access to the data needed to construct the extra attribute, and it would be nasty and messy to give it that access.
So what's the correct way for the controller to get this data pushed into the model so the JSON renders properly?
In the Model model, put
attr_accessor :extra_info
Then in the controller:
model.extra_info = "more detail"
Nick's answer above is the best in terms of creating well-structured, well-documented code.
Update
*It seems I was wrong on the below*
The code below still creates deprecation warnings on my development server.
In this particular case, I don't want to clutter the model up with extra accessors for very specific one-off cases, so I'm using
s.send(:write_attribute, :extra_info, "more detail")
Inside a helper method in the controller.
I'm developing a RESTful interface which is used to provide JSON data for a JavaScript application.
On the server side I use Grails 1.3.7 with GORM domain objects for persistence. I implemented a custom JSON marshaller to support marshalling the nested domain objects.
Here are sample domain objects:
class SampleDomain {
    static mapping = { nest2 cascade: 'all' }
    String someString
    SampleDomainNested nest2
}
and
class SampleDomainNested {
    String someField
}
The SampleDomain resource is published under the URL /rs/sample/, so /rs/sample/1 points to the SampleDomain object with ID 1.
When I render the resource using my custom json marshaller (GET on /rs/sample/1), I get the following data:
{
    "someString" : "somevalue1",
    "nest2" : {
        "someField" : "someothervalue"
    }
}
which is exactly what I want.
Now comes the problem: I try to send the same data to the resource /rs/sample/1 via PUT.
To bind the JSON data to the domain object, the controller handling the request calls def domain = SampleDomain.get(id) and then domain.properties = data, where data is the unmarshalled object.
The binding of the "someString" field works just fine, but the nested object is not populated from the nested data, so I get an error that the property "nest2" is null, which is not allowed.
I have already tried implementing a custom PropertyEditorSupport as well as a StructuredPropertyEditor and registering the editor for the class.
Strangely, the editor only gets called when I supply non-nested values. So when I send the following to the server via PUT (which doesn't make any sense ;) )
{
    "someString" : "somevalue1",
    "nest2" : "test"
}
at least the property editor gets called.
I looked at the code of the GrailsDataBinder. I found out that setting properties of an association seems to work by specifying the path of the association instead of providing a map, so the following works as well:
{
    "someString" : "somevalue1",
    "nest2.somefield" : "someothervalue"
}
but this doesn't help me since I don't want to implement a custom JavaScript to JSON object serializer.
Is it possible to use Grails data binding with nested maps? Or do I really have to implement that by hand for each domain class?
Thanks a lot,
Martin
Since this question got upvoted several times I would like to share what I did in the end:
Since I had some more requirements to implement, like security etc., I implemented a service layer which hides the domain objects from the controllers. I introduced a "dynamic DTO layer" which translates domain objects to Groovy maps, which can be serialized easily using the standard serializers, and which implements the updates manually. All the semi-automatic/meta-programming/command-pattern/... based solutions I tried to implement failed at some point, mostly resulting in strange GORM errors or a lot of configuration code (and a lot of frustration). The update and serialization methods for the DTOs are fairly straightforward and could be implemented very quickly. This does not introduce much duplicate code either, since you have to specify how your domain objects are serialized anyway if you don't want to publish your internal domain object structure. Maybe it's not the most elegant solution, but it was the only one that really worked for me. It also allows me to implement batch updates, since the update logic is no longer tied to the HTTP requests.
However, I must say that I don't think Grails is the tech stack best suited to this kind of application, since it makes your application very heavyweight and inflexible. My experience is that once you start doing things the framework does not support by default, it starts getting messy. Furthermore, I don't like the fact that the "repository" layer in Grails essentially only exists as part of the domain objects, which introduced a lot of problems and resulted in several "proxy services" emulating a repository layer. If you start building an application around a JSON REST interface, I would suggest either going for a very lightweight technology like node.js or, if you want to or have to stick with a Java-based stack, using the standard Spring Framework + Spring MVC + Spring Data with a nice and clean DTO layer (this is what I've migrated to and it works like a charm). You don't have to write a lot of boilerplate code and you are completely in control of what's actually happening. Furthermore, you get strong typing, which increases developer productivity as well as maintainability and which justifies the additional lines of code. And of course strong typing means strong tooling!
I started writing a blog entry describing the architecture I came up with (with a sample project, of course), but I don't have a lot of time right now to finish it. When it's done I'm going to link to it here for reference.
Hope this can serve as inspiration for people experiencing similar problems.
Cheers!
It requires you to provide the class name:
{ class:"SampleDomain", someString: "abc",
nest2: { class: "SampleDomainNested", someField:"def" }
}
I know, it requires different input than the output it produces.
As I mentioned in the comment earlier, you might be better off using the gson library.
Not sure why you wrote your own JSON marshaller when XStream is around.
See http://x-stream.github.io/json-tutorial.html
We have been very happy with XStream for our back-end (Grails-based) services, and this way you can marshal to XML or JSON, or override the default marshalling for a specific object if you like.
Jettison seems to produce more compact, less human-readable JSON, and you can run into some library collision issues, but the default internal JSON stream renderer is decent.
If you are going to publish the service to the public, you will want to take the time to return appropriate HTTP protocol responses for errors etc... ($.02)
I am using Plinqo and LINQ to SQL to implement a repository. I'd like to inform the UI of validation rules by examining metadata and acting accordingly. The problem is that the Metadata classes in Plinqo are marked internal and are nested inside the classes they decorate.
How can I get at these classes and enumerate their attributes from another assembly?
The only way I'm aware of to accomplish this is to use reflection: look up the internal nested metadata type and enumerate the rule attributes defined on it. DynamicData does a similar lookup of the attributes defined on the Metadata class, using an attribute that can be found in the generated partial class:
[System.ComponentModel.DataAnnotations.MetadataType(typeof(PetShop.Data.Category.Metadata))]
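Here is a rough sketch of that reflection lookup (the nested type name "Metadata" comes from the attribute above; the use of ValidationAttribute for the rules is an assumption about how they are declared, not Plinqo's documented API):

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Reflection;

public static class MetadataInspector
{
    // Enumerates the rule attributes declared on a property of the internal
    // nested Metadata class (e.g. PetShop.Data.Category.Metadata).
    public static IEnumerable<ValidationAttribute> GetRules(Type entityType, string propertyName)
    {
        // The metadata class is nested and internal, so it has to be resolved
        // with BindingFlags.NonPublic.
        Type metadataType = entityType.GetNestedType("Metadata", BindingFlags.Public | BindingFlags.NonPublic);
        if (metadataType == null)
            yield break;

        PropertyInfo property = metadataType.GetProperty(propertyName);
        if (property == null)
            yield break;

        foreach (ValidationAttribute rule in property.GetCustomAttributes(typeof(ValidationAttribute), true))
            yield return rule;
    }
}

// Usage (hypothetical): list the validation rules declared for Category.Name.
// foreach (var rule in MetadataInspector.GetRules(typeof(PetShop.Data.Category), "Name"))
//     Console.WriteLine(rule.GetType().Name);

An alternative is to read the MetadataTypeAttribute from the generated partial class (as DynamicData does) instead of hard-coding the nested type name.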
Thanks
-Blake Niemyjski