I have classes generated via xsd.exe (xsd.exe someschema.xsd /classes). One of the nodes is declared as an element:
<xs:element name="containsxmlelementsbeneath"/>
As the (fictitious) name implies, it looks like this:
<containsxmlelementsbeneath>
<somemore>
...
</somemore>
</containsxmlelementsbeneath>
When it is deserialized, I see in the debugger that it is of type
System.Xml.XmlNode[]
I can force the cast in the Immediate Window:
?((System.Xml.XmlNode[])elem.containsxmlelementsbeneath)[0].InnerXml
I got no IntelliSense, which made sense once I tried the snippet in my code: the XmlNode class seems to have been removed from the WinRT profile. That is nice if all you need is Windows.Data.Xml.Dom.IXmlNode, but not in this case.
How can I get the string representation of that element? Is there a way to "fix" the xsd.exe-generated output so it uses Windows.Data.Xml.Dom for serialization? (It doesn't look like it to me.)
Did I hit a fringe case they didn't think about?
Update - I tried the following (I know, an abuse of dynamic):
dynamic x = elem.containsxmlelementsbeneath;
string s = x[0].InnerXml;
This yields "The API 'System.Xml.XmlNode[]' cannot be used on the current platform."
I had a (longer) chat with another dev - after some trial and error we came up with a solution:
<xs:element ref="containsxmlelementsbeneath"/>
<xs:element name="containsxmlelementsbeneath">
</xs:element>
This creates an empty class for us (via xsd.exe)
[System.CodeDom.Compiler.GeneratedCodeAttribute("xsd", "4.0.30319.17929")]
[System.Diagnostics.DebuggerStepThroughAttribute()]
[System.Xml.Serialization.XmlTypeAttribute(AnonymousType = true)]
[System.Xml.Serialization.XmlRootAttribute(Namespace = "", IsNullable = false)]
public partial class containsxmlelementsbeneath
{
}
This has to be modified like this:
[System.Xml.Serialization.XmlRootAttribute(Namespace = "", IsNullable = false)]
public partial class containsxmlelementsbeneath : IXmlSerializable
{
[XmlIgnore]
public string Text { get; set; }
public System.Xml.Schema.XmlSchema GetSchema()
{
throw new System.NotImplementedException();
}
public void ReadXml(System.Xml.XmlReader reader)
{
Text = reader.ReadInnerXml();
}
public void WriteXml(System.Xml.XmlWriter writer)
{
throw new System.NotImplementedException();
}
}
Note that all attributes except XmlRoot have to be removed, otherwise you get reflection exceptions ("Only XmlRoot attribute may be specified for the type containsxmlelementsbeneath. Please use XmlSchemaProviderAttribute to specify schema type.").
End result: this node with all its subnodes as a plain old string. No inaccessible XmlNode or XmlElement any more...
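For completeness, a minimal usage sketch (a variation of the sample XML from the question, deserializing containsxmlelementsbeneath directly as the root for brevity):
// Deserialize the sample XML straight into the modified class.
var xml = "<containsxmlelementsbeneath><somemore>text</somemore></containsxmlelementsbeneath>";
var serializer = new System.Xml.Serialization.XmlSerializer(typeof(containsxmlelementsbeneath));
using (var reader = System.Xml.XmlReader.Create(new System.IO.StringReader(xml)))
{
    var elem = (containsxmlelementsbeneath)serializer.Deserialize(reader);
    // Text now holds "<somemore>text</somemore>" as a plain string.
    System.Diagnostics.Debug.WriteLine(elem.Text);
}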
The simplest version of the problem can be shown by creating a new ASP.NET Core Web API project. Adding this class
public class TestClass
{
public string AAAA_BBBB { get; set; } = "1";
}
And this controller method
[HttpGet("GetTestClass")]
public TestClass GetTestClass()
{
return new TestClass();
}
results in this response
{
"aaaA_BBBB": "1"
}
After experimenting, it looks like any property name containing an underscore is treated this way: every character before the FIRST underscore, except the last character in that group, gets converted to lower case.
So
AAAA_BBBB becomes aaaA_BBBB
AAAAAAAA_BBB_CCC_DDD becomes aaaaaaaA_BBB_CCC_DDD
A_BBB becomes a_BBB
AA_BB becomes aA_BB
Why is this happening and how do I fix it?
By default, ASP.NET Core serializes JSON using System.Text.Json with its camelCase naming policy (JsonNamingPolicy.CamelCase); that policy lower-cases the leading upper-case characters of a property name, which is exactly the behavior you observed. That's simply what the framework designers chose as the default.
You can decorate the property with an explicit name using [JsonPropertyName] (from System.Text.Json.Serialization) like so:
[JsonPropertyName("AAAA_BBBB")]
public string AAAA_BBBB { get; set; } = "1";
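With that attribute in place, the property name is emitted verbatim, so the response should become:
{
  "AAAA_BBBB": "1"
}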
Or, you can refer to these two links: Link1 and Link2, to stop Web API from doing this in JSON.
Update
The same effect can be achieved using the following code in Startup:
services.AddControllersWithViews()
    .AddJsonOptions(options =>
    {
        // Deserialization: match incoming property names case-insensitively.
        options.JsonSerializerOptions.PropertyNameCaseInsensitive = true;
        // Serialization: a null naming policy emits property names exactly as declared (no camelCasing).
        options.JsonSerializerOptions.PropertyNamingPolicy = null;
    });
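To see where the casing comes from, here is a minimal console sketch that calls the default camelCase policy directly and reproduces the names reported in the question:
using System;
using System.Text.Json;
class NamingPolicyDemo
{
    static void Main()
    {
        // JsonNamingPolicy.CamelCase is what ASP.NET Core applies to property names by default.
        Console.WriteLine(JsonNamingPolicy.CamelCase.ConvertName("AAAA_BBBB")); // aaaA_BBBB
        Console.WriteLine(JsonNamingPolicy.CamelCase.ConvertName("AA_BB"));     // aA_BB
    }
}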
I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how do I use ServiceStack to store POCOs in MariaDB with complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? Each of these tasks is supported on its own, but I had problems putting them all together, mainly because of the structs.
... while writing this I finally found a solution, so it may look like I answered my own question, but I would still like to hear from more experienced people, because the solution I found seems a little complicated to me. Details and two subquestions come up later in the text.
Sorry for the length and for possible misinformation caused by my limited knowledge.
Simple example. This is the final working one I ended with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;
namespace Test
{
public class MainObject
{
public int Id { get; set; }
public string StringProp { get; set; }
public SomeObject ObjectProp { get; set; }
public SomeStruct StructProp { get; set; }
}
public class SomeObject
{
public string StringProp { get; set; }
}
public struct SomeStruct
{
public string StringProp { get; set; }
public override string ToString()
{
// Unable to use .ToJson() here (ServiceStack does not serialize structs).
// Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
// => Therefore Newtonsoft.Json used.
var serializedStruct = JsonConvert.SerializeObject(this);
return serializedStruct;
}
public static SomeStruct Parse(string json)
{
// This method behaves differently for just deserialization or when part of Save().
// Details in the text.
// After playing with different options of altering the json input I ended with just taking what comes.
// After all it is not necessary, but maybe useful in other situations.
var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
return structItem;
}
}
internal class ServiceStackMariaDbStructTest
{
private readonly MainObject _mainObject = new MainObject
{
ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
StringProp = "MainObject's String",
StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
};
public ServiceStackMariaDbStructTest()
{
// This one line is needed to store complex types as blobbed JSON in MariaDB.
MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
}
public void Test_Serialization()
{
try
{
var json = _mainObject.ToJson();
if (!string.IsNullOrEmpty(json))
{
var objBack = json.FromJson<MainObject>();
}
}
catch (System.Exception ex)
{
Debug.WriteLine(ex.Message);
}
}
public void Test_Save()
{
var cs = "ConnectionStringToMariaDB";
var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
using var db = dbf.OpenDbConnection();
db.DropAndCreateTable<MainObject>();
try
{
db.Save(_mainObject);
var dbObject = db.SingleById<MainObject>(_mainObject.Id);
}
catch (System.Exception ex)
{
Debug.WriteLine(ex.Message);
}
}
}
}
What I (think I) know / have tried, which at first didn't help me solve it:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it up as proposed: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); (https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers) => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs out of the box; they need to be treated in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs it shall be TStruct.ToString() (the same as in a) and static TStruct.Parse().
Subquestion #1: which one is the right one? For me, ParseJson() was never called, Parse() was. Is this a documentation issue, or is ParseJson() used in a different situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
Through the saving process ToString() and Parse() were called. In Parse, incoming JSON looked this way:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but the JSON coming into Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: Why is the JSON "double-escaped" in Parse() during deserialization?
I tried to solve structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the JSON input to JsonConvert.DeserializeObject(json) used during Save was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn even though JsConfig.SerializeFn was still set (maybe by design, I don't judge). Results:
Save failed, but the JSON input to DeserializeFn called during Save changed: now it was JSV-like "{StringProp:SomeStruct's String}", still not deserializable as JSON.
De/serialization worked.
Then (while writing this I was still without a solution) I found the JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there a simpler solution? I would be very glad if someone pointed me in a better direction.
Two kinds of solution would be acceptable (both useful, depending on the circumstances):
if I am the TStruct owner: just plain TStruct.ToString() and static TStruct.Parse() supporting out-of-the-box de/serialization and DB storage by ServiceStack (without different inputs arriving in Parse()).
if I am a consumer of a TStruct that has no JSON support and I have no access to its code: so far I have not found a way; if ToString() is not implemented, Save to DB does not work. It would be nice if the JsConfig serialize functions were enough both for de/serialization and when used while saving to the DB.
And the best solution would not employ another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe something like JsConfig.ShallProcessStructs = true; (WARNING: just an idea, not an existing setting as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc.). Overriding ToString() and providing a static Parse() method (and the struct's constructor) lets you control the serialization/deserialization of custom structs.
Docs have been corrected: structs use Parse() whilst classes use ParseJson()/ParseJsv().
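As a minimal illustration of that convention, here is a sketch of a made-up struct (Temperature) that round-trips as a single scalar value with just ToString() and a static Parse():
using System.Globalization;
public struct Temperature
{
    public double Celsius { get; set; }
    // ServiceStack serializes the struct as this single string value...
    public override string ToString() =>
        Celsius.ToString(CultureInfo.InvariantCulture);
    // ...and calls the static Parse() to rebuild it when deserializing.
    public static Temperature Parse(string value) =>
        new Temperature { Celsius = double.Parse(value, CultureInfo.InvariantCulture) };
}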
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
public To DeserializeFromString<To>(string serializedText) =>
JsonConvert.DeserializeObject<To>(serializedText);
public object DeserializeFromString(string serializedText, Type type) =>
JsonConvert.DeserializeObject(serializedText, type);
public string SerializeToString<TFrom>(TFrom from) =>
JsonConvert.SerializeObject(from);
}
MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();
If I have a view model like this:
public class MyModel{
public DateTime? StartDate {get;set;}
}
And on a view an input tag is used with an asp-for tag helper like so:
<input asp-for="StartDate" />
The default html that is generated by this is
<input type="datetime" id="StartDate" name="StartDate" value="" />
But what I want it to generate is html that looks like this:
<input type="datetime" id="startDate" name="startDate" value="" />
How can I make the asp-for input tag helper generate camel case names like above without having to make my model properties camelCase?
After studying the code that @Bebben posted and the link provided with it, I continued to dig into the ASP.NET Core source code. I found that the designers of ASP.NET Core provided some extensibility points that can be leveraged to achieve lower camelCase id and name values.
To do it, we need to implement our own IHtmlGenerator, which we can do by creating a custom class that inherits from DefaultHtmlGenerator. Then, on that class, we need to override the GenerateTextBox method to fix the casing. Alternatively, we can override the GenerateInput method to fix the casing of the name and id attribute values for all input fields (not just text inputs), which is what I chose to do. As a bonus, I also override the GenerateLabel method so the label's for attribute also uses the custom casing.
Here's the class:
using Microsoft.AspNetCore.Antiforgery;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Internal;
using Microsoft.AspNetCore.Mvc.ModelBinding;
using Microsoft.AspNetCore.Mvc.Rendering;
using Microsoft.AspNetCore.Mvc.Routing;
using Microsoft.AspNetCore.Mvc.ViewFeatures;
using Microsoft.Extensions.Options;
using System.Collections.Generic;
using System.Text.Encodings.Web;
namespace App.Web {
public class CustomHtmlGenerator : DefaultHtmlGenerator {
public CustomHtmlGenerator(
IAntiforgery antiforgery,
IOptions<MvcViewOptions> optionsAccessor,
IModelMetadataProvider metadataProvider,
IUrlHelperFactory urlHelperFactory,
HtmlEncoder htmlEncoder,
ClientValidatorCache clientValidatorCache) : base
(antiforgery, optionsAccessor, metadataProvider, urlHelperFactory,
htmlEncoder, clientValidatorCache) {
//Nothing to do
}
public CustomHtmlGenerator(
IAntiforgery antiforgery,
IOptions<MvcViewOptions> optionsAccessor,
IModelMetadataProvider metadataProvider,
IUrlHelperFactory urlHelperFactory,
HtmlEncoder htmlEncoder,
ClientValidatorCache clientValidatorCache,
ValidationHtmlAttributeProvider validationAttributeProvider) : base
(antiforgery, optionsAccessor, metadataProvider, urlHelperFactory, htmlEncoder,
clientValidatorCache, validationAttributeProvider) {
//Nothing to do
}
protected override TagBuilder GenerateInput(
ViewContext viewContext,
InputType inputType,
ModelExplorer modelExplorer,
string expression,
object value,
bool useViewData,
bool isChecked,
bool setId,
bool isExplicitValue,
string format,
IDictionary<string, object> htmlAttributes) {
expression = GetLowerCamelCase(expression);
return base.GenerateInput(viewContext, inputType, modelExplorer, expression, value, useViewData,
isChecked, setId, isExplicitValue, format, htmlAttributes);
}
public override TagBuilder GenerateLabel(
ViewContext viewContext,
ModelExplorer modelExplorer,
string expression,
string labelText,
object htmlAttributes) {
expression = GetLowerCamelCase(expression);
return base.GenerateLabel(viewContext, modelExplorer, expression, labelText, htmlAttributes);
}
private string GetLowerCamelCase(string text) {
if (!string.IsNullOrEmpty(text)) {
if (char.IsUpper(text[0])) {
return char.ToLower(text[0]) + text.Substring(1);
}
}
return text;
}
}
}
Now that we have our CustomHtmlGenerator class, we need to register it in the IoC container in place of the DefaultHtmlGenerator. We can do that in the ConfigureServices method of Startup.cs via the following two lines:
//Replace DefaultHtmlGenerator with CustomHtmlGenerator
services.Remove<IHtmlGenerator, DefaultHtmlGenerator>();
services.AddTransient<IHtmlGenerator, CustomHtmlGenerator>();
Pretty cool. Not only have we solved the id and name casing issue on the input fields, but by implementing and registering our own custom IHtmlGenerator we have opened the door to all kinds of HTML customization.
I'm starting to really appreciate the power of a system built around an IoC container and default classes with virtual methods. The level of customization available with little effort under such an approach is pretty amazing.
Update
@Gup3rSuR4c pointed out that my services.Remove call must be an extension method that's not included in the framework. I checked, and yep, that's true. So, here is the code for that extension method:
using System.Linq;
using Microsoft.Extensions.DependencyInjection;
public static class IServiceCollectionExtensions {
    public static void Remove<TServiceType, TImplementationType>(this IServiceCollection services) {
        // Find the first registration matching both the service type and the implementation type.
        var serviceDescriptor = services.First(s => s.ServiceType == typeof(TServiceType) &&
                                                    s.ImplementationType == typeof(TImplementationType));
        services.Remove(serviceDescriptor);
    }
}
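A possible alternative is the framework's built-in Replace extension (from Microsoft.Extensions.DependencyInjection.Extensions), which avoids the custom method altogether:
using Microsoft.Extensions.DependencyInjection.Extensions;
// Replace(...) removes the first registered IHtmlGenerator (the DefaultHtmlGenerator)
// and registers CustomHtmlGenerator in its place.
services.Replace(ServiceDescriptor.Transient<IHtmlGenerator, CustomHtmlGenerator>());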
The simplest way to do this is to just write
<input asp-for="StartDate" name="startDate" />
Or do you want to have it generated completely automatically in camel case, for the whole application?
To do that, it seems like you have to implement your own InputTagHelpers in Microsoft.AspNetCore.Mvc.TagHelpers.
Here is the method where the name is generated:
private TagBuilder GenerateTextBox(ModelExplorer modelExplorer, string inputTypeHint, string inputType)
{
var format = Format;
if (string.IsNullOrEmpty(format))
{
format = GetFormat(modelExplorer, inputTypeHint, inputType);
}
var htmlAttributes = new Dictionary<string, object>
{
{ "type", inputType }
};
if (string.Equals(inputType, "file") && string.Equals(inputTypeHint, TemplateRenderer.IEnumerableOfIFormFileName))
{
htmlAttributes["multiple"] = "multiple";
}
return Generator.GenerateTextBox(
ViewContext,
modelExplorer,
For.Name,
value: modelExplorer.Model,
format: format,
htmlAttributes: htmlAttributes);
}
(The above code is from https://github.com/aspnet/Mvc/blob/dev/src/Microsoft.AspNetCore.Mvc.TagHelpers/InputTagHelper.cs, Apache License, Version 2.0, Copyright .NET Foundation)
The relevant line is "For.Name". The name is passed into some other methods, and the one that ultimately produces the final name lives in a static class (Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.NameAndIdProvider), so there is nothing we can easily plug into.
I'm just starting to switch to memcached and am currently testing with it.
I have 2 objects: one I created myself and put [Serializable] on (let's call this Object1); the other is generated from the LINQ DBML (Object2).
I tried caching List<Object1> in memcached and it works just fine, like a charm: everything is cached and loaded properly.
But then I moved on to the LINQ object. Now when I try to add List<Object2> to memcached it does not work; it is not added to memcached at all, no key was added.
I moved on and changed the Serialization Mode to Unidirectional and did the add again; still no luck.
Is there any way to make this work?
Here is the simple test I just wrote, using the MemcachedProvider from CodePlex, to demonstrate:
public ActionResult Test()
{
var returnObj = DistCache.Get<List<Post>>("testKey");
if (returnObj == null)
{
DataContext _db = new DataContext();
returnObj = _db.Posts.ToList();
DistCache.Add("testKey", returnObj, new TimeSpan(29, 0, 0, 0));
_db.Dispose();
}
return Content(returnObj.First().TITLE);
}
This is from memcached; no STORE was called:
> NOT FOUND _x_testKey
>532 END
<528 get _x_testKey
> NOT FOUND _x_testKey
>528 END
<516 get _x_testKey
> NOT FOUND _x_testKey
>516 END
And in my SQL profiler, it ran 3 queries for the 3 test runs => which proves the object coming back from memcached is null, so it queries the database again.
It looks like the default implementation (DefaultTranscoder) is to use BinaryFormatter; the "unidirectional" stuff is an instruction to a different serializer (DataContractSerializer), and doesn't add [Serializable].
(Note: I've added a memo to myself to try to write a protobuf-net transcoder for memcached; that would be cool and would fix most of this for free)
I haven't tested, but a few options present themselves:
write a different transcoder implementation that detects [DataContract] and uses DataContractSerializer, and hook this transcoder
add [Serializable] to your types via a partial class (I'm not convinced this will work due to the LINQ field types not being serializable)
add an ISerializable implementation in a partial class that uses DataContractSerializer
like 3, but using protobuf-net, which a: works with "unidirectional", and b: is faster and smaller than DataContractSerializer
write a serializable DTO and map your types to that
The last is simple but may add more work; a sketch of it follows below.
I'd be tempted to look at the 3rd option first, as the 1st involves rebuilding the provider; the 4th option would also definitely be on my list of things to test.
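For the last option, a rough sketch of what the DTO route could look like (only TITLE is known from your test code; add the other columns you need):
// Serializable DTO that mirrors the LINQ-generated Post entity.
[Serializable]
public class PostDto
{
    public string TITLE { get; set; }
}
// Map before caching, and cache the DTOs instead of the LINQ entities:
var dtos = _db.Posts.Select(p => new PostDto { TITLE = p.TITLE }).ToList();
DistCache.Add("testKey", dtos, new TimeSpan(29, 0, 0, 0));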
I struggled with 3, due to the DCS returning a different object during deserialization; I switched to protobuf-net instead, so here's a version that shows adding a partial class to your existing [DataContract] type to make it work with BinaryFormatter. Actually, I suspect (with evidence) this will also be much more efficient than raw [Serializable], too:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
using ProtoBuf;
/* DBML generated */
namespace My.Object.Model
{
[DataContract]
public partial class MyType
{
[DataMember(Order = 1)]
public int Id { get; set; }
[DataMember(Order = 2)]
public string Name { get; set; }
}
}
/* Your extra class file */
namespace My.Object.Model
{
// this adds **extra** code into the existing MyType
[Serializable]
public partial class MyType : ISerializable {
public MyType() {}
void ISerializable.GetObjectData(SerializationInfo info, StreamingContext context) {
Serializer.Serialize(info, this);
}
protected MyType(SerializationInfo info, StreamingContext context) {
Serializer.Merge(info, this);
}
}
}
/* quick test via BinaryFormatter */
namespace My.App
{
using My.Object.Model;
static class Program
{
static void Main()
{
BinaryFormatter bf = new BinaryFormatter();
MyType obj = new MyType { Id = 123, Name = "abc" }, clone;
using (MemoryStream ms = new MemoryStream())
{
bf.Serialize(ms, obj);
ms.Position = 0;
clone = (MyType)bf.Deserialize(ms);
}
Console.WriteLine(clone.Id);
Console.WriteLine(clone.Name);
}
}
}
I'm really stumped on this incredibly simple mapping. It looks just like one of the examples even. If I comment out the internal structure, it'll run the binding compiler successfully. If I put the internal structure back in, it fails. Note that the internal structure is just defining the XML. This is basically example5 of the JIBX tutorial examples.
<binding>
<mapping name="RequestTransaction" class="TransactionRequest">
<value name="version" set-method="setVersion" get-method="getVersion" style="attribute" />
<structure name="transHeader">
<value name="requestCount" set-method="setRequestCount" get-method="getRequestCount"/>
</structure>
</mapping>
<binding>
Then I get the following error on the jibx compile:
Error: Error during validation: null; on mapping element at (line 2, col 97, in jibx-binding.xml)
I'm absolutely stumped and out of ideas. Google shows nothing useful.
The <structure> is arguably the most important concept in JiBX binding because it allows you to map arbitrary XML to your Java classes without forcing you to create bloated and ugly layers of nested Java objects and classes to match the XML design.
In this case your binding declares that you have an XML element named <transHeader> that will not be present in your Java class.
With some slight fixes to your XML format, your binding works perfectly. I assume the fact that your binding has two <binding> open tags, rather than an open and close <binding></binding>, is a typo, because you said you got it to work without the structure. Also add <?xml version="1.0"?> at the top of your binding file. Those two XML mods allow the JiBX 1.2 binding compiler to work with the following Java class:
(Note: you didn't provide the Java class this binding is for, so I had to reconstruct it from the info you put in the binding file. The obvious side effect is that I reconstructed a class that will work with this binding. But the simple fact is that a JiBX binding, by design, contains all the info you need to know about the class and the XML.)
public class TransactionRequest {
private String version;
private int requestCount;
public void setVersion(String ver) {
version = ver;
}
public String getVersion() {
return version;
}
public void setRequestCount(int count) {
requestCount = count;
}
public int getRequestCount() {
return requestCount;
}
}
Compile the class, then run the binding compiler with:
>java -jar jibx-bind.jar jibx-binding.xml
To test it I used the following sample.xml:
(Note: you also didn't provide the XML you are trying to map so again I created a sample based on what you did provide)
<?xml version="1.0"?>
<RequestTransaction version="0.1">
<transHeader>
<requestCount>3</requestCount>
</transHeader>
</RequestTransaction>
Running the test uses the following code:
public static void main(String[] argz) {
String fileName = "./sample.xml";
IBindingFactory bfact = null;
IUnmarshallingContext uctx = null;
TransactionRequest sample = null;
try {
bfact = BindingDirectory.getFactory(TransactionRequest.class);
uctx = bfact.createUnmarshallingContext();
InputStream in = new FileInputStream(fileName);
sample = (TransactionRequest)uctx.unmarshalDocument(in, null);
System.out.println(sample.getRequestCount());
System.out.println(sample.getVersion());
}
catch (Exception e) {
e.printStackTrace();
}
}
And it runs successfully.
It's been a while now, but I found it was related to inheritance. I needed to give mappings for everything in the inheritance tree, including interfaces as I recall.
I ended up creating a wrapper object, which I've found seems to be the easiest way to use JiBX in general. Trying to map a true domain class sends tendrils into every class that class touches, and I have to unjar everything so JiBX can find the classes, including 3rd-party libs.