Interactive T4 Templates - configuration

In most of my MVC projects I use a set of T4 templates that create Managers (think repositories), ViewModels and Extensions - utility extension methods such as ToModel(), ToViewModel() and ToSelectList(). So far so good: an enormous amount of basic 'plumbing' code is now written for me.
What I'd really like is the ability to configure variables used within those templates from an external file, and have the template read that file when it executes.
I know I can run one T4 template from within another, but I can't find a way to pull configuration from a separate file.
Presently, I include an 'Entity' table in my database and use that for configuration. It works, but it feels dirty to keep configuration in the database.

T4 is just C#/VB.NET code in the end, so you can use pretty much any library you want. If you want an external configuration file, you could use Json.NET and a simple JSON file in your project. At the start of your template, use the framework's file I/O to read the JSON file's contents, pass that to Json.NET, and then extract the parameters you need. The most common way to use Json.NET is to serialize and deserialize classes, but it also gives you access to a lower-level, dictionary-like JSON object (JObject) that you can query with LINQ to pull out any data you need.
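A minimal sketch of what the top of such a template could look like (the config file name, its properties, and the Json.NET assembly path are assumptions - adjust them to your project):

<#@ template hostspecific="true" language="C#" #>
<#@ assembly name="System.Core" #>
<#@ assembly name="$(SolutionDir)packages\Newtonsoft.Json\lib\net45\Newtonsoft.Json.dll" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Linq" #>
<#@ import namespace="Newtonsoft.Json.Linq" #>
<#
    // hostspecific="true" exposes Host, which resolves paths
    // relative to the template file itself.
    var configPath = Host.ResolvePath("codegen.config.json");
    var config = JObject.Parse(File.ReadAllText(configPath));

    // Illustrative settings; use whatever shape suits your templates.
    var modelNamespace = (string)config["modelNamespace"];
    var excludedTables = config["excludedTables"].Select(t => (string)t).ToList();

    // ... use excludedTables when iterating your schema ...
#>
namespace <#= modelNamespace #>
{
    // ... generated Managers / ViewModels / Extensions ...
}

All of your templates could then share the same codegen.config.json, replacing the 'Entity' table you're using today.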
But remember there is always more than one way to solve a problem, and this is one I have been trying to solve for a while. My preferred solution is an extension I created called T4 Awesome. It takes a totally different approach to using T4 for scaffolding inside Visual Studio: it adds multiple tool windows and context menus around the IDE to make managing and running T4 templates faster and easier, and it has a dynamic UI that lets you define simple parameters, pass them to your templates, and take much more control over the final output file locations. Feel free to check it out. Full disclosure: I charge for this extension, but there is a free community version that should be able to do what you want.

Related

Boomi integration - Dynamically inject mapping information

We are in the process of evaluating integration solutions and comparing Mule and Boomi.
The use case is to read an Excel file, map the columns to a pre-defined set of JSON attributes, and then use the JSON to insert records into a database. The mapping may vary from one Excel template to another: the column names in one template may differ from the next.
How do I inject the mapping information (source vs. target) from outside the integration flow?
Note: in Mule, I'm able to do this with a mapping variable (whose value is JSON) that I inject using Mule's DataWeave language.
Boomi's mapping component is static in terms of structure, but more versatile solutions are certainly possible.
The data processor component opens up Groovy, JavaScript, and XSLT 3.0 as options. These are Turing-complete languages that can be used to bend Boomi to almost any outcome.
You could make the Boomi UI available to those who need to write the maps in JSON; it's a pretty simple interface to learn. Using a route component, one "parent" process could govern a child process per template, with a map for each template. Such a solution would be pretty easy to build and run, and it would allow the template-specific processes to be deployed independently of the "parent".
You could also map to a generic columnar structure and then write a SQL procedure that dynamically alters the target columns.
I've come across attempts to do what you're describing (using neither Boomi nor MuleSoft) which were tragic failures: https://www.zdnet.com/article/uk-rural-payments-agency-rpa-it-failure-and-gross-incompetence-screws-farmers/ I draw your attention to the NAO's points:
ensure the system specifications retain a realistic level of flexibility
and
bespoke software is costly to develop, needs to be thoroughly tested, and takes more time to implement
The general goal of a requirement like yours is usually to make transformation/ETL available to "non-programmers", which denies the reality that delivering an outcome takes many more skills than just "programming".

Running XSLT on servers?

Currently I am working on a project for a client that compares two XML files, generates an XML file listing the differences (i.e. whether a part in an inventory was <Added>, <Deleted>, or <Modified>), and displays a report in HTML.
I have three transforms that convert large vendor-specific XML files into simple generic XML files (schema defined). These generic XML files are then transformed into one generic XML file that shows the differences, and that in turn is transformed into a report.html for display to the user.
Presently, for testing, I invoke a .bat file that runs all three transforms (using Saxon8.jar). My question: is it possible to put these transforms on a server and create an HTML page with a one-click action that lets the user upload the vendor-specific XML files, runs the transforms, and displays the generated HTML file to the user?
You haven't specified whether you'll be using PHP, Java, or ASP.NET; however, the functionality you're looking for is possible in all three cases. Your backend web app needs a mechanism to accept the file uploaded by the user, save it to a work folder, run the necessary transformation using your chosen language (Java, C#, PHP, etc.), and then write back the HTML.
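In C#, for example, the core of it is only a few lines with the framework's built-in XslCompiledTransform - a rough sketch, with the stylesheet name being illustrative. One caveat: XslCompiledTransform only supports XSLT 1.0, so if your stylesheets rely on XSLT 2.0 features from Saxon, you'd keep Saxon and drive it from a Java backend instead.

using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class ReportTransformer
{
    // Compile the stylesheet once at startup; Transform() is
    // thread-safe after Load(), so one instance serves all requests.
    private static readonly XslCompiledTransform Transform = LoadTransform();

    private static XslCompiledTransform LoadTransform()
    {
        var t = new XslCompiledTransform();
        t.Load("vendor-to-generic.xslt"); // illustrative stylesheet name
        return t;
    }

    // Apply the stylesheet to an uploaded XML stream and return the HTML.
    // Chaining your three transforms would mean piping the output of one
    // call into the input of the next.
    public static string ToHtml(Stream uploadedXml)
    {
        using (var reader = XmlReader.Create(uploadedXml))
        using (var writer = new StringWriter())
        {
            Transform.Transform(reader, null, writer);
            return writer.ToString();
        }
    }
}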
Is it possible? Yes.
To do it you'd typically use some server-side technology (php, ruby, java) to perform the transforms.
But browser-side XSLT is possible, too.
Apache Cocoon is a powerful XML processing engine.
If you're just doing this one job, then coding a Java servlet to do it is not too difficult. If you're doing lots of similar things, a framework like Cocoon or Orbeon will save you effort in the long run.

Generating custom files from a DBML file?

I've been having a look at making changes to the partial classes generated from a DBML file. I was reading up on the sqlmetal.exe tool, but it appears you can't do much customisation of what it actually spits out.
I want to change the file for serialization purposes: I'd like to add the DataMember attribute to specific properties in the generated partial classes.
Is this possible to do using the sqlmetal.exe tool or would I need to write my own tool for the file generation?
You could check out T4 templates or CodeSmith for file generation.
No, it is not. You can accomplish this with Entity Framework:
http://blogs.msdn.com/jkowalski/archive/2008/05/12/transparent-lazy-loading-for-entity-framework-part-1.aspx
The code written by Jaroslaw Kowalski works much the same way that LINQ to SQL does.
It has some issues, but you can do anything with it, because you have the source. I'm going to publish my version soon (with support for stored procedures, an improved databinding experience, and many other useful features).
If you want the DataContract and DataMember attributes to be added, simply change the "Serialization Mode" property in the L2S designer's DataContext properties from "None" to "Unidirectional". All entity classes will then be data contracts, and their members will be data members.
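With that setting, the generated entities come out decorated roughly like this (a simplified illustration; the real designer output also includes backing fields, change-notification plumbing, and column mappings):

using System.Runtime.Serialization;

[DataContract]
public partial class Customer
{
    [DataMember(Order = 1)]
    public int CustomerID { get; set; }

    [DataMember(Order = 2)]
    public string Name { get; set; }
}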
The upcoming beta version of Entity Developer will contain highly customizable T4-like templates for code generation.
We have also added functionality to split the generated code into separate files.

Testing and mocking with Flex

I am developing a "dumb" front-end: an AIR application that interacts with a "smart" LiveCycle server. There are currently about 20 request/response pairs in the application. For many reasons (testing, developing outside the corporate network, etc.), we have several XML files of fake data; if a certain configuration flag is set, the files are loaded, and a specific file is parsed and used to create a mock response. Each XML file is a set of responses for a different situation, all internally consistent. We currently have about 10 XML files, each corresponding to a different situation we can run into, and this will probably grow to 30-50 XML files.
The current system was developed by me during one of those 90-hour-week release cycles, when we were under duress because LiveCycle was down again and we had a deadline to meet. Most of the minor crap has been cleaned up.
The fake data is in an object called FakeData, with properties like customerType1:XML, customerType2:XML, overdueCustomer1:XML, etc. Then in the FakeData constructor, all of the properties are set like this:
customerType1 = FileUtil.loadXML(File.applicationDirectory.resolvePath("fakeData/customerType1.xml"));
And whenever you need some fake data (this happens in special FakeDelegates that extend the real LiveCycle Delegates), you get it from an instance of FakeData.
This is awful, for many reasons, but it works. One embarrassing part is that every time you create an instance of FakeData, it reloads all the XML files.
I'm trying to figure out if there's a design pattern that is not Singleton that can handle this more elegantly. The constraints are:
No global instances can be required (currently, all the code dealing with the fake data, including the fake delegates, is pulled out of production builds without any side-effects, and it needs to stay that way). This puts the Factory pattern out of the running.
It can handle multiple objects using the XML data without performance issues.
The XML files are read centrally so that the other code doesn't have to know where the XML files are, and so some preprocessing can be done (like creating a map of certain tag values and the associated XML file).
Design patterns, or other architecture suggestions, would be greatly appreciated.
Take a look at ASMock, which was developed by a good friend of mine (and a member here, Richard Szalay) and is based on .NET's Rhino Mocks. We've used it in several production environments now, so I can vouch for its stability.
You should be able to get rid of the fake tests (which are really more like integration tests) by using mock objects instead.
Wouldn't it make more sense to do traditional mocking with a mocking framework? Depending on your implementation, it might be possible to set up the expectations by reading the fake-data XML files.
Here is a Google Code project that offers mocking for ActionScript.
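Coming back to the original constraints: one shape that satisfies them is a single loader object, created by whatever wires up the fake delegates and passed into them, that lazily parses and caches each file. Sketched here in C# purely for illustration - the idea ports directly to ActionScript, and all names are made up:

using System.Collections.Generic;
using System.IO;
using System.Xml.Linq;

// Created once, only in builds that include the fake delegates, and
// handed to each of them -- so nothing global is required, and the
// whole thing disappears from production builds along with the delegates.
public class FakeDataLoader
{
    private readonly string _root;
    private readonly Dictionary<string, XDocument> _cache =
        new Dictionary<string, XDocument>();

    public FakeDataLoader(string root)
    {
        _root = root;
    }

    // Each file is read and parsed at most once, no matter how many
    // fake delegates ask for it; this is also the central place for
    // preprocessing, e.g. building a map of tag values to source files.
    public XDocument Get(string name)
    {
        XDocument doc;
        if (!_cache.TryGetValue(name, out doc))
        {
            doc = XDocument.Load(Path.Combine(_root, name + ".xml"));
            _cache[name] = doc;
        }
        return doc;
    }
}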

.NET template files

I have an application that generates a couple of different mails. These mails are currently built up using a StringBuilder that produces an HTML string for the mail content.
This approach is getting messy: the code is objects mixed with HTML, and so on. What I'd like is a template similar to the ones used in, for example, MVC, and then to use that template's output as the mail body.
Can this be done using, for example, T4, and how would that work? Or should I use another approach?
The templates don't have to be editable at runtime even if that would be nice.
It can be done with T4. It would work exactly the way it does in MVC.
You should consider getting the Clarius T4 editor: Visual Studio does not come with IntelliSense for T4 out of the box, and the Clarius editor provides it.
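Concretely, the usual way is a preprocessed T4 template: set the .tt file's Custom Tool to TextTemplatingFilePreprocessor and Visual Studio generates a plain class with a TransformText() method you can call at runtime. A rough sketch - the MailTemplate name and its CustomerName property are assumptions:

using System.Net.Mail;

// Partial class extending the class generated from MailTemplate.tt
// (Custom Tool = TextTemplatingFilePreprocessor) so the template can
// reference model data via <#= CustomerName #>.
public partial class MailTemplate
{
    public string CustomerName { get; set; }
}

public static class MailFactory
{
    public static MailMessage BuildWelcomeMail(string customerName, string to)
    {
        var template = new MailTemplate { CustomerName = customerName };
        return new MailMessage("noreply@example.com", to)
        {
            Subject = "Welcome",
            Body = template.TransformText(), // renders the template to an HTML string
            IsBodyHtml = true
        };
    }
}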
Have you considered just using XSLT? You can easily create HTML documents from raw XML by applying an XSL Transform in .Net
We store XSL templates in the database, and use a simple NAME/VALUE pair approach to populate the templates in code. It is very effective and flexible, and the output can be piped to the browser easily. Alternatively you could just store the individual templates in files on the server and load them up that way.
Either way, the amount of code to implement this is relatively simple, and .Net has many classes in place to facilitate this. If you want an example of how we implemented this, leave a comment and I will append some code examples to this answer.
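As a rough sketch of that name/value idea (not their production code; the empty input document and parameter handling are illustrative), the framework's XsltArgumentList exposes each pair to the stylesheet as an <xsl:param>:

using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class MailRenderer
{
    // Render an XSL template to HTML, exposing each name/value pair
    // to the stylesheet as an <xsl:param>.
    public static string Render(string templatePath, IDictionary<string, string> values)
    {
        var transform = new XslCompiledTransform();
        transform.Load(templatePath);

        var args = new XsltArgumentList();
        foreach (var pair in values)
            args.AddParam(pair.Key, "", pair.Value);

        // The stylesheet drives the output, so an empty document is
        // enough as the transform input here.
        using (var input = XmlReader.Create(new StringReader("<mail/>")))
        using (var output = new StringWriter())
        {
            transform.Transform(input, args, output);
            return output.ToString();
        }
    }
}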