BIML SSIS ScriptTask as a data source - Error with OutputBuffer - ssis

I am getting the error below when trying to generate a package through BIML using a ScriptTask as a data source. I have a large (circa 5 GB) XML file to load and wanted to use a StreamReader to get the data into the database.
'Output0Buffer' does not contain a definition for 'PORTF_LIST' and no extension method 'PORTF_LIST' accepting a first argument of type 'Output0Buffer' could be found (are you missing a using directive or an assembly reference?).
This is occurring for each column. The columns are dynamic and come from a separate method in a C# class that reads the DACPAC, so the names and casing should be the same everywhere.
A sample of the file is below:
<ANALYTICS>
<INSTRUMENTS ASOF_DATE="3/31/2017" CREATE_DATE="4/2/2017" RECORDS="3763">
<INSTRUMENT>
<PORTF_LIST>XX1245897</PORTF_LIST>
<PRT_FULL_NAME>Convertible Bonds</PRT_FULL_NAME>
<ISIN>11803384</ISIN>
</INSTRUMENT>
</INSTRUMENTS>
</ANALYTICS>
The output buffer is defined as below (there are 250-odd columns, but all follow the same pattern):
<OutputBuffers>
<OutputBuffer Name="Output0" IsSynchronous="false">
<Columns>
<Column Name="PORTF_LIST" DataType="String" Length="255"/>
<Column Name="PRT_FULL_NAME" DataType="String" Length="255"/>
<Column Name="ISIN" DataType="String" Length="255"/>
</Columns>
</OutputBuffer>
</OutputBuffers>
The script task code where I am trying to add to the buffer is below:
<## property name="Elements" type="String" #>
<## property name="Columns" type="String" #>
<## property name="BufferColumns" type="String" #>
<## property name="RootElement" type="String" #>
<ScriptComponentProject ProjectCoreName="SC_eb1debcd2374468ebccbbfad4fbe5976" Name="XmlSource">
<AssemblyReferences>
<AssemblyReference AssemblyPath="Microsoft.SqlServer.DTSPipelineWrap" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.DTSRuntimeWrap" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.PipelineHost" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.TxScript" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ManagedDTS.dll" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ScriptTask.dll" />
<AssemblyReference AssemblyPath="System.dll" />
<AssemblyReference AssemblyPath="System.AddIn.dll" />
<AssemblyReference AssemblyPath="System.Data.dll" />
<AssemblyReference AssemblyPath="System.Windows.Forms.dll" />
<AssemblyReference AssemblyPath="System.Xml.dll" />
<AssemblyReference AssemblyPath="System.Xml.Linq.dll" />
<AssemblyReference AssemblyPath="System.Core.dll" />
</AssemblyReferences>
<OutputBuffers>
<!--
Define what your buffer is called and what it looks like
Must set IsSynchronous as false. Otherwise it is a transformation
(one row enters, one row leaves) and not a source.
-->
<OutputBuffer Name="Output0" IsSynchronous="false">
<Columns>
<#=BufferColumns#>
</Columns>
</OutputBuffer>
</OutputBuffers>
<Files>
<File Path="Properties\AssemblyInfo.cs">
using System.Reflection;
using System.Runtime.CompilerServices;
[assembly: AssemblyTitle("XmlSource")]
[assembly: AssemblyDescription("Script Component as source")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("XmlSource")]
[assembly: AssemblyCopyright("Copyright © 2017")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
[assembly: AssemblyVersion("1.0.*")]
</File>
<File Path="main.cs">
<![CDATA[
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.Security;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Xml;
using System.Xml.Linq;
using System.Windows.Forms;
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
public override void PreExecute()
{
base.PreExecute();
}
public override void PostExecute()
{
base.PostExecute();
}
public string sourceFile = Dts.Variables["User::FileName"].Value.ToString();
public override void CreateNewOutputRows()
{
foreach (var myXmlData in (
from elements in StreamReader(sourceFile, "INSTRUMENT")
select new
{
PORTF_LIST = elements.Element("PORTF_LIST").Value,
PRT_FULL_NAME = elements.Element("PRT_FULL_NAME").Value,
ISIN = elements.Element("ISIN").Value
}
))
{
try
{
Output0Buffer.AddRow();
Output0Buffer.PORTF_LIST = myXmlData.PORTF_LIST;
Output0Buffer.PRT_FULL_NAME = myXmlData.PRT_FULL_NAME;
Output0Buffer.ISIN = myXmlData.ISIN;
}
catch (Exception e)
{
string errorMessage = string.Format("Data retrieval failed: {0}", e.Message);
bool cancel;
ComponentMetaData.FireError(0, ComponentMetaData.Name, errorMessage, string.Empty, 0, out cancel);
}
}
}
public static IEnumerable<XElement> StreamReader(String filename, string elementName)
{
// Create an XML reader for this file.
using (XmlReader reader = XmlReader.Create(filename))
{
reader.MoveToContent(); // will not advance reader if already on a content node; if successful, ReadState is Interactive
reader.Read(); // this is needed, even with MoveToContent and ReadState.Interactive
while(!reader.EOF && reader.ReadState == ReadState.Interactive)
{
if(reader.NodeType == XmlNodeType.Element && reader.Name.Equals(elementName))
{
// this advances the reader...so it's either XNode.ReadFrom() or reader.Read(), but not both
var matchedElement = XNode.ReadFrom(reader) as XElement;
if(matchedElement != null)
yield return matchedElement;
}
else
reader.Read();
}
reader.Close();
}
}
}
]]>
</File>
</Files>
<ReadOnlyVariables>
<Variable Namespace="User" DataType="String" VariableName="FileName" />
</ReadOnlyVariables>
<ReadWriteVariables>
</ReadWriteVariables>
</ScriptComponentProject>
I've checked the code in a console app and it reads the XML file fine, but no luck with the BIML. There are 250-odd columns, so I am trying to avoid doing this manually. If you have any ideas what I am doing wrong, I'd really appreciate it!

It seems that the script task does not like underscores in the OutputBuffer column names.
I created a stub package manually and intellisense had PORTFLIST rather than PORTF_LIST when assigning the value.
So that snippet of code should be:
Output0Buffer.AddRow();
Output0Buffer.PORTFLIST = myXmlData.PORTF_LIST;
Output0Buffer.PRTFULLNAME = myXmlData.PRT_FULL_NAME;
Output0Buffer.ISIN = myXmlData.ISIN;
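Since the 250-odd assignments are generated rather than hand-written, the same underscore stripping can be applied where the BimlScript builds the assignment lines. A minimal sketch, assuming a helper that already returns the DACPAC column names (the class, method and variable names here are hypothetical):
using System.Collections.Generic;
using System.Text;
public static class BufferAssignmentBuilder
{
    // columnNames would come from the existing DACPAC-reading method.
    public static string Build(IEnumerable<string> columnNames)
    {
        var assignments = new StringBuilder();
        foreach (var columnName in columnNames)
        {
            // SSIS drops the underscores when it generates the buffer properties,
            // so PORTF_LIST is exposed as Output0Buffer.PORTFLIST.
            string bufferProperty = columnName.Replace("_", string.Empty);
            assignments.AppendLine(string.Format("Output0Buffer.{0} = myXmlData.{1};", bufferProperty, columnName));
        }
        return assignments.ToString();
    }
}
The resulting string can then be injected into the script body the same way BufferColumns is injected into the Columns element.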
I have another error, my favorite "EmitSsis. Internal Compiler Error: Workflow EmitSsis contains fatal errors.", but at least this one is solved!
Thanks Bill for your help, and sorry I led you down the garden path with the wrong column name in the posted error, or you probably would have known the issue!

Related

Unable to parse json file in SourceGenerator

I am attempting to read a JSON file and generate C# code. I have a source generator like the one below, and if I uncomment the JsonDocument line, the source generator stops working. In fact, if I use any class from System.Text.Json it stops working. I have also attempted to use Newtonsoft.Json, with the same result.
Whilst building the target project that uses this source generator, I get a build error: error MSB6006: "csc.exe" exited with code -532462766.
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;
using System.Text.Json;
namespace Raya.Plugin.Registrations.SourceGenerator
{
[Generator]
public class DeviceRegistrationGenerator : ISourceGenerator
{
public void Execute(GeneratorExecutionContext context)
{
var metadata = context.AdditionalFiles.Single(x => x.Path.EndsWith("plugin.metadata.json"));
// var doc = JsonDocument.Parse(metadata.GetText(context.CancellationToken).ToString());
}
public void Initialize(GeneratorInitializationContext context)
{
#if DEBUG
if (!Debugger.IsAttached)
{
Debugger.Launch();
}
#endif
Debug.WriteLine("Initalize code generator");
}
}
}
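As an aside, context.AdditionalFiles only sees files that the consuming project explicitly hands to the compiler, so the lookup above assumes the target project declares the metadata file roughly like this (a sketch; only the file name comes from the question):
<ItemGroup>
  <!-- Exposes the file to source generators via context.AdditionalFiles -->
  <AdditionalFiles Include="plugin.metadata.json" />
</ItemGroup>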
Generator csproj
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<DevelopmentDependency>true</DevelopmentDependency>
<IncludeBuildOutput>false</IncludeBuildOutput>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|AnyCPU'">
<LangVersion>latest</LangVersion>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|AnyCPU'">
<LangVersion>latest</LangVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="4.4.0" PrivateAssets="all" />
<PackageReference Include="Microsoft.CodeAnalysis.CSharp.Workspaces" Version="4.4.0" PrivateAssets="all" />
<PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.3" PrivateAssets="all" />
<PackageReference Include="System.Text.Json" Version="6.0.5" GeneratePathProperty="true" PrivateAssets="all"/>
</ItemGroup>
</Project>
UPDATE
Adding the following seems to make it work a bit, but I am not sure what is happening. Now I am getting an InvalidCastException.
<PropertyGroup>
<GetTargetPathDependsOn>$(GetTargetPathDependsOn);GetDependencyTargetPaths</GetTargetPathDependsOn>
<GeneratePackageOnBuild>True</GeneratePackageOnBuild>
</PropertyGroup>
<Target Name="GetDependencyTargetPaths" AfterTargets="ResolvePackageAssets">
<ItemGroup>
<TargetPathWithTargetPlatformMoniker Include="@(ResolvedCompileFileDefinitions)" IncludeRuntimeDependency="false" />
</ItemGroup>
</Target>
Exception
Since the project using this source generator also ends up referencing System.Text.Json.SourceGenerator.dll, I am getting the following exception from that DLL.
Unhandled Exception: System.InvalidCastException: [A]Microsoft.CodeAnalysis.CSharp.Syntax.CompilationUnitSyntax cannot be cast to [B]Microsoft.CodeAnalysis.CSharp.Syntax.CompilationUnitSyntax. Type A originates from 'Microsoft.CodeAnalysis.CSharp, Version=4.4.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' in the context 'Default' at location 'C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Current\Bin\Roslyn\Microsoft.CodeAnalysis.CSharp.dll'. Type B originates from 'Microsoft.CodeAnalysis.CSharp, Version=4.4.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' in the context 'LoadFrom' at location 'C:\Users\myuser\.nuget\packages\microsoft.codeanalysis.csharp\4.4.0\lib\netstandard2.0\Microsoft.CodeAnalysis.CSharp.dll'

BIML class nugget error: 'AstTableNode' does not contain a definition for 'GetTag' and no accessible extension method 'GetTag'

I have created a HelperFunctions.cs file containing the following:
using Varigence.Languages.Biml.Table;
public static class HelperFunctions
{
public static string GetDisplayTableName(AstTableNode table)
{
return table.GetTag("DatabaseName").ToUpper() + "_" + table.Schema.Name.ToUpper() + "_" + table.Name.ToUpper();
}
}
However, it does not recognise GetTag() and throws the error:
'AstTableNode' does not contain a definition for 'GetTag' and no accessible extension method 'GetTag' accepting a first argument of type 'AstTableNode' could be found (are you missing a using directive or an assembly reference?).
What do I need to add to make this work?
Thanks
Jon
The fix is twofold: the static method needs its parameter prefixed with this so it can be called as an extension method, and the file needs a using Varigence.Biml.Extensions; directive so that GetTag resolves.
SO_63828312.cs
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Varigence.Biml.Extensions;
using Varigence.Languages.Biml.Table;
public static class HelperFunctions
{
public static string GetDisplayTableName(this AstTableNode table)
{
return table.GetTag("DatabaseName").ToUpper() + "_" + table.Schema.Name.ToUpper() + "_" + table.Name.ToUpper();
}
}
My static Biml defines a table, which needs a schema, which needs a database, which needs a connection. All of that to get us to a table that has an AnnotationType of Tag with a value of AW.
SO_63828312.T0.biml
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection ConnectionString="Data Source=localhost\dev2017;Initial Catalog=tempdb;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;Packet Size=32767;" Name="AdventureWorks" />
</Connections>
<Databases>
<Database Name="AW" ConnectionName="AdventureWorks" />
</Databases>
<Schemas>
<Schema DatabaseName="AW" Name="dbo" />
</Schemas>
<Tables>
<Table Name="foo" SchemaName="AW.dbo">
<Annotations>
<Annotation Tag="DatabaseName" AnnotationType="Tag">AW</Annotation>
</Annotations>
<Columns>
<Column Name="Col1" DataType="Int32"></Column>
</Columns>
</Table>
</Tables>
</Biml>
Tier 1 execution. This begins our dynamic tiering. Since there's only one tier, I don't explicitly declare it, but if you have multiple tiers, you'd want to provide a tier directive.
Here I enumerate my Tables collection (defined in a preceding tier) and, for each table I find, I write the table name and the tag value.
SO_63828312.T1.biml
<#@ code file="SO_63828312.cs" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#
string template = "<!-- {0}|{1} -->";
foreach(var t in this.RootNode.Tables)
{
WriteLine(string.Format(template, t.Name, t.GetDisplayTableName()));
}
#>
</Biml>
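With the table defined in T0, expanding T1 should leave a comment along these lines in the output (the table name foo comes from T0; the display name is built by GetDisplayTableName from the DatabaseName tag, schema name and table name):
<!-- foo|AW_DBO_FOO -->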

Error CS1061 'DbContextOptionsBuilder' does not contain a definition for 'UseMySql'

I am just learning to use ASP.NET Core 2.0 MVC with Visual Studio Community Edition.
I want to use a MySQL database instead of SQL Server because I need to use some data inside an old MySQL DB. Please help me solve this problem, thank you.
Here is my error:
Severity Code Description Project File Line Suppression State
Error CS1061 'DbContextOptionsBuilder' does not contain a definition for 'UseMySql' and no extension method 'UseMySql' accepting a first argument of type 'DbContextOptionsBuilder' could be found (are you missing a using directive or an assembly reference?) LearnEFCore d:\temp\aspnet\LearnEFCore\LearnEFCore\Startup.cs 29 Active
My code is as follows:
In Startup.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.EntityFrameworkCore;
using LearnEFCore.Data;
....
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
var sqlConnectionString = Configuration.GetConnectionString("DataAccessMySqlProvider");
services.AddDbContext<SchoolContext>(options => options.UseMySql(sqlConnectionString));
services.AddMvc();
}
In my appsettings.json
{
"Logging": {
"IncludeScopes": false,
"LogLevel": {
"Default": "Warning"
}
},
"ConnectionStrings": {
"DataAccessMySqlProvider": "server=localhost;port=3306;userid=root;password=root;database=sportstore;"
}
}
In my Models/Data
using LearnEFCore.Models;
using Microsoft.EntityFrameworkCore;
namespace LearnEFCore.Data
{
public class SchoolContext : DbContext
{
public SchoolContext(DbContextOptions<SchoolContext> options) : base(options)
{
}
public DbSet<Course> Courses { get; set; }
public DbSet<Enrollment> Enrollments { get; set; }
public DbSet<Student> Students { get; set; }
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Course>().ToTable("Course");
modelBuilder.Entity<Enrollment>().ToTable("Enrollment");
modelBuilder.Entity<Student>().ToTable("Student");
}
}
}
My csproj
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.All" Version="2.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="2.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="2.0.0" />
<PackageReference Include="MySql.Data.EntityFrameworkCore" Version="6.10.4" />
</ItemGroup>
<ItemGroup>
<DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="2.0.0" />
</ItemGroup>
</Project>
Changing to Pomelo.EntityFrameworkCore.MySql solved the problem.
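For reference, a minimal sketch of the package swap in the csproj; the exact Pomelo version number is an assumption (pick the 2.0.x release that matches EF Core 2.0):
<ItemGroup>
  <!-- Replaces MySql.Data.EntityFrameworkCore; the version number is an assumption -->
  <PackageReference Include="Pomelo.EntityFrameworkCore.MySql" Version="2.0.1" />
</ItemGroup>
The ConfigureServices call should then compile unchanged, since Pomelo exposes UseMySql as an extension method in the Microsoft.EntityFrameworkCore namespace, which Startup.cs already imports.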

How do you access the Package Variables Collection in biml?

I'm just getting started with Biml and BimlScript. I can see the power it holds, but so far digging through the language and API reference has been frustrating. I can't seem to find any reference online on how to access the package's variables collection.
I'm trying to set up this script so I can add more variables into my Variables section, and then automatically add those variables to a script task later in the process.
Here is the minimal code for my problem:
<Biml xmlns="http://schemas.varigence.com/biml.xsd" >
<Packages>
<Package Name="Load">
<Variables>
<Variable Name="ETLProcessStepID" DataType="Int32">0</Variable>
<Variable Name="TenantID" DataType="Int32">1</Variable>
</Variables>
<!-- more stuff going on in the biml -->
<# var package = RootNode.Packages.Where(loadPackage => loadPackage.Name.Contains("Load"));
foreach (var variable in package.Variables) { #>
<ReadWriteVariables VariableName="<#=variable.Name#>" />
<# }#>
</Package>
</Packages>
</Biml>
This seems to be the closest I've gotten. Unfortunately it results in:
Error 0 'System.Collections.Generic.IEnumerable<Varigence.Languages.Biml.Task.AstPackageNode>' does not contain a definition for 'Variables' and no extension method 'Variables' accepting a first argument of type 'System.Collections.Generic.IEnumerable<Varigence.Languages.Biml.Task.AstPackageNode>' could be found (are you missing a using directive or an assembly reference?).
If I'm reading the documentation right, there is a Variables collection in the Packages node. https://varigence.com/Documentation/Api/Type/AstPackageNode
If I'm not reading the documentation right, can anyone direct me to a reference on how I could access the package variables collection?
The first error you're running into is that your C# variable called package is going to be a collection, returned from that Linq call. Since there should only be one element that matches, we'll use First to give us just one of them:
var package = RootNode.Packages.Where(loadPackage => loadPackage.Name.Contains("Load")).First();
Now the tricky part, and I'll actually have to check with some bigger guns on this, but I don't think you'd be able to access the current package's variable collection like that, because it hasn't been built yet. Well, at least using BIDS Helper/BimlExpress. The Biml first needs to get compiled into objects, so, assuming a single select, you won't have anything in the RootNode.Packages collection. You certainly wouldn't have "Load" because you're compiling it right now.
In Mist, the paid-for solution which is soon to be rebranded as BimlStudio, you could use a Transformer to accomplish this. You'd build out the Load package first, and then a transformer fires just prior to emission as a dtsx package and makes whatever correction you were after.
Consider the following test harness. It creates a simple package and then has some BimlScript immediately after it, wherein I enumerate through all the packages and, for each package, enumerate the root-level Variables collection. You'll only see the "test" message rendered. The inner calls won't fire because nothing exists yet.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Packages>
<Package Name="so_38908470" >
<Variables>
<Variable Name="ETLProcessStepID" DataType="Int32">0</Variable>
<Variable Name="TenantID" DataType="Int32">1</Variable>
</Variables>
<#
string message = "<!-- {0} -->";
WriteLine(message, "test");
foreach (var package in RootNode.Packages)
{
WriteLine(message, package.Name);
foreach(var variable in package.Variables)
{
WriteLine(message, variable.Name);
}
}
#>
</Package>
</Packages>
</Biml>
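For what it's worth, the only thing the nugget above contributes to the expanded Biml is this single comment; the package and variable lines never fire:
<!-- test -->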
The more I think about this, tiering might be able to accomplish it with BIDS Helper/BimlExpress. Since it looks like you're trying to use the Variables defined within a package as inputs to a Script Task or Component, as long as you're using ScriptProjects (which sit parallel to the Packages collection), this might work.
Eureka
Add two Biml files to your project: Load.biml and Script.biml. Use the following code in each. Select both and right-click to generate the SSIS package.
Load.biml
This is going to be your package. It is the package you started with above, with a Script Task in there that is going to dump the name and value of all the user variables declared at the root of the package. But as you can see, there isn't anything in the Script tag that specifies which variables it uses or what the code is going to do.
<Biml xmlns="http://schemas.varigence.com/biml.xsd" >
<Packages>
<Package Name="Load">
<Variables>
<Variable Name="ETLProcessStepID" DataType="Int32">0</Variable>
<Variable Name="TenantID" DataType="Int32">1</Variable>
</Variables>
<Tasks>
<Script ProjectCoreName="ST_EchoBack" Name="SCR Echo Back">
<ScriptTaskProjectReference ScriptTaskProjectName="ST_EchoBack" />
</Script>
</Tasks>
</Package>
</Packages>
</Biml>
Script.biml
This Biml looks like a lot, but it's the same concept I was working with above, where I enumerate through the Packages collection and then work with the Variables collection. I use the Biml nuggets to control the emission of the Namespace, Name and DataType properties.
<#@ template language="C#" tier="1" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd" >
<ScriptProjects>
<ScriptTaskProject ProjectCoreName="ST_EchoBack" Name="ST_EchoBack" VstaMajorVersion="0">
<ReadOnlyVariables>
<!-- List all the variables you are interested in tracking -->
<#
string message = "<!-- {0} -->";
WriteLine(message, "test");
// ValidationReporter.Report(Severity.Error, "test");
foreach (var package in RootNode.Packages.Where(x=> x.Name == "Load"))
{
WriteLine(message, package.Name);
// ValidationReporter.Report(Severity.Error, package.Name);
foreach(var variable in package.Variables)
{
WriteLine(message, variable.Name);
// ValidationReporter.Report(Severity.Error, variable.Name);
#>
<Variable Namespace="<#=variable.Namespace#>" VariableName="<#=variable.Name#>" DataType="<#=variable.DataType#>" />
<#
}
}
#>
</ReadOnlyVariables>
<Files>
<File Path="ScriptMain.cs" BuildAction="Compile">using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
namespace ST_EchoBack
{
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
public void Main()
{
bool fireAgain = false;
string message = "{0}::{1} : {2}";
foreach (var item in Dts.Variables)
{
Dts.Events.FireInformation(0, "SCR Echo Back", string.Format(message, item.Namespace, item.Name, item.Value), string.Empty, 0, ref fireAgain);
}
Dts.TaskResult = (int)ScriptResults.Success;
}
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
}
} </File>
<File Path="Properties\AssemblyInfo.cs" BuildAction="Compile">
using System.Reflection;
using System.Runtime.CompilerServices;
[assembly: AssemblyVersion("1.0.*")]
</File>
</Files>
<AssemblyReferences>
<AssemblyReference AssemblyPath="System" />
<AssemblyReference AssemblyPath="System.Data" />
<AssemblyReference AssemblyPath="System.Windows.Forms" />
<AssemblyReference AssemblyPath="System.Xml" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ManagedDTS.dll" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ScriptTask.dll" />
</AssemblyReferences>
</ScriptTaskProject>
</ScriptProjects>
</Biml>
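When the two files are built together and the resulting Load package is executed, the Script Task should fire information events whose messages look roughly like this (per the {0}::{1} : {2} template in Main):
User::ETLProcessStepID : 0
User::TenantID : 1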
I thought I could simplify this by calling GetBiml() on the variable, but that is going to emit the exact Biml the variable was defined with:
<Variable Name="ETLProcessStepID" DataType="Int32">0</Variable>
<Variable Name="TenantID" DataType="Int32">1</Variable>
and if that didn't have the actual value in there, it'd be legit syntax for the ReadOnly/ReadWrite variables collection. What a pity.
Biml Script Component Source
Biml Test for echo aka Script Task

DataTable to JSON- lose extra formatting?

I'm sure this has been answered somewhere, but I've read dozens of articles and pages with no joy. I need to pass a DataTable from a Windows service to an ASP.NET page. Serializing to JSON seemed a good idea, so I have this:
[ServiceContract(Namespace = "DataGrabber")]
public interface IDataGrabber
{
[OperationContract]
[WebInvoke(Method="GET", BodyStyle=WebMessageBodyStyle.Bare, UriTemplate = "/Tables/{tablename}", ResponseFormat=WebMessageFormat.Json)]
string GetData(string tablename);
}
public class DataGrabber : IDataGrabber
{
public string GetData(string TableName)
{
DataTable result;
switch (TableName)
{
case "firstTable":
result = Grabber.firstTable;
break;
//SNIP MORE
}
return GetJson(result);
}
public string GetJson(DataTable dt)
{
System.Web.Script.Serialization.JavaScriptSerializer serializer = new
System.Web.Script.Serialization.JavaScriptSerializer();
List<Dictionary<string, object>> rows =
new List<Dictionary<string, object>>();
Dictionary<string, object> row = null;
foreach (DataRow dr in dt.Rows)
{
row = new Dictionary<string, object>();
foreach (DataColumn col in dt.Columns)
{
row.Add(col.ColumnName.Trim(), dr[col]);
}
rows.Add(row);
}
return serializer.Serialize(rows);
}
}
And my config:
<?xml version="1.0"?>
<configuration>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
</startup>
<system.serviceModel>
<services>
<service behaviorConfiguration="DefaultBehavior" name="DataGrabber">
<endpoint address="http://localhost:9001/DataGrabber"
binding="webHttpBinding" bindingConfiguration="WebBinding" name="Rest"
contract="DataGrabber.IDataGrabber" />
</service>
</services>
<bindings>
<webHttpBinding>
<binding name="WebBinding">
<security mode="None" />
</binding>
</webHttpBinding>
</bindings>
<behaviors>
<serviceBehaviors>
<behavior name="DefaultBehavior">
<serviceMetadata httpGetEnabled="true" />
<serviceDebug includeExceptionDetailInFaults="true"/>
</behavior>
</serviceBehaviors>
</behaviors>
</system.serviceModel>
</configuration>
This works OK, but my resulting JSON has lots of escaped quotes, like this:
"[{\"lat\":51.75,\"lng\":-1.25,\"score\":7},{\"lat\":31.780001,\"lng\":35.23,\"score\":7},{\"lat\":47.717999,\"lng\":-116.951599,\"score\":9},{\"lat\":33.990799,\"lng\":-118.460098,\"score\":1},{\"lat\":34.746498,\"lng\":-92.289597,\"score\":10},{\"lat\":-31.9522,...
This is rejected by online parsers and won't deserialize back to a DataTable in my client application. If I strip out all the extra stuff by using the "htmlviewer" instead of the "textviewer" while debugging in VS, it's fine for the parsers. I tried using the Newtonsoft library too, and it resulted in similar output, so what am I missing?
Typical. Straight after posting, I found the answer.
Because I was converting to a string, it was getting double-encoded. Instead, I needed to output a stream.
So both my interface and method are now set to return a System.IO.Stream, and the output is generated thus:
return new MemoryStream(Encoding.UTF8.GetBytes(GetJson(result)));
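Put together, the reshaped contract and implementation look roughly like this; a sketch based on the description above, with Grabber and GetJson carried over unchanged from the question and the switch body elided as in the original:
using System.Data;
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;
using System.Text;
[ServiceContract(Namespace = "DataGrabber")]
public interface IDataGrabber
{
    [OperationContract]
    [WebInvoke(Method = "GET", BodyStyle = WebMessageBodyStyle.Bare, UriTemplate = "/Tables/{tablename}", ResponseFormat = WebMessageFormat.Json)]
    Stream GetData(string tablename);
}
public class DataGrabber : IDataGrabber
{
    public Stream GetData(string tableName)
    {
        DataTable result = null;
        switch (tableName)
        {
            case "firstTable":
                result = Grabber.firstTable;
                break;
            //SNIP MORE
        }
        // Returning a Stream means WCF writes the bytes as-is instead of
        // JSON-encoding the already-serialized string a second time.
        return new MemoryStream(Encoding.UTF8.GetBytes(GetJson(result)));
    }
    // GetJson(DataTable) is unchanged from the question.
}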