Is there a way to reference a CSV or other data source which has table names and source queries to generate Biml documents?
Thanks
Do be aware that C# is not my strong suit, and the below may well not be the ideal way to do this. If you do find something more suitable, I would very much like to hear about it :)
The easiest way I have found to include CSV-based metadata in my Biml projects is to load it into C# DataTable objects that I then reference in my Biml as C# variables, which play very well with foreach to iterate through the rows.
Assuming you are aware of how to include C# in your Biml projects (either in the file directly or via a referenced .cs file), you can use the following code:
public static DataTable FlatFileToDataTable(string filePath, char delimiter)
{
    DataTable dt = new DataTable();
    using (StreamReader sr = new StreamReader(filePath))
    {
        // The first line holds the column headers.
        string[] headers = sr.ReadLine().Split(delimiter);
        foreach (string header in headers)
        {
            dt.Columns.Add(header);
        }
        // Each remaining line becomes one metadata row.
        while (!sr.EndOfStream)
        {
            string[] rows = sr.ReadLine().Split(delimiter);
            DataRow dr = dt.NewRow();
            for (int i = 0; i < headers.Length; i++)
            {
                dr[i] = rows[i];
            }
            dt.Rows.Add(dr);
        }
    }
    return dt;
}
I think in order to use the StreamReader you will need to add using System.IO; to your code file as well.
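One caveat: the simple Split above won't cope with quoted fields that contain the delimiter. If your metadata CSV has those, a sketch using the .NET TextFieldParser could be swapped in instead; it needs a reference to the Microsoft.VisualBasic assembly, and the method name here is just a placeholder:
using Microsoft.VisualBasic.FileIO; // TextFieldParser lives here

public static DataTable FlatFileToDataTableQuoted(string filePath, string delimiter)
{
    DataTable dt = new DataTable();
    using (TextFieldParser parser = new TextFieldParser(filePath))
    {
        parser.TextFieldType = FieldType.Delimited;
        parser.SetDelimiters(delimiter);
        parser.HasFieldsEnclosedInQuotes = true; // handles "a,b" style values

        // First row is the header.
        foreach (string header in parser.ReadFields())
        {
            dt.Columns.Add(header);
        }
        while (!parser.EndOfData)
        {
            dt.Rows.Add(parser.ReadFields());
        }
    }
    return dt;
}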
Usage would be to define a DataTable object and populate it with the result of the above, then to reference it using code snippets within your Biml:
DataTable YourDataTable = FlatFileToDataTable("<Path to CSV file>", '<Value Delimiter>');
...
<Columns>
    <# foreach(DataRow r in YourDataTable.Rows){ #>
    <Column Name="<#=r["YourColumnName"]#>" etc />
    <# } #>
</Columns>
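And to tie this back to the original question: assuming your metadata CSV exposes, say, TableName and SourceQuery columns (hypothetical names), the same foreach pattern can stamp out one package per row, along these lines:
<Packages>
    <# foreach (DataRow r in YourDataTable.Rows) { #>
    <Package Name="Load_<#=r["TableName"]#>" ConstraintMode="Linear">
        <!-- use r["SourceQuery"] as the source query for this table's load logic -->
    </Package>
    <# } #>
</Packages>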
I am trying to import multiple JSON files in a folder to an Oracle database using SSIS. The code below is the JSON parser that is able to import a single file. I need this to loop through a folder and import all the files. Here is the code in the script component that imports the JSON file. Any ideas? Thank you!
public override void CreateNewOutputRows()
{
    String jsonFileContent = File.ReadAllText(@"C:\Users\tngo\File\File1.json");
    JavaScriptSerializer js = new JavaScriptSerializer();
    List<IGData> igdatas = js.Deserialize<List<IGData>>(jsonFileContent);
    foreach (IGData igdata in igdatas)
    {
        Output0Buffer.AddRow();
        Output0Buffer.piececount = igdata.piececount;
        Output0Buffer.wgt = igdata.wgt;
    }
}
Since you are already in C#, you can finish it off there with a foreach loop around your whole code.
string[] files = System.IO.Directory.GetFiles(@"C:\Users\tngo\File\", "*.json");
foreach (string file in files)
{
    String jsonFileContent = File.ReadAllText(file);
    JavaScriptSerializer js = new JavaScriptSerializer();
    List<IGData> igdatas = js.Deserialize<List<IGData>>(jsonFileContent);
    foreach (IGData igdata in igdatas)
    {
        Output0Buffer.AddRow();
        Output0Buffer.piececount = igdata.piececount;
        Output0Buffer.wgt = igdata.wgt;
    }
}
You'll need to use the Foreach Loop Container.
In the Foreach Loop Editor, do the following:
Use the Foreach File Enumerator type and point the Folder to C:\Users\tngo\File\. Your Files wildcard will be *.json, and you should check Fully qualified under Retrieve file name. After that, click on Variable Mappings on the left pane of the editor, and create a new string variable that will hold your fully qualified filename. We'll call ours ForEachLoop_StringVar for this example.
After you create the loop, drag your Script Task into the Foreach Loop, and then double-click the Script Task to open the Script Task Editor. Add the string variable you created above to ReadOnlyVariables, and then hit the Edit Script... button to pull up your script. You can then replace the hard-coded filename with a reference to your variable. Your script code would then look something like this:
public override void CreateNewOutputRows()
{
    // Dts.Variables is the Script Task API; if this code lives in a Data Flow
    // Script Component instead, the variable would typically be exposed as
    // Variables.ForEachLoop_StringVar once added to ReadOnlyVariables.
    String jsonFileContent = File.ReadAllText((string)Dts.Variables["User::ForEachLoop_StringVar"].Value);
    JavaScriptSerializer js = new JavaScriptSerializer();
    List<IGData> igdatas = js.Deserialize<List<IGData>>(jsonFileContent);
    foreach (IGData igdata in igdatas)
    {
        Output0Buffer.AddRow();
        Output0Buffer.piececount = igdata.piececount;
        Output0Buffer.wgt = igdata.wgt;
    }
}
I want to make a parameterized JUnit test using @RunWith(Parameterized.class) and
@Parameterized.Parameters
public static Collection<String[]> testdata() {
    return Arrays.asList(new String[][] {
        { "inParam1", "inParam2", "expectedOut1", "expectedOut2" }
    });
}
The actual test data will be created by business people in Excel.
Is there an easy / generic way to get from an Apache POI XSSFSheet to the prescribed Collection of String arrays?
If yes: can someone provide an example, please?
I found this question: Datadriven Testing in TestNG using Apache POI, but I'd expect something like a 3-liner ;-)
It isn't quite a 3-liner, but assuming I've correctly understood your needs, you can do something like:
Sheet sheet = workbook.getSheetAt(0);
DataFormatter fmt = new DataFormatter();

List<List<String>> cellData = new ArrayList<List<String>>();
for (Row r : sheet) {
    List<String> rd = new ArrayList<String>();
    for (Cell c : r) {
        rd.add(fmt.formatCellValue(c));
    }
    cellData.add(rd);
}
That will, for all defined rows and cells of any type, generate a list of lists of strings: one inner list per row, one string per cell. You can switch from lists to arrays fairly easily, as sketched below.
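A minimal conversion sketch (assuming the cellData list from above) to get to the Collection<String[]> shape that @Parameterized.Parameters expects:
List<String[]> testdata = new ArrayList<String[]>();
for (List<String> row : cellData) {
    // Each inner list becomes one String[] parameter set.
    testdata.add(row.toArray(new String[0]));
}
// Return 'testdata' from your @Parameterized.Parameters method.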
If you need more control over what cells/rows are/aren't included, including for empty ones, see the Iterating guide in the documentation
I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need; however, I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad:
public static String DumpJson<T>(this T obj)
{
    return obj.ToJson().Dump();
}

public static String ToJson<T>(this T obj)
{
    return new System.Web.Script.Serialization.JavaScriptSerializer().Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i => new
    {
        Index = i,
        IndexTimesTen = i * 10,
    })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
Set the language to C# Statement(s)
Add an assembly reference (press F4) to System.Web.Extensions.dll
In the same dialog, add a namespace import to System.Web.Script.Serialization
Use code like the following to dump out your query as JSON
new JavaScriptSerializer().Serialize(query).Dump();
There's a solution using Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET via NuGet, reference Newtonsoft.Json.dll in your "My Extensions" query, and add the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;
    var strValue = value as string;
    if (strValue != null)
    {
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }
    return dump;
}
Use .DumpJson() as you would .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
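As a sketch of such an overload (assuming Json.NET is referenced as above), a hypothetical variant taking JsonSerializerSettings might be:
public static object DumpJson(this object value, JsonSerializerSettings settings, string description = null)
{
    // Settings let a caller adjust date handling, null handling, etc. per dump.
    return JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented, settings).Dump(description);
}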
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx
I have a question that I hope someone can answer. I am using .NET 3.5 WinForms and the SpreadsheetGear 2010 component, and need to know if there is a free or low-cost method to convert an Excel workbook to HTML. Is there a good XSLT transform or some low-cost or open-source component? I can save the Excel file to an OpenXML workbook programmatically, but the component does not allow saving to HTML. Any help would be appreciated. Thanks.
Use ASP.NET and follow the code demonstrated by the Excel to DataGrid samples. You can always include a reference to System.Web in your project.
But even without ASP.NET, sticking with the .NET client profile, generating an HTML table from a DataTable is simple. Use the AntiXss Library (download) to encode the data to HTML, then construct your HTML with a StringBuilder. Here's an extension method you can use to convert any DataTable to an HTML <table>:
using AntiXss = Microsoft.Security.Application;
...
public static String ToHTML(this DataTable dt)
{
    StringBuilder html = new StringBuilder("<table><thead><tr>");
    foreach (DataColumn col in dt.Columns)
    {
        html.AppendFormat("<td>{0}</td>",
            AntiXss.Encoder.HtmlEncode(col.ColumnName));
    }
    html.Append("</tr></thead><tbody>");
    foreach (DataRow row in dt.Rows)
    {
        html.Append("<tr>");
        foreach (var data in row.ItemArray)
        {
            html.AppendFormat("<td>{0}</td>",
                AntiXss.Encoder.HtmlEncode(data.ToString()));
        }
        html.Append("</tr>");
    }
    html.Append("</tbody></table>");
    return html.ToString();
}
And to get HTML for an entire workbook, another extension method:
public static List<String> ToHTML(this IWorkbook wb)
{
    List<String> tables = new List<String>();
    DataSet ds = wb.GetDataSet(GetDataFlags.FormattedText);
    foreach (DataTable dt in ds.Tables)
    {
        tables.Add(dt.ToHTML());
    }
    return tables;
}
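A hypothetical usage, writing one HTML fragment per worksheet (workbook is assumed to be your SpreadsheetGear IWorkbook, the output folder is illustrative, and System.IO is assumed to be imported):
List<String> tables = workbook.ToHTML();
for (int i = 0; i < tables.Count; i++)
{
    // One <table> fragment per worksheet in the workbook.
    File.WriteAllText(Path.Combine(@"C:\output", "sheet" + i + ".html"), tables[i]);
}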
I have a huge collection of Visual FoxPro DBF files that I would like to convert to CSV.
(If you like, you can download some of the data here. Click on the 2011 link for Transaction Data, and prepare to wait a long time...)
I can open each table with DBF View Plus (an awesome freeware utility), but exporting them to csv takes a few hours per file, and I have several dozen files to work with.
Is there a program like DBF View plus that will allow me to set up a batch of dbf-to-csv conversions to run overnight?
/Edit: Alternatively, is there a good way to import .dbf files straight into SQL Server 2008? They should all go into 1 table, as each file is just a subset of records from the same table and should have all the same column names.
Load your list of FoxPro files into an array/list, then call ConvertDbf on each to convert them from FoxPro to CSV files. See the C# console application code below...
Credit to c# datatable to csv for the DataTableToCSV function.
using System;
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Linq;
using System.Text;

namespace SO8843066
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = @"Provider=VFPOLEDB.1;Data Source=C:\";
            string dbfToConvert = @"C:\yourdbffile.dbf";
            ConvertDbf(connectionString, dbfToConvert, dbfToConvert.Replace(".dbf", ".csv"));

            Console.WriteLine("End of program execution");
            Console.WriteLine("Press any key to end");
            Console.ReadKey();
        }

        static void DataTableToCSV(DataTable dt, string csvFile)
        {
            StringBuilder sb = new StringBuilder();

            var columnNames = dt.Columns.Cast<DataColumn>().Select(column => column.ColumnName).ToArray();
            sb.AppendLine(string.Join(",", columnNames));

            foreach (DataRow row in dt.Rows)
            {
                var fields = row.ItemArray.Select(field => field.ToString()).ToArray();
                for (int i = 0; i < fields.Length; i++)
                {
                    sb.Append("\"" + fields[i].Trim());
                    sb.Append((i != fields.Length - 1) ? "\"," : "\"");
                }
                sb.Append("\r\n");
            }

            File.WriteAllText(csvFile, sb.ToString());
        }

        static void ConvertDbf(string connectionString, string dbfFile, string csvFile)
        {
            string sqlSelect = string.Format("SELECT * FROM {0}", dbfFile);

            using (OleDbConnection connection = new OleDbConnection(connectionString))
            {
                using (OleDbDataAdapter da = new OleDbDataAdapter(sqlSelect, connection))
                {
                    DataSet ds = new DataSet();
                    da.Fill(ds);
                    DataTableToCSV(ds.Tables[0], csvFile);
                }
            }
        }
    }
}
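The Main above converts a single file; to queue up a whole folder to run overnight, a small loop along these lines (the folder path is illustrative) can reuse ConvertDbf:
// Convert every .dbf file found in the source folder.
string folder = @"C:\dbfs"; // illustrative path to your .dbf files
string connectionString = @"Provider=VFPOLEDB.1;Data Source=" + folder + @"\";
foreach (string dbf in Directory.GetFiles(folder, "*.dbf"))
{
    ConvertDbf(connectionString, dbf, Path.ChangeExtension(dbf, ".csv"));
}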
In that case, SQL Server I believe has the capability of connecting to FoxPro tables, though I'm not exactly sure how, as I haven't done it recently (the last time I used SQL Server was 8+ years ago). I'm sure there are other threads out there that can point you to connecting SQL Server to VFP.
I quickly searched and saw this thread.
In addition, you might need the latest OLE DB provider to establish the connection, which I've also posted in a thread here. That thread also shows a sample of the connection string information you may need from SQL Server. The data source should point to the PATH where the .DBF files are found, not the specific name of the .DBF you are trying to connect to.
Hope this helps you out.
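If you take the SQL Server route, one hedged option from C# is to reuse the same OLE DB read as in ConvertDbf above and push the rows into your single target table with SqlBulkCopy; the method name, connection strings, and table name below are illustrative:
using System.Data.SqlClient; // for SqlBulkCopy

static void DbfToSqlServer(string dbfConnStr, string dbfFile, string sqlConnStr, string targetTable)
{
    using (OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM " + dbfFile, dbfConnStr))
    {
        DataTable dt = new DataTable();
        da.Fill(dt);
        using (SqlBulkCopy bulk = new SqlBulkCopy(sqlConnStr))
        {
            bulk.DestinationTableName = targetTable; // all files land in this one table
            bulk.WriteToServer(dt);
        }
    }
}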
This works very well, and thanks for the solution. I used this to convert some Visual FoxPro DBF tables to flat files. With these tables, there is the additional challenge of converting fields of type Currency.
Currency fields are a 64-bit (8-byte) signed integer amidst a 36-element byte array, starting at the 27th position. The integer is then divided by 10,000 to get the 4-decimal-precision equivalent.
If you have this type of field, try this inside the fields for loop:
if (("" + fields[i]).Equals("System.Byte[]"))
{
StringBuilder db = new StringBuilder();
byte[] inbytes = new byte[36];
inbytes = ObjectToByteArray(fields[i]);
db.Append("" + (double)BitConverter.ToInt64(inbytes,27)/1E4);
sb.Append("\"" + db);
}
With the following helper method:
// Requires: using System.Runtime.Serialization.Formatters.Binary;
private static byte[] ObjectToByteArray(Object obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}
Check out my answer to Foxbase to postrgresql data transfer. (dbf files reader).