Excel Spreadsheet to HTML Programmatically

I have a question that I hope someone can answer. I am using .NET 3.5 WinForms and the SpreadsheetGear 2010 component, and I need to know if there is a free or low-cost method to convert an Excel workbook to HTML. Is there a good XSLT transform or some low-cost or open-source component? I can save the Excel file to an Open XML workbook programmatically, but the component does not support saving to HTML. Any help would be appreciated. Thanks.

Use ASP.NET and follow the code demonstrated by the Excel to DataGrid samples. You can always include a reference to System.Web in your project.
But even without ASP.NET, sticking with the .NET Client Profile, generating an HTML table from a DataTable is simple. Use the AntiXss library to HTML-encode the data, then build the markup with a StringBuilder. Here's an extension method you can use to convert any DataTable to an HTML <table>:
using AntiXss = Microsoft.Security.Application;
...
public static String ToHTML(this DataTable dt)
{
    StringBuilder html = new StringBuilder("<table><thead><tr>");

    // Header row: one <th> per column, encoded in case a column name contains markup.
    foreach (DataColumn col in dt.Columns)
    {
        html.AppendFormat("<th>{0}</th>",
            AntiXss.Encoder.HtmlEncode(col.ColumnName));
    }
    html.Append("</tr></thead><tbody>");

    // Body: one <tr> per DataRow, one encoded <td> per cell.
    foreach (DataRow row in dt.Rows)
    {
        html.Append("<tr>");
        foreach (var data in row.ItemArray)
        {
            html.AppendFormat("<td>{0}</td>",
                AntiXss.Encoder.HtmlEncode(data.ToString()));
        }
        html.Append("</tr>");
    }
    html.Append("</tbody></table>");
    return html.ToString();
}
And to get html for an entire workbook, another extension method:
public static List<String> ToHTML(this IWorkbook wb)
{
    List<String> tables = new List<String>();

    // FormattedText returns cell values as the user sees them in Excel
    // (dates, currency, number formats, etc.).
    DataSet ds = wb.GetDataSet(GetDataFlags.FormattedText);
    foreach (DataTable dt in ds.Tables)
    {
        tables.Add(dt.ToHTML());
    }
    return tables;
}
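For context, a minimal usage sketch, untested: the workbook path and output naming here are hypothetical, and the extension methods above are assumed to be in scope.
using System.IO;
// ...
var workbook = SpreadsheetGear.Factory.GetWorkbook(@"C:\data\Report.xlsx"); // hypothetical path
List<string> tables = workbook.ToHTML();

// Write one .html file per worksheet (naming scheme is hypothetical).
for (int i = 0; i < tables.Count; i++)
{
    File.WriteAllText(Path.Combine(@"C:\data", "Sheet" + i + ".html"), tables[i]);
}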

Related

Is it possible to get raw HTML from Razor/Blazor components?

I'd like to set up a "mailer/newsletter" using MailKit. My site is built on Blazor WebAssembly and uses .razor components.
I'm wondering whether there is a way to consume a Razor component I've written to output HTML into the MimeMessage object I'm using to generate my email body, and what that architecture would look like / the best way to accomplish this.
Similar question (though not Blazor):
Can I use an ASP.NET MVC Razor view to generate a nicely formatted HTML body as the input for an email sent from the server?
Late answer since I just saw this question: I wrote an alternative system called BlazorTemplater which uses .razor files instead of .cshtml, since I had exactly this problem.
You can convert your templates to .razor format and then use BlazorTemplater to render them to HTML:
var html = new ComponentRenderer<MyRazorClass>()
    .Set(c => c.SomeParameter, someValue)
    .Render();
It supports parameters, dependency injection, and nested components, so you should find it useful! It's also much easier to set up, and it works in Razor class libraries too.
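Since the goal is an email body, the rendered string can go straight into MimeKit's BodyBuilder. A minimal sketch; the addresses and subject are placeholders, and html is the string rendered above:
using MimeKit;
// ...
var message = new MimeMessage();
message.From.Add(MailboxAddress.Parse("newsletter@example.com")); // placeholder
message.To.Add(MailboxAddress.Parse("subscriber@example.com"));   // placeholder
message.Subject = "Newsletter";

// BodyBuilder wraps the rendered HTML as the message body; it can also
// carry a plain-text alternative or linked resources if needed.
message.Body = new BodyBuilder { HtmlBody = html }.ToMessageBody();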
I am using Blazor with MailKit here: Google Email Viewer in Server Side Blazor
I use MarkupString to display the email content like this:
@using MimeKit
@using MessageReader
@using System.IO
@strError
<div style="padding:2px; vertical-align:top">
    <div><i>@MimeKitMessage.Date.ToString()</i></div>
    <div><b>From:</b> @MimeKitMessage.From.ToString()</div>
    <div><b>To:</b> @MimeKitMessage.To.ToString()</div>
    <div><b>Subject:</b> @MimeKitMessage.Subject</div>
    <br />
    <div>@((MarkupString)htmlEmail)</div>
</div>
@code {
    [Parameter] public Message paramMessage { get; set; }
    MimeMessage MimeKitMessage;
    string strError = "";
    string htmlEmail = "";

    protected override void OnInitialized()
    {
        try
        {
            if (paramMessage != null)
            {
                // Gmail returns the raw message base64url-encoded;
                // convert it back to standard base64 before decoding.
                string converted = paramMessage.Raw.Replace('-', '+');
                converted = converted.Replace('_', '/');
                byte[] decodedByte = Convert.FromBase64String(converted);
                using (Stream stream = new MemoryStream(decodedByte))
                {
                    // Load a MimeMessage from the decoded stream
                    MimeKitMessage = MimeMessage.Load(stream);

                    // Convert any embedded images for inline display
                    var visitor = new HtmlPreviewVisitor();
                    MimeKitMessage.Accept(visitor);
                    htmlEmail = visitor.HtmlBody;

                    // If the email has attachments we can get them here:
                    // var attachments = visitor.Attachments;
                }
            }
        }
        catch (Exception ex)
        {
            strError = ex.Message;
        }
    }
}

Using an external source to feed into Biml

Is there a way to reference a CSV or other data source which has table names and source queries to generate Biml documents?
Thanks
Do be aware that C# is not my strong suit and the below may well not be the ideal way to do this. If you do find something more suitable, I would very much like to hear about it :)
The easiest way I have found to include CSV-based metadata in my Biml projects is to load it into C# DataTable objects that I then reference in my Biml as C# variables, which play very well with foreach to iterate through the rows.
Assuming you know how to include C# in your Biml projects (either in the file directly or via a referenced .cs file), you can use the following code:
public static DataTable FlatFileToDataTable(string filePath, char delimiter)
{
    DataTable dt = new DataTable();
    using (StreamReader sr = new StreamReader(filePath))
    {
        // The first line holds the column headers.
        string[] headers = sr.ReadLine().Split(delimiter);
        foreach (string header in headers)
        {
            dt.Columns.Add(header);
        }

        // Each remaining line becomes one DataRow.
        while (!sr.EndOfStream)
        {
            string[] rows = sr.ReadLine().Split(delimiter);
            DataRow dr = dt.NewRow();
            for (int i = 0; i < headers.Length; i++)
            {
                dr[i] = rows[i];
            }
            dt.Rows.Add(dr);
        }
    }
    return dt;
}
I think in order to use the StreamReader you will need to add using System.IO; to your code file as well.
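If the method lives directly in a Biml file rather than in a referenced .cs file, the equivalent is an import directive; a minimal sketch, assuming BimlScript's T4-style directive syntax:
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Data" #>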
Usage would be to define a DataTable object, populate it with the result of the above, and then reference it using code snippets within your Biml:
DataTable YourDataTable = FlatFileToDataTable("<Path to CSV file>", '<Value Delimiter>');
...
<Columns>
    <# foreach (DataRow r in YourDataTable.Rows) { #>
    <Column Name="<#=r["YourColumnName"]#>" etc />
    <# } #>
</Columns>

How to convert xls to Collection<String[]> with Apache POI?

I want to make a parameterized JUnit test using @RunWith(Parameterized.class) and
@Parameterized.Parameters
public static Collection<String[]> testdata() {
    return Arrays.asList(new String[][] {
        { "inParam1", "inParam2", "expectedOut1", "expectedOut2" }
    });
}
The actual test data will be created by business people in Excel.
Is there an easy / generic way to turn an Apache POI XSSFSheet into the prescribed Collection of String arrays?
If yes, can someone provide an example please?
I found this question: Datadriven Testing in TestNG using Apache POI, but I'd expect a kind of a 3-liner ;-)
It isn't quite a 3-liner, but assuming I've correctly understood your needs, you can do something like:
Sheet sheet = workbook.getSheetAt(0);
DataFormatter fmt = new DataFormatter();

List<List<String>> cellData = new ArrayList<List<String>>();
for (Row r : sheet) {
    List<String> rd = new ArrayList<String>();
    for (Cell c : r) {
        // DataFormatter renders each cell the way Excel displays it,
        // regardless of the underlying cell type.
        rd.add(fmt.formatCellValue(c));
    }
    cellData.add(rd);
}
That will, for all defined rows and cells of any type, generate a list of lists of strings: one inner list per row, one string per cell. You can switch from lists to arrays fairly easily.
If you need more control over which cells/rows are or aren't included, including empty ones, see the Iterating guide in the documentation.

How to export data from LINQPad as JSON?

I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need, but I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad:
// Requires a reference to System.Web.Extensions.dll for JavaScriptSerializer.
public static String DumpJson<T>(this T obj)
{
    return obj
        .ToJson()
        .Dump();
}

public static String ToJson<T>(this T obj)
{
    return new System.Web.Script.Serialization.JavaScriptSerializer()
        .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i => new
    {
        Index = i,
        IndexTimesTen = i * 10,
    })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
1. Set the language to C# Statement(s).
2. Add an assembly reference (press F4) to System.Web.Extensions.dll.
3. In the same dialog, add a namespace import for System.Web.Script.Serialization.
4. Use code like the following to dump out your query as JSON:
new JavaScriptSerializer().Serialize(query).Dump();
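Put together, a hypothetical end-to-end version of the workaround, run as C# Statements (the sample data is made up):
// Build some sample data in place of a real DB query.
var query = Enumerable.Range(1, 3)
    .Select(i => new { Id = i, Square = i * i });

// Serialize the query result and render it in the output pane.
new JavaScriptSerializer().Serialize(query).Dump();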
Another solution uses Json.NET, which produces indented formatting and renders JSON dates properly. Add Json.NET from NuGet, reference Newtonsoft.Json.dll in your "My Extensions" query, and add the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;

    var strValue = value as string;
    if (strValue != null)
    {
        // If the value is already a JSON string, re-indent it by
        // round-tripping it through Json.NET.
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }
    return dump;
}
Use .DumpJson() the same way you would use .Dump() to render the result. You can add more .DumpJson() overloads with different signatures if necessary.
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx

Batch convert visual foxpro dbf tables to csv

I have a huge collection of Visual FoxPro dbf files that I would like to convert to csv.
(If you like, you can download some of the data here. Click on the 2011 link for Transaction Data, and prepare to wait a long time...)
I can open each table with DBF View Plus (an awesome freeware utility), but exporting them to csv takes a few hours per file, and I have several dozen files to work with.
Is there a program like DBF View plus that will allow me to set up a batch of dbf-to-csv conversions to run overnight?
Edit: Alternatively, is there a good way to import .dbf files straight into SQL Server 2008? They should all go into one table, as each file is just a subset of records from the same table and should have all the same column names.
Load your list of FoxPro files into an array/list, then call ConvertDbf on each to convert it from FoxPro to a csv file. See the C# console application code below.
Credit: c# datatable to csv for the DataTableToCSV function.
using System;
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Linq;
using System.Text;

namespace SO8843066
{
    class Program
    {
        static void Main(string[] args)
        {
            // The Data Source points to the folder containing the .dbf files.
            string connectionString = @"Provider=VFPOLEDB.1;Data Source=C:\";
            string dbfToConvert = @"C:\yourdbffile.dbf";
            ConvertDbf(connectionString, dbfToConvert, dbfToConvert.Replace(".dbf", ".csv"));

            Console.WriteLine("End of program execution");
            Console.WriteLine("Press any key to end");
            Console.ReadKey();
        }

        static void DataTableToCSV(DataTable dt, string csvFile)
        {
            StringBuilder sb = new StringBuilder();

            // Header row.
            var columnNames = dt.Columns.Cast<DataColumn>().Select(column => column.ColumnName).ToArray();
            sb.AppendLine(string.Join(",", columnNames));

            // Data rows: each field is trimmed and wrapped in double quotes
            // (note: embedded quotes are not escaped).
            foreach (DataRow row in dt.Rows)
            {
                var fields = row.ItemArray.Select(field => field.ToString()).ToArray();
                for (int i = 0; i < fields.Length; i++)
                {
                    sb.Append("\"" + fields[i].Trim());
                    sb.Append((i != fields.Length - 1) ? "\"," : "\"");
                }
                sb.Append("\r\n");
            }
            File.WriteAllText(csvFile, sb.ToString());
        }

        static void ConvertDbf(string connectionString, string dbfFile, string csvFile)
        {
            string sqlSelect = string.Format("SELECT * FROM {0}", dbfFile);
            using (OleDbConnection connection = new OleDbConnection(connectionString))
            {
                using (OleDbDataAdapter da = new OleDbDataAdapter(sqlSelect, connection))
                {
                    DataSet ds = new DataSet();
                    da.Fill(ds);
                    DataTableToCSV(ds.Tables[0], csvFile);
                }
            }
        }
    }
}
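The batch step the answer describes isn't shown above; a minimal sketch (the folder path is hypothetical) that enumerates every .dbf in a folder and converts each one with the ConvertDbf method from the program:
// Convert every .dbf in the folder to a .csv alongside it.
string folder = @"C:\dbf-files"; // hypothetical folder of .dbf files
foreach (string dbf in Directory.GetFiles(folder, "*.dbf"))
{
    ConvertDbf(connectionString, dbf, Path.ChangeExtension(dbf, ".csv"));
}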
In that case, SQL Server I think has the capability of connecting to FoxPro tables. I'm not exactly sure how, as I haven't done it recently (the last time I used SQL Server was about 8+ years ago). I'm sure there are other threads out there that can point you to connecting SQL Server to VFP.
I quickly searched and saw this thread.
In addition, you might need the latest OLE DB provider to establish the connection, which I've also posted in a thread here. That thread also shows a sample of the connection string information you may need from SQL Server. The Data Source should point to the PATH where the .DBF files are found, not to the specific .DBF file you are trying to connect to.
Hope this helps you out.
This works very well, and thanks for the solution. I used it to convert some Visual FoxPro dbf tables to flat files. With these tables, there is the additional challenge of converting fields of type Currency.
Currency fields are a 64-bit (8-byte) signed integer inside a 36-element byte array, starting at the 27th position. The integer is then divided by 10,000 to get the 4-decimal-precision equivalent.
If you have this type of field, try this inside the fields FOR loop:
if (("" + fields[i]).Equals("System.Byte[]"))
{
StringBuilder db = new StringBuilder();
byte[] inbytes = new byte[36];
inbytes = ObjectToByteArray(fields[i]);
db.Append("" + (double)BitConverter.ToInt64(inbytes,27)/1E4);
sb.Append("\"" + db);
}
With the following helper method
// Requires: using System.Runtime.Serialization.Formatters.Binary;
private static byte[] ObjectToByteArray(Object obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}
Check out my answer to Foxbase to postgresql data transfer (dbf files reader).