Let's say I have a table Employee like this
EmpID, EmpName
1 , hatem
and I write a query: select * from Employee for xml auto
so the output will be in XML format.
I want to know how I can export the result to an XML file saved on my computer's drive, as I need to read the XML files from this folder and deserialize them in my .NET application.
If you only need to store the XML and not do anything else with it, this is probably the easiest way to accomplish it - using straight, simple ADO.NET:
string query = "SELECT EmployeeID, LastName, FirstName, Title, BirthDate, HireDate FROM dbo.Employees FOR XML AUTO";

using (SqlConnection _con = new SqlConnection("server=(local);database=Northwind;integrated security=SSPI;"))
using (SqlCommand _cmd = new SqlCommand(query, _con))
{
    _con.Open();
    string result = _cmd.ExecuteScalar().ToString();
    _con.Close();

    File.WriteAllText(@"D:\test.xml", result);
}
This will create a file D:\test.xml (or change that to match your system) and will put those XML tags into that file.
The SqlCommand object also has an .ExecuteXmlReader() method which returns an XmlReader object to scan and manipulate the XML - not just a string. One caveat: SQL Server sends FOR XML results back in chunks of roughly 2,000 characters, and ExecuteScalar() only picks up the first chunk, so large results get silently truncated - for those, prefer ExecuteXmlReader() (see the sketch below). Use whatever makes the most sense to you!
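For larger results, here's a minimal sketch of the streaming variant (it reuses the query string and connection string from above - those are carried-over assumptions, not new API):

using System.Data.SqlClient;
using System.Xml;

using (SqlConnection con = new SqlConnection("server=(local);database=Northwind;integrated security=SSPI;"))
using (SqlCommand cmd = new SqlCommand(query, con))
{
    con.Open();

    // FOR XML without a ROOT clause can return multiple top-level elements,
    // so the writer needs fragment conformance.
    XmlWriterSettings settings = new XmlWriterSettings { ConformanceLevel = ConformanceLevel.Fragment };

    using (XmlReader reader = cmd.ExecuteXmlReader())
    using (XmlWriter writer = XmlWriter.Create(@"D:\test.xml", settings))
    {
        reader.Read();
        while (!reader.EOF)
        {
            // WriteNode copies the current element and its subtree to the file,
            // then advances the reader to the next sibling.
            writer.WriteNode(reader, true);
        }
    }
}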
PS: also, the output of FOR XML AUTO is a bit ... let's say ... suboptimal. It uses dbo.Employees as its main XML tag and so forth. With SQL Server 2008, I would strongly recommend you look into using FOR XML PATH instead - it allows you to tweak and customize the layout of the XML output.
Compare your original XML output with FOR XML AUTO
<dbo.Employees _x0040_ID="1" LastName="Davolio" FirstName="Nancy" Title="Sales Representative" BirthDate="1948-12-08T00:00:00" HireDate="1992-05-01T00:00:00" />
<dbo.Employees _x0040_ID="2" LastName="Fuller" FirstName="Andrew" Title="Vice President, Sales" BirthDate="1952-02-19T00:00:00" HireDate="1992-08-14T00:00:00" />
against this query - just to see the difference:
SELECT
    [EmployeeID] AS '@ID',
    [LastName], [FirstName],
    [Title],
    [BirthDate], [HireDate]
FROM
    [dbo].[Employees]
FOR XML PATH('Employee'), ROOT('Employees')
Output is:
<Employees>
  <Employee ID="1">
    <LastName>Davolio</LastName>
    <FirstName>Nancy</FirstName>
    <Title>Sales Representative</Title>
    <BirthDate>1948-12-08T00:00:00</BirthDate>
    <HireDate>1992-05-01T00:00:00</HireDate>
  </Employee>
  <Employee ID="2">
    <LastName>Fuller</LastName>
    <FirstName>Andrew</FirstName>
    <Title>Vice President, Sales</Title>
    <BirthDate>1952-02-19T00:00:00</BirthDate>
    <HireDate>1992-08-14T00:00:00</HireDate>
  </Employee>
</Employees>
I've had the same problem, and I've created a .NET CLR stored procedure that exports XML to a file:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
using System.Text;
using System.Xml;
using System.IO;
// StringWriter reports UTF-16 by default; this subclass makes the XML declaration
// match the encoding the file is actually written with.
public sealed class StringWriterWithEncoding : StringWriter
{
    private readonly Encoding encoding;

    public StringWriterWithEncoding(Encoding encoding)
    {
        this.encoding = encoding;
    }

    public override Encoding Encoding
    {
        get { return encoding; }
    }
}

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void XMLExport(SqlXml InputXml, SqlString OutputFile)
    {
        if (!InputXml.IsNull && !OutputFile.IsNull)
        {
            XmlDocument doc = new XmlDocument();
            doc.LoadXml(InputXml.Value);

            StringWriterWithEncoding sw = new StringWriterWithEncoding(System.Text.Encoding.UTF8);
            XmlWriterSettings settings = new XmlWriterSettings
            {
                Indent = true,
                IndentChars = " ",
                NewLineChars = "\r\n",
                NewLineHandling = NewLineHandling.Replace,
                Encoding = System.Text.Encoding.UTF8
            };

            // Round-trip the document through an XmlWriter to get the indented form.
            using (XmlWriter writer = XmlWriter.Create(sw, settings))
            {
                doc.Save(writer);
            }

            System.IO.File.WriteAllText(OutputFile.ToString(), sw.ToString(), System.Text.Encoding.UTF8);
        }
        else
        {
            throw new Exception("Parameters must be set");
        }
    }
}
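Before the procedure can be called, the assembly has to be registered in SQL Server. A minimal deployment sketch (the DLL path is hypothetical; EXTERNAL_ACCESS is required because the procedure writes to the file system, and the server/database must be configured to allow it):

CREATE ASSEMBLY XmlExport
FROM 'C:\CLR\XmlExport.dll'  -- hypothetical path to the compiled class library
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO

CREATE PROCEDURE dbo.XMLExport
    @InputXml xml,
    @OutputFile nvarchar(4000)
AS EXTERNAL NAME XmlExport.StoredProcedures.XMLExport;
GO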
Here's an example of how to use it:
DECLARE @x xml
SET @x = '<Test><Something>1</Something><AnotherOne>2</AnotherOne></Test>'
EXEC dbo.XMLExport @x, 'c:\test.xml'
And the output is a nicely formatted XML file:
<?xml version="1.0" encoding="utf-8"?>
<Test>
 <Something>1</Something>
 <AnotherOne>2</AnotherOne>
</Test>
I am querying the Wikipedia API and am getting JSON back that looks like this:
https://en.wikipedia.org/w/api.php?action=query&prop=pageimages&titles=cessna%20172&pithumbsize=500&format=json
{"batchcomplete":"","query":{"normalized":[{"from":"cessna 172","to":"Cessna 172"}],"pages":{"173462":{"pageid":173462,"ns":0,"title":"Cessna 172","thumbnail":{"source":"https://upload.wikimedia.org/wikipedia/commons/thumb/a/ae/Cessna_172S_Skyhawk_SP%2C_Private_JP6817606.jpg/500px-Cessna_172S_Skyhawk_SP%2C_Private_JP6817606.jpg","width":500,"height":333},"pageimage":"Cessna_172S_Skyhawk_SP,_Private_JP6817606.jpg"}}}}
Using .Net Core 2.2, what is the proper way to get the image thumbnail out of this (the source property in this case)?
Parsing JSON is not a built-in feature of .NET Core 2.2, so you will want to add the Newtonsoft.Json package to the project with dotnet add package Newtonsoft.Json --version 12.0.3.
From there, include it by adding using Newtonsoft.Json.Linq; to the top of the file, and using System.Net; to use WebClient.
The code then retrieves the string from the URL, JObject.Parse parses the string as a JObject, and we can get the property you want by chaining indexers: ["query"]["pages"]["173462"]["thumbnail"]["source"].
Full source:
using System;
using System.Net;
using Newtonsoft.Json.Linq;

class Program
{
    static void Main(string[] args)
    {
        const string url = "https://en.wikipedia.org/w/api.php?action=query&prop=pageimages&titles=cessna%20172&pithumbsize=500&format=json";

        using (WebClient client = new WebClient())
        {
            string rawString = client.DownloadString(url);
            var jsonResult = JObject.Parse(rawString);

            // JToken only converts to string explicitly, hence the cast.
            string thumbnail = (string)jsonResult["query"]["pages"]["173462"]["thumbnail"]["source"];
            Console.WriteLine(thumbnail);
        }
    }
}
Ideally, you would define a class that mirrors the JSON and deserialize into it. Example:
Account account = JsonConvert.DeserializeObject<Account>(json);
More details are in the Json.NET documentation.
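For the Wikipedia response above, a sketch of what such classes might look like (the type and property names are illustrative, not an established schema; the pages object is keyed by page id, so a dictionary avoids hard-coding "173462"):

using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json;

// Illustrative types for the pageimages response shown above.
public class WikiResponse
{
    [JsonProperty("query")]
    public WikiQuery Query { get; set; }
}

public class WikiQuery
{
    // Keyed by page id ("173462" in the example).
    [JsonProperty("pages")]
    public Dictionary<string, WikiPage> Pages { get; set; }
}

public class WikiPage
{
    [JsonProperty("title")]
    public string Title { get; set; }

    [JsonProperty("thumbnail")]
    public WikiThumbnail Thumbnail { get; set; }
}

public class WikiThumbnail
{
    [JsonProperty("source")]
    public string Source { get; set; }
}

// Usage:
// var response = JsonConvert.DeserializeObject<WikiResponse>(rawString);
// string thumbnail = response.Query.Pages.Values.First().Thumbnail.Source;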
However, when you just need one or two values, an entire class structure can be overkill. In that case, a quick way is to parse the JSON dynamically. Example:
public void JValueParsingTest()
{
    var jsonString = @"{""Name"":""Rick"",""Company"":""West Wind"",
""Entered"":""2012-03-16T00:03:33.245-10:00""}";

    dynamic json = JValue.Parse(jsonString);

    // values require casting
    string name = json.Name;
    string company = json.Company;
    DateTime entered = json.Entered;

    Assert.AreEqual(name, "Rick");
    Assert.AreEqual(company, "West Wind");
}
Is there a way to reference a csv or other data source which has table names and source queries to generate BIML documents?
Thanks
Do be aware C# is not my strong suit and the below may well not be the ideal way to do this. If you do find something more suitable, I would very much like to hear about it :)
The easiest way I have found to include CSV-based metadata in my Biml projects is to load it into C# DataTable objects that I then reference in my Biml as C# variables, which play very well with foreach to iterate through the rows.
Assuming you are aware how to include C# in your Biml projects (either in the file directly or via referenced .cs file), you can use the following code:
// Reads a delimited flat file into a DataTable, using the first line as column headers.
public static DataTable FlatFileToDataTable(string filePath, char delimiter)
{
    DataTable dt = new DataTable();

    using (StreamReader sr = new StreamReader(filePath))
    {
        string[] headers = sr.ReadLine().Split(delimiter);
        foreach (string header in headers)
        {
            dt.Columns.Add(header);
        }

        while (!sr.EndOfStream)
        {
            string[] rows = sr.ReadLine().Split(delimiter);
            DataRow dr = dt.NewRow();
            for (int i = 0; i < headers.Length; i++)
            {
                dr[i] = rows[i];
            }
            dt.Rows.Add(dr);
        }
    }

    return dt;
}
I think in order to use the StreamReader you will need to add using System.IO; to your code file as well (and using System.Data; for the DataTable).
Usage would be to define a DataTable object and populate it with the result of the above, then to reference it using code snippets within your Biml:
DataTable YourDataTable = FlatFileToDataTable("<Path to CSV file>",'<Value Delimiter>');
...
<Columns>
<# foreach(DataRow r in YourDataTable.Rows){ #>
<Column Name="<#=r["YourColumnName"]#>" etc />
<# } #>
</Columns>
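For example, given a metadata file laid out like this (a hypothetical two-column layout matching the question's table names and source queries):

TableName,SourceQuery
DimCustomer,SELECT * FROM dbo.Customer
DimProduct,SELECT * FROM dbo.Product

each row is then addressable inside the code nuggets as r["TableName"] and r["SourceQuery"].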
Referencing this example of using "json:Array": Converting between JSON and XML
I have two questions:
1. Does the namespace have to be "json"? I.e. if ns2 matched back to xmlns:ns2='http://james.newtonking.com/projects/json', would that work?
2. Can the namespace be omitted? Can I just put Array='true'?
I'm about to try to test by trial and error, but thought maybe somebody would know the answer, or someone in the future would like to know.
Not that it matters a whole lot, but my XML is being generated by BizTalk 2010 and I'm using a WCF custom behavior to call Newtonsoft as follows:
private static ConvertedJSON ConvertXMLToJSON(string xml)
{
    // To convert an XML node contained in string xml into a JSON string
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xml);

    ConvertedJSON convertedJSON = new ConvertedJSON();
    convertedJSON.JSONtext = JsonConvert.SerializeXmlNode(doc, Newtonsoft.Json.Formatting.None);
    convertedJSON.rootElement = doc.DocumentElement.Name;

    return convertedJSON;
}
Looks like the namespace has to be exactly what they provide:
string xmlToConvert2 = "<myRoot xmlns:json='http://james.newtonking.com/projects/json'><MyText json:Array='true'>This is the text here</MyText><Prices><SalesPrice>10.00</SalesPrice></Prices></myRoot>";
string strJSON2 = ConvertXMLToJSON(xmlToConvert2).JSONtext;
As with normal XML, the namespace prefix can be any value; only the namespace URI has to match. The following worked equally well:
string xmlToConvert3 = "<myRoot xmlns:abc='http://james.newtonking.com/projects/json'><MyText abc:Array='true'>This is the text here</MyText><Prices><SalesPrice>10.00</SalesPrice></Prices></myRoot>";
string strJSON3 = ConvertXMLToJSON(xmlToConvert3).JSONtext;
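In both cases the json:Array attribute makes Json.NET emit MyText as a single-element array, so the output should look roughly like this (an expectation based on the documented behavior, not a captured result):

{"myRoot":{"MyText":["This is the text here"],"Prices":{"SalesPrice":"10.00"}}}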
I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need; however, I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad (they rely on System.Web.Extensions.dll - see the assembly reference steps in the next answer):
public static String DumpJson<T>(this T obj)
{
    return
        obj
            .ToJson()
            .Dump();
}

public static String ToJson<T>(this T obj)
{
    return
        new System.Web.Script.Serialization.JavaScriptSerializer()
            .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i =>
        new
        {
            Index = i,
            IndexTimesTen = i * 10,
        })
    .DumpJson();
I added "ToJson" separately so it can be used with "Expressions".
This is not directly supported, and I have opened a feature request for it. Vote for it if you would also find it useful.
A workaround for now is to do the following:
1. Set the language to C# Statement(s)
2. Add an assembly reference (press F4) to System.Web.Extensions.dll
3. In the same dialog, add a namespace import to System.Web.Script.Serialization
4. Use code like the following to dump out your query as JSON:
new JavaScriptSerializer().Serialize(query).Dump();
There's a solution with Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET from NuGet, add a reference to Newtonsoft.Json.dll in your "My Extensions" query, and add the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;

    var strValue = value as string;
    if (strValue != null)
    {
        // Re-indent a JSON string by round-tripping it through Json.NET.
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }

    return dump;
}
Use .DumpJson() as you would .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx
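For example, something along these lines should write a query's output as JSON from the command line (a sketch; check the lprun page above for the exact switches):

lprun -format=json MyQuery.linq > output.json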
I have a huge collection of visual foxpro dbf files that I would like to convert to csv.
(If you like, you can download some of the data here. Click on the 2011 link for Transaction Data, and prepare to wait a long time...)
I can open each table with DBF View Plus (an awesome freeware utility), but exporting them to csv takes a few hours per file, and I have several dozen files to work with.
Is there a program like DBF View plus that will allow me to set up a batch of dbf-to-csv conversions to run overnight?
Edit: Alternatively, is there a good way to import .dbf files straight into SQL Server 2008? They should all go into 1 table, as each file is just a subset of records from the same table and should have all the same column names.
Load up your list of FoxPro files into an array/list, then call ConvertDbf on each to convert them from FoxPro to CSV files. See the C# console application code below.
Credit: "c# datatable to csv" for the DataTableToCSV function.
using System;
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Linq;
using System.Text;

namespace SO8843066
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = @"Provider=VFPOLEDB.1;Data Source=C:\";
            string dbfToConvert = @"C:\yourdbffile.dbf";

            ConvertDbf(connectionString, dbfToConvert, dbfToConvert.Replace(".dbf", ".csv"));

            Console.WriteLine("End of program execution");
            Console.WriteLine("Press any key to end");
            Console.ReadKey();
        }

        // Note: fields are quoted, but embedded quotes are not escaped.
        static void DataTableToCSV(DataTable dt, string csvFile)
        {
            StringBuilder sb = new StringBuilder();

            var columnNames = dt.Columns.Cast<DataColumn>().Select(column => column.ColumnName).ToArray();
            sb.AppendLine(string.Join(",", columnNames));

            foreach (DataRow row in dt.Rows)
            {
                var fields = row.ItemArray.Select(field => field.ToString()).ToArray();
                for (int i = 0; i < fields.Length; i++)
                {
                    sb.Append("\"" + fields[i].Trim());
                    sb.Append((i != fields.Length - 1) ? "\"," : "\"");
                }
                sb.Append("\r\n");
            }

            File.WriteAllText(csvFile, sb.ToString());
        }

        static void ConvertDbf(string connectionString, string dbfFile, string csvFile)
        {
            string sqlSelect = string.Format("SELECT * FROM {0}", dbfFile);

            using (OleDbConnection connection = new OleDbConnection(connectionString))
            using (OleDbDataAdapter da = new OleDbDataAdapter(sqlSelect, connection))
            {
                DataSet ds = new DataSet();
                da.Fill(ds);
                DataTableToCSV(ds.Tables[0], csvFile);
            }
        }
    }
}
In that case: SQL Server, I think, has the capability of connecting to FoxPro tables. I'm not exactly sure how, as I haven't done it recently (I last used SQL Server about 8+ years ago), but I'm sure there are other threads out there that can point you to connecting SQL Server to VFP; a quick search turned one up.
In addition, you might need the latest OLE DB provider to establish the connection, which I've also posted about in another thread. That thread also shows a sample of the connection string information you may need from SQL Server. The data source information should point to the PATH where the .DBF files are found, and not the specific name of the .DBF you are trying to connect to.
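As a rough sketch of what such a query could look like (assuming the VFP OLE DB provider is installed and ad hoc distributed queries are enabled; the folder, target table, and .dbf names are placeholders):

SELECT *
INTO dbo.Transactions2011
FROM OPENROWSET('VFPOLEDB',
                'C:\data\'; ''; '',
                'SELECT * FROM transactions.dbf');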
Hope this helps you out.
This works very well, and thanks for the solution. I used this to convert some Visual FoxPro DBF tables to flat files. With these tables there is the additional challenge of converting fields of type Currency.
Currency fields come through as a 64-bit (8-byte) signed integer embedded in a 36-element byte array, starting at position 27. The integer is then divided by 10,000 to get the 4-decimal-precision equivalent.
If you have this type of field, try this inside the fields for loop (in place of the plain sb.Append call for these fields):
if (("" + fields[i]).Equals("System.Byte[]"))
{
    StringBuilder db = new StringBuilder();
    byte[] inbytes = ObjectToByteArray(fields[i]);

    // The currency value is a scaled Int64 at offset 27 of the serialized array.
    db.Append("" + (double)BitConverter.ToInt64(inbytes, 27) / 1E4);
    sb.Append("\"" + db);
}
With the following helper method (which needs using System.Runtime.Serialization.Formatters.Binary; and using System.IO;):
private static byte[] ObjectToByteArray(Object obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}
Check out my answer to "Foxbase to postgresql data transfer (dbf files reader)".