Import a CSV file without a header row to a DataTable

I have a CSV file, uploaded via an UploadFile control, which has no header row. When I try to read it into a DataTable I get an error, because the first row has the same values in different columns. How can I insert a header row into this file, or read the data from the CSV into a DataTable with predefined columns?
Data in csv:
IO23968 2012 11 AB WI 100162804410W500 0 516.78 0 0 0 N 0
IO24190 2012 11 AB WI 100140604510W500 302 516.78 15617.9 0 15617 N 0
IO24107 2012 11 AB WI 100033104410W500 337 516.78 17456.3 0 17456 N 0
Control:
HtmlInputFile fileOilFile = fileOilSubmission as HtmlInputFile;
if (fileOilFile != null)
strOilFileName = fileOilFile.Value;
DataTable:
DataTable csvData = new DataTable();
csvData.Columns.Add("RoyaltyEntityID", typeof(string));
csvData.Columns.Add("ProductionYear", typeof(int));
csvData.Columns.Add("ProductionMonth", typeof(int));
csvData.Columns.Add("ProductionEntityID", typeof(string));
csvData.Columns.Add("ProductionVolume", typeof(double));
csvData.Columns.Add("SalePrice", typeof(double));
csvData.Columns.Add("GrossRoyaltyAmount", typeof(double));
csvData.Columns.Add("TruckingRate", typeof(double));
csvData.Columns.Add("TotalNetRoyalty", typeof(double));
csvData.Columns.Add("ConfidentialWell", typeof(bool));
csvData.Columns.Add("HoursProductionAmount", typeof(double));
using (StreamReader sr = File.OpenText(strOilFileName))
{
string s = String.Empty;
while ((s = sr.ReadLine()) != null)
{ //we're just testing read speeds
string[] fields = s.Split(',');
csvData.Rows.Add(fields[0], fields[1], fields[2], fields[3]);
}
}

Can you just add the header manually in code, then add the rows as you parse the file?
ex:
DataTable table = new DataTable();
table.Columns.Add("Dosage", typeof(int));
table.Columns.Add("Drug", typeof(string));
table.Columns.Add("Patient", typeof(string));
table.Columns.Add("Date", typeof(DateTime));
foreach (var line in csv)
{
string[] fields = line.Split(',');
// One Rows.Add call per CSV line; each argument fills one column in order.
table.Rows.Add(fields[0], fields[1], fields[2], fields[3]);
}
return table;

The code below shows how you might read a CSV into a DataTable.
This code assumes your CSV can be found at strOilFileName and that the DataTable's schema is what you show in your question. I'm also assuming that your CSV is actually comma-delimited (it doesn't look that way from the sample data in your question).
DataTable csvData = new DataTable();
// ... add columns as you show.
using (StreamReader sr = File.OpenText(strOilFileName)) {
string line = string.Empty;
while ((line = sr.ReadLine()) != null) {
string[] fields = line.Split(',');
if (fields.Length == 13) {
// Create a new empty row based on your DataTable's schema.
var row = csvData.NewRow();
//Start populating the new row with data from the CSV line.
row[0] = fields[0];
// You can't be sure that the data in your CSV can be converted to your DataTable's column's data type so
// always use the TryParse methods when you can.
int prodYear = 0;
if (int.TryParse(fields[1], out prodYear)) {
row[1] = prodYear;
} else {
// Do what when the field's value does not contain a value that can be converted to an int?
// Here I'm setting the field to 2000 but you'll want to throw an Exception, set a different default, etc.
row[1] = 2000;
}
//
// Repeat the above steps for filling the rest of the columns in your DataRow.
//
// Add your new row to your DataTable.
csvData.Rows.Add(row);
} else {
// Do something because Split returned an unexpected number of fields.
}
}
}
The code to read the CSV is fairly simplistic. You might want to look into other CSV parsers that can handle a lot of the parsing details for you. There are a bunch out there.
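One such parser that ships with .NET is TextFieldParser from the Microsoft.VisualBasic.FileIO namespace (on .NET Core/.NET 5+ it needs a reference to the Microsoft.VisualBasic package). Unlike a plain Split, it copes with quoted fields and embedded delimiters. A minimal sketch; the two-column schema and the temp file here are placeholders for illustration, not your real schema:

```csharp
using System;
using System.Data;
using System.IO;
using Microsoft.VisualBasic.FileIO; // add a reference to Microsoft.VisualBasic

class CsvParserDemo
{
    static DataTable ReadCsv(string path)
    {
        var table = new DataTable();
        table.Columns.Add("RoyaltyEntityID", typeof(string));
        table.Columns.Add("ProductionYear", typeof(int));
        using (var parser = new TextFieldParser(path))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(",");
            while (!parser.EndOfData)
            {
                // ReadFields handles quoted fields and embedded commas for you.
                string[] fields = parser.ReadFields();
                table.Rows.Add(fields[0], int.Parse(fields[1]));
            }
        }
        return table;
    }

    static void Main()
    {
        string tmp = Path.GetTempFileName();
        File.WriteAllText(tmp, "IO23968,2012\nIO24190,2012\n");
        DataTable t = ReadCsv(tmp);
        Console.WriteLine(t.Rows.Count); // 2
        File.Delete(tmp);
    }
}
```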

private static DataTable GetDataTableFromCSVFile(string csv_file_path)
{
DataTable csvData = new DataTable();
csvData.Columns.Add("RoyaltyEntityID", typeof(string));
csvData.Columns.Add("ProductionYear", typeof(int));
csvData.Columns.Add("ProductionMonth", typeof(int));
csvData.Columns.Add("ProductionEntityID", typeof(string));
csvData.Columns.Add("ProductionVolume", typeof(double));
csvData.Columns.Add("SalePrice", typeof(double));
csvData.Columns.Add("GrossRoyaltyAmount", typeof(double));
csvData.Columns.Add("TruckingRate", typeof(double));
csvData.Columns.Add("TotalNetRoyalty", typeof(double));
csvData.Columns.Add("ConfidentialWell", typeof(string));
csvData.Columns.Add("HoursProductionAmount", typeof(double));
using (StreamReader sr = new StreamReader(csv_file_path))
{
string line = string.Empty;
while ((line = sr.ReadLine()) != null)
{
string[] strRow = line.Split(',');
DataRow dr = csvData.NewRow();
dr["RoyaltyEntityID"] = strRow[0];
dr["ProductionYear"] = strRow[1];
dr["ProductionMonth"] = strRow[2];
dr["ProductionEntityID"] = strRow[3];
dr["ProductionVolume"] = strRow[4];
dr["SalePrice"] = strRow[5];
dr["GrossRoyaltyAmount"] = strRow[6];
dr["TruckingRate"] = strRow[7];
dr["TotalNetRoyalty"] = strRow[8];
dr["ConfidentialWell"] = strRow[9];
if (strRow[9] == "Y")
{
dr["HoursProductionAmount"] = strRow[10];
}
else
{
dr["HoursProductionAmount"] = "0";
}
csvData.Rows.Add(dr);
}
}
return csvData;
}

Related

Getting error - "Stream was not readable" while reading CSVs and merging them into one

When I run the following code, it fails on the second loop iteration with the error "Stream was not readable". I don't understand why the stream in the second iteration turns out to be closed when it is being created every time through the loop.
string resultCsvContents = "col1,col2,col3\n1,2,3";
using (var streamWriter = new StreamWriter("MergedCsvFile.csv", true))
{
int fileCount = 0;
foreach (var outputFilePath in outputFiles)
{
using (var fileStream = mockService.GetFileStream(outputFilePath))
{
using (var streamReader = new StreamReader(fileStream))
{
if (fileCount != 0)
{
var header = streamReader.ReadLine();
}
streamWriter.Write(streamReader.ReadToEnd());
fileCount++;
}
}
}
}
The output of this should be a CSV with the following data:
col1,col2,col3
1,2,3
1,2,3
1,2,3
Here the mock service is mocked to return:
string resultCsvContents = "col1,col2,col3\n1,2,3";
mockService.Setup(x => x.GetFileStream(It.IsAny<string>()))
.ReturnsAsync(new MemoryStream(Encoding.ASCII.GetBytes(this.resultCsvContents1)));
Is there an issue with the way I am mocking this function?
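The likely culprit is the mock setup: ReturnsAsync captures one MemoryStream instance, so every call to GetFileStream hands back the same stream, which the first using block disposes. Returning a fresh stream per call (with Moq, a lambda such as .Returns(() => new MemoryStream(...))) avoids this. A self-contained sketch of the difference with no Moq dependency; the factory delegates stand in for the mocked service:

```csharp
using System;
using System.IO;
using System.Text;

class StreamReuseDemo
{
    static void Main()
    {
        string csv = "col1,col2,col3\n1,2,3";

        // Reproduces the bug: one shared stream, disposed after the first use.
        var shared = new MemoryStream(Encoding.ASCII.GetBytes(csv));
        Func<Stream> badFactory = () => shared;

        // The fix: hand out a fresh stream on every call.
        Func<Stream> goodFactory = () => new MemoryStream(Encoding.ASCII.GetBytes(csv));

        Console.WriteLine(ReadTwice(badFactory));  // False: second read fails
        Console.WriteLine(ReadTwice(goodFactory)); // True
    }

    static bool ReadTwice(Func<Stream> factory)
    {
        try
        {
            for (int i = 0; i < 2; i++)
                using (var reader = new StreamReader(factory()))
                    reader.ReadToEnd();
            return true;
        }
        catch (ArgumentException) // StreamReader: "Stream was not readable"
        {
            return false;
        }
    }
}
```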

Converting a DataTable column to double in C#

I read some columns from a CSV file and then display them in a DataGridView. The column "Value" contains 3-digit integer values. I want these integer values shown in the DataGridView as doubles with one decimal place. The conversion doesn't seem to work. Also, a large CSV file (around 30k rows) loads immediately without the conversion, but with the conversion it takes far too long.
using (StreamReader str = new StreamReader(openFileDialog1.FileName)) {
CsvReader csvReadFile = new CsvReader(str);
dt = new DataTable();
dt.Columns.Add("Value", typeof(double));
dt.Columns.Add("Time Stamp", typeof(DateTime));
while (csvReadFile.Read()) {
var row = dt.NewRow();
foreach (DataColumn column in dt.Columns) {
row[column.ColumnName] = csvReadFile.GetField(column.DataType, column.ColumnName);
}
dt.Rows.Add(row);
foreach (DataRow row1 in dt.Rows)
{
row1["Value"] = (Convert.ToDouble(row1["Value"])/10);
}
}
}
dataGridView1.DataSource = dt;
Sounds like you have two questions:
How to format the value to a single decimal place of scale.
Why does the convert section of code take so long?
Here are some possibilities:
See this answer, which suggests something along the lines of
String.Format("{0:0.##}", (Decimal)myTable.Rows[rowIndex][columnIndex]);
You are iterating over every row of the DataTable every time you read a line. That means when you read line 10 of the CSV, you iterate over rows 1-9 of the DataTable again, and so on for each line you read! Refactor to pull that loop out of the read loop, something like this:
using (StreamReader str = new StreamReader(openFileDialog1.FileName)) {
CsvReader csvReadFile = new CsvReader(str);
dt = new DataTable();
dt.Columns.Add("Value", typeof(double));
dt.Columns.Add("Time Stamp", typeof(DateTime));
while (csvReadFile.Read()) {
var row = dt.NewRow();
foreach (DataColumn column in dt.Columns) {
row[column.ColumnName] = csvReadFile.GetField(column.DataType, column.ColumnName);
}
dt.Rows.Add(row);
}
foreach (DataRow row1 in dt.Rows)
{
row1["Value"] = (Convert.ToDouble(row1["Value"])/10);
}
}
dataGridView1.DataSource = dt;
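On the formatting side, note that "0.##" shows up to two decimals and drops trailing zeros, while "0.0" pins exactly one decimal place, which is what the question asks for. A small sketch; the commented DataGridView line is an assumption about your grid setup, shown for illustration only:

```csharp
using System;
using System.Globalization;

class FormatDemo
{
    static void Main()
    {
        double raw = 123;         // a 3-digit integer value from the CSV
        double scaled = raw / 10; // 12.3

        // "0.0" forces exactly one decimal place.
        Console.WriteLine(string.Format(CultureInfo.InvariantCulture, "{0:0.0}", scaled));

        // For display-only formatting, setting the grid column's format string
        // avoids touching the data at all (assumes a WinForms DataGridView):
        //   dataGridView1.Columns["Value"].DefaultCellStyle.Format = "0.0";
    }
}
```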

The component cannot be found. (Exception from HRESULT: 0x88982F50) when setting stream to bitmapimage in windows phone 8 app

I am trying to add an element to an existing XML file in an isolated storage folder.
Code:
public void writeToXML(List<AppTile> appList)
{
// Write to the Isolated Storage
XmlWriterSettings x_W_Settings = new XmlWriterSettings();
x_W_Settings.Indent = true;
using (IsolatedStorageFile ISF = IsolatedStorageFile.GetUserStoreForApplication())
{
if (!ISF.FileExists("config.xml"))
{
using (IsolatedStorageFileStream stream = ISF.OpenFile("config.xml", FileMode.CreateNew))
{
XmlSerializer serializer = new XmlSerializer(typeof(List<AppTile>));
using (XmlWriter xmlWriter = XmlWriter.Create(stream, x_W_Settings))
{
serializer.Serialize(xmlWriter, appList);
}
stream.Close();
}
}
else
{
string tileName = null;
string url = null;
string key = null;
byte [] tilePic = null;
XDocument loadedData;
if (appList != null)
{
foreach (AppTile app in appList)
{
tileName = app.TileName;
url = app.Url;
key = app.Key;
tilePic = app.TilePic;
// tilePic = Encoding.UTF8.GetString(app.TilePic,0,app.TilePic.Length);
// tilePic = Encoding.Unicode.GetString(app.TilePic,0,app.TilePic.Length);
// var writer = new BinaryWriter(tilePic);
}
using (Stream stream = ISF.OpenFile("config.xml", FileMode.Open, FileAccess.ReadWrite))
{
loadedData = XDocument.Load(stream);
var RootNode = new XElement("AppTile");
RootNode.Add(new XElement("TileName", tileName));
RootNode.Add(new XElement("Key", key));
RootNode.Add(new XElement("Url", url));
RootNode.Add(new XElement("TilePic", tilePic));
// Find Root Element And Descendants and Append New Node After That
var root = loadedData.Element("ArrayOfAppTile");
var rows = root.Descendants("AppTile");
var lastRow = rows.Last();
lastRow.AddAfterSelf(RootNode);
}
// Save To ISOconfig.xml File
using (IsolatedStorageFileStream newStream = new IsolatedStorageFileStream("config.xml", FileMode.Create, ISF))
{
loadedData.Save(newStream);
newStream.Close();
}
}
}
}
}
While reading from the XML I use an XmlReader and deserialize it to get the List.
Whenever I try to access the TilePic of the AppTile type, I get the error "The component cannot be found." with the following code:
Image img = new Image();
MemoryStream imgStream = new MemoryStream(NewApp.TilePic);//NewApp is AppTile type
BitmapImage imgSource = new BitmapImage();
imgSource.SetSource(imgStream);//here i get error
img.Source = imgSource;
Most likely the TilePic I am saving is not formatted correctly to be retrieved.
Please help!
Why don't you use the SaveJpeg extension method when creating the byte array?
This one might help you!
Windows Phone - byte array to BitmapImage converter throws exception
Solved the issue:
While saving, the byte array was encoded to a format not recognizable when reading it back, so what I have done is save it as a string instead. The updated code is:
else
{
string tileName = null;
string url = null;
string key = null;
string tilePicString = null;
XDocument loadedData;
System.Windows.Media.Imaging.BitmapImage imgSource = new System.Windows.Media.Imaging.BitmapImage();
if (appList != null)
{
foreach (AppTile app in appList)
{
tileName = app.TileName;
url = app.Url;
key = app.Key;
//changed here
tilePicString = System.Convert.ToBase64String(app.TilePic);
}
//same code as above
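For completeness, the read side has to reverse the conversion with Convert.FromBase64String before the bytes go into the MemoryStream for BitmapImage.SetSource. A minimal round-trip sketch; the fake byte array below is a stand-in for app.TilePic:

```csharp
using System;
using System.Text;

class Base64RoundTrip
{
    static void Main()
    {
        // Stand-in for app.TilePic; in the real app this is the image's bytes.
        byte[] tilePic = Encoding.UTF8.GetBytes("fake image bytes");

        // Write side: store the picture as text that XML can hold safely.
        string tilePicString = Convert.ToBase64String(tilePic);

        // Read side: decode before handing the bytes to BitmapImage, e.g.
        //   var imgStream = new MemoryStream(Convert.FromBase64String(tilePicString));
        //   imgSource.SetSource(imgStream);
        byte[] roundTripped = Convert.FromBase64String(tilePicString);

        Console.WriteLine(tilePic.Length == roundTripped.Length); // True
    }
}
```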

How to insert records into multiple tables with multiple files with LINQ to SQL

I have a web application in ASP.NET 3.5. I want to add records to 3 tables, and one table contains the details of multiple files that I associate with another table's primary key value.
For more info I include this code:
protected void submit_Click(object sender, EventArgs e)
{
if (Page.IsValid)
{
using (DataClassesDataContext db = new DataClassesDataContext())
{
int user_id = 0;
var query = from u in db.Users
where u.Username == (String)Session["Username"]
select new
{
Id = u.Id
};
foreach (var item in query)
{
if (item != null)
{
user_id = int.Parse(item.Id.ToString());
break;
}
}
Post myPost = new Post();
myPost.Title = txt_ComName.Text.Trim();
myPost.Category_id = int.Parse(DDL_Categorynames.SelectedItem
.Value.ToString());
myPost.Description = txt_ComName1.Text.Trim();
myPost.User_id = user_id;
myPost.ToUser_id = user_id;
if(file_upload.HasFile)
{
myPost.IsFileAttached = true;
}
else
{
myPost.IsFileAttached = false;
}
db.Posts.InsertOnSubmit(myPost);
db.SubmitChanges();
int newId = myPost.Id;
Flag myFlag = new Flag();
myFlag.IsRead = false;
myFlag.IsImportant = false;
myFlag.IsRemoved = false;
myFlag.User_id = user_id;
myFlag.Post_History_id = newId;
db.Flags.InsertOnSubmit(myFlag);
db.SubmitChanges();
File myFile = new File();
HttpFileCollection fileCollection = Request.Files;
for (int i = 0; i < fileCollection.Count; i++)
{
HttpPostedFile uploadfile = fileCollection[i];
string fileName = Path.GetFileName(uploadfile.FileName);
string fileType = System.IO.Path.GetExtension(fileName)
.ToString().ToLower();
myFile.Post_History_id = newId;
myFile.File_name = fileName;
myFile.File_ext = fileType;
myFile.File_Size = uploadfile.ContentLength.ToString();
if (uploadfile.ContentLength > 0)
{
uploadfile.SaveAs(Server.MapPath("~/PostFiles/")
+ fileName);
db.Files.InsertOnSubmit(myFile);
db.SubmitChanges();
}
}
Panel_AddNew.Visible = false;
Panel_ViewPostList.Visible = true;
this.FillGrid();
}
}
}
However, in this scenario the first selected file is inserted correctly, but when the loop moves on to the second file an error occurs:
Cannot add an entity that already exists.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.InvalidOperationException: Cannot add an entity that already exists.
Source Error:
Line 248: {
Line 249: uploadfile.SaveAs(Server.MapPath("~/PostFiles/") + FileName);
Line 250: db.Files.InsertOnSubmit(myFile);
Line 251: db.SubmitChanges();
Line 252: }
Source File: f:\EasyWeb\EndUser\Post_History.aspx.cs Line: 250
please help me...
Create a new File in each iteration of the loop:
for (int i = 0; i < fileCollection.Count; i++)
{
File myFile = new File();
HttpPostedFile uploadfile = fileCollection[i];
...
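The underlying rule: an entity already registered with the DataContext cannot be passed to InsertOnSubmit again, so reusing one myFile across iterations re-submits the same tracked instance. A LINQ to SQL-free sketch of the pattern; FileEntity is a hypothetical stand-in for the generated File class:

```csharp
using System;
using System.Collections.Generic;

class InsertLoopDemo
{
    // Hypothetical stand-in for the LINQ to SQL File entity.
    class FileEntity { public string Name; }

    static void Main()
    {
        var files = new List<FileEntity>();
        var uploads = new[] { "a.pdf", "b.png" };

        foreach (var upload in uploads)
        {
            // New entity per iteration, as in the fixed code; reusing one
            // instance would add the same object to the collection twice.
            var myFile = new FileEntity { Name = upload };
            files.Add(myFile);
        }

        Console.WriteLine(files.Count);                         // 2
        Console.WriteLine(ReferenceEquals(files[0], files[1])); // False
    }
}
```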

Import CSV and XLS files for bulk registration of users on a website

I am developing a website and I want to register school children in bulk: the school will provide an Excel sheet, and when I upload that sheet it should automatically register the users in the UserInfo table.
Here is the code:
if (Request.Files["FileUpload1"] != null && Request.Files["FileUpload1"].ContentLength > 0)
{
string extension = System.IO.Path.GetExtension(Request.Files["FileUpload1"].FileName);
string path1 = string.Format("{0}/{1}", Server.MapPath("~/Content/UploadedFolder"), Request.Files["FileUpload1"].FileName);
if (System.IO.File.Exists(path1))
System.IO.File.Delete(path1);
Request.Files["FileUpload1"].SaveAs(path1);
string sqlConnectionString = @"Data Source=(LocalDb)\v11.0;Initial Catalog=aspnet-Planetskool-20130901224446;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|\aspnet-Planetskool-20130901224446.mdf;Database=DefaultConnection; Trusted_Connection=true;Persist Security Info=True";
//Create connection string to Excel work book
string excelConnectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path1 + ";Extended Properties=Excel 12.0;Persist Security Info=False";
//Create Connection to Excel work book
OleDbConnection excelConnection = new OleDbConnection(excelConnectionString);
//Create OleDbCommand to fetch data from Excel
OleDbCommand cmd = new OleDbCommand("Select [UserInfoID],[UserID],[GraphID],[UserLevelEnumId],[Title],[FirstName],[MiddleName],[LastName],[Birthdate],[Gender],[Email],[MobileNo],[Country],[Zipcode],[CountFollowers],[CountFollows],[CountFiles],[CountPhotos],[Quote],[AvatarURL],[isVerified],[VerificationCount],[UserEnumType],[UserCreatorId] from [Sheet1$]", excelConnection);
excelConnection.Open();
OleDbDataReader dReader;
dReader = cmd.ExecuteReader();
SqlBulkCopy sqlBulk = new SqlBulkCopy(sqlConnectionString);
//Give your Destination table name
sqlBulk.DestinationTableName = "UserInfo";
sqlBulk.WriteToServer(dReader);
excelConnection.Close();
// SQL Server Connection String
}
return RedirectToAction("Import");
Your code should look something like this:
if (Request.Files["FileUpload1"].ContentLength > 0)
{
string fileExtension = System.IO.Path.GetExtension(Request.Files["FileUpload1"].FileName);
if (fileExtension == ".xls" || fileExtension == ".xlsx")
{
// Create a folder in App_Data named ExcelFiles because you need to save the file temporarily location and getting data from there.
string path1 = string.Format("{0}/{1}", Server.MapPath("~/Content/UploadedFolder"), Request.Files["FileUpload1"].FileName);
if (System.IO.File.Exists(path1))
System.IO.File.Delete(path1);
Request.Files["FileUpload1"].SaveAs(path1);
string sqlConnectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path1 + ";Extended Properties=Excel 12.0;Persist Security Info=False";
//Create Connection to Excel work book and add oledb namespace
OleDbConnection excelConnection = new OleDbConnection(sqlConnectionString);
excelConnection.Open();
DataTable dt = new DataTable();
dt = excelConnection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
if (dt == null)
{
return null;
}
String[] excelSheets = new String[dt.Rows.Count];
int t = 0;
//excel data saves in temp file here.
foreach (DataRow row in dt.Rows)
{
excelSheets[t] = row["TABLE_NAME"].ToString();
Debug.Write("SheetTitle = " + excelSheets[t]);
t++;
}
OleDbConnection excelConnection1 = new OleDbConnection(sqlConnectionString);
DataSet ds = new DataSet();
string query = string.Format("Select * from [{0}]", excelSheets[0]);
using (OleDbDataAdapter dataAdapter = new OleDbDataAdapter(query, excelConnection1))
{
dataAdapter.Fill(ds);
}
for (int j = 0; j <= ds.Tables[0].Rows.Count - 1; j++)
{
}
}
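The snippet ends before the per-row loop does anything. One way to finish it, sketched here against an in-memory stand-in for ds.Tables[0]; the column names and the registration step are assumptions, and in the real app you would create the membership user or insert into UserInfo at the marked spot:

```csharp
using System;
using System.Data;

class RowLoopDemo
{
    static void Main()
    {
        // Stand-in for ds.Tables[0] as filled from the worksheet.
        var sheet = new DataTable();
        sheet.Columns.Add("FirstName", typeof(string));
        sheet.Columns.Add("Email", typeof(string));
        sheet.Rows.Add("Asha", "asha@example.com");
        sheet.Rows.Add("Ravi", "ravi@example.com");

        int registered = 0;
        for (int j = 0; j <= sheet.Rows.Count - 1; j++)
        {
            DataRow row = sheet.Rows[j];
            // In the real app: create the membership user / insert into
            // UserInfo here (e.g. via per-row INSERTs or SqlBulkCopy).
            string email = row["Email"].ToString();
            if (!string.IsNullOrEmpty(email)) registered++;
        }
        Console.WriteLine(registered); // 2
    }
}
```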