SqlBulkCopy does not store accented characters properly - sql-server-2008

I am inserting French-language text into an nvarchar column in SQL Server 2008. The accented French characters are not stored properly in the database.
string strData = "Accented chars- Les caractères accentués français ";
DataTable dtTemp = new DataTable();
dtTemp.Columns.Add("ID", typeof(string));
dtTemp.Columns.Add("Value", typeof(string));
DataRow dr = dtTemp.NewRow();
dr["ID"] = "100";
dr["Value"] = strData;
dtTemp.Rows.Add(dr);
strSQLCon = GetSQLConnectionString();
using (SqlConnection cn = new SqlConnection(strSQLCon))
{
    cn.Open();
    using (SqlBulkCopy copy = new SqlBulkCopy(cn))
    {
        copy.ColumnMappings.Add("ID", "ID");
        copy.ColumnMappings.Add("Value", "Value");
        copy.DestinationTableName = "MYTABLE";
        copy.WriteToServer(dtTemp);
    }
}
The French characters are not stored properly in the SQL Server database.
It works fine when I do a normal insert query: insert into MYTABLE values (1, 'Accented chars- Les caractères accentués français')
Please let me know why it does not work with the SqlBulkCopy class. Do any settings need to be changed, or does the C# code need to be modified, to store the non-English characters properly?

I designed this table with the collation for every column set to French_CI_AS (French, case-insensitive, accent-sensitive). Every SQL string type is covered.
I am building a typed dataset for this table (the details of that are not the purpose of this question).
Sql bulk copy:
var ds = new FrenchCharacters.FrenchDataSet();
using (var destinationConnection = new SqlConnection(StackOverflow.Properties.Settings.Default.StackOverflowConnectionString))
{
    destinationConnection.Open();
    //all French characters http://en.wikipedia.org/wiki/French_orthography
    string[] sArray = new string[] {
        "Àà, Ââ, Ææ, Ää"
        , "Çç"
        , "Îî, Ïï"
        , "Ôô, Œœ, Öö"
        , "Ùù, Ûû, Üü"
        , "Ÿÿ"
    };
    // open the connection
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection.ConnectionString))
    {
        bulkCopy.BatchSize = 500;
        bulkCopy.NotifyAfter = 10000;
        bulkCopy.DestinationTableName = "French";
        //
        // build data table to be written to the server
        // data table is now strongly-typed ds.French
        //
        for (int i = 0; i < 100; i++)
        {
            foreach (string s in sArray)
                ds.French.AddFrenchRow(s, s, s, s, s, s);
        }
        //
        bulkCopy.WriteToServer(ds.French);
    }
}
result:
Notice there are no invalid entries, whatever the SQL character type.

I tested your code on the following table and it worked fine, at least on SQL Server 2012.
CREATE TABLE [dbo].[MYTABLE](
    [ID] [varchar](20) NOT NULL,
    [Value] [nvarchar](255) NOT NULL)

Thanks for your replies. The issue occurs while reading the CSV file into the data table before the bulk insert. I included the encoding parameter (Encoding.Default) while reading the CSV file, and now the French text loads properly and is stored in the SQL database without any issues.
Old code:
List<string> lstData = File.ReadAllLines(stFile).ToList();
Working code:
List<string> lstData = File.ReadAllLines(stFile, Encoding.Default).ToList();
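As a side note, Encoding.Default resolves to the machine's ANSI code page, so the result depends on the regional settings of whichever machine runs the code. If the CSV file is actually saved as UTF-8, passing that encoding explicitly is the more portable choice. A minimal sketch, assuming stFile is the same CSV path used above:
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;

// Read the CSV as UTF-8 rather than relying on the OS ANSI code page
// (only correct if the file really is UTF-8 encoded).
List<string> lstData = File.ReadAllLines(stFile, Encoding.UTF8).ToList();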
thanks
Ashok

Related

How to read from MySQL table Polygon data

I am developing an application in which location data needs to be stored in a MySQL table. In addition to point locations, I need regions (polygons) as well.
I am currently writing the polygon coordinates as follows:
oMySQLConnecion = new MySqlConnection(DatabaseConnectionString);
if (oMySQLConnecion.State == System.Data.ConnectionState.Closed || oMySQLConnecion.State == System.Data.ConnectionState.Broken)
{
    oMySQLConnecion.Open();
}
if (oMySQLConnecion.State == System.Data.ConnectionState.Open)
{
    string Query = @"INSERT INTO region (REGION_POLYGON) VALUES (PolygonFromText(@Parameter1))";
    MySqlCommand oCommand = new MySqlCommand(Query, oMySQLConnecion);
    oCommand.Parameters.AddWithValue("@Parameter1", PolygonString);
    int sqlSuccess = oCommand.ExecuteNonQuery();
    oMySQLConnecion.Close();
    oDBStatus.Type = DBDataStatusType.SUCCESS;
    oDBStatus.Message = DBMessageType.SUCCESSFULLY_DATA_UPDATED;
    return oDBStatus;
}
After the execution, I see the BLOB in the MySQL table.
Now I want to read the data back for my testing, and it does not work the way I tried below:
if (oMySQLConnecion.State == System.Data.ConnectionState.Open)
{
    string Query = @"SELECT REGION_ID,REGION_NICK_NAME,GeomFromText(REGION_POLYGON) AS POLYGON FROM region WHERE REGION_USER_ID = @Parameter1";
    MySqlCommand oCommand = new MySqlCommand(Query, oMySQLConnecion);
    oCommand.Parameters.AddWithValue("@Parameter1", UserID);
    using (var reader = oCommand.ExecuteReader())
    {
        while (reader.Read())
        {
            R_PolygonCordinates oRec = new R_PolygonCordinates();
            oRec.RegionNumber = Convert.ToInt32(reader["REGION_ID"]);
            oRec.RegionNickName = reader["REGION_NICK_NAME"].ToString();
            oRec.PolygonCodinates = reader["POLYGON"].ToString();
            polygons.Add(oRec);
        }
    }
    int sqlSuccess = oCommand.ExecuteNonQuery();
    oMySQLConnecion.Close();
    return polygons;
}
It returns an empty string.
I am not sure whether I am really writing the data, since I cannot read the BLOB back.
Is my reading syntax incorrect?
Note: I am using Visual Studio 2017 and the latest MySQL version with the spatial classes.
Any help is highly appreciated.
Thanks
GeomFromText() takes a WKT (the standardized "well-known text" format) value as input and returns the MySQL internal geometry type as output.
This is the inverse of what you need, which is ST_AsWKT() or ST_AsText() -- take an internal-format geometry object as input and return WKT as output.
Prior to 5.6, the function is called AsWKT() or AsText(). In 5.7 these are all synonyms for exactly the same function, but the non-ST_* functions are deprecated and will be removed in the future.
https://dev.mysql.com/doc/refman/5.7/en/gis-format-conversion-functions.html#function_st-astext
I don't know for certain what the ST_ prefix means, but I assume it's "spatial type." There's some discussion in WL#8055 that may be of interest.
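To illustrate the fix in the context of the question's C# code, here is a minimal sketch (the column and parameter names are taken from the question; on servers older than 5.7 the function is AsText rather than ST_AsText):
// Return the polygon as WKT so the reader can consume it as a string.
string Query = @"SELECT REGION_ID, REGION_NICK_NAME, ST_AsText(REGION_POLYGON) AS POLYGON
                 FROM region WHERE REGION_USER_ID = @Parameter1";
// reader["POLYGON"].ToString() then yields WKT such as "POLYGON((30 10, 40 40, 20 40, 10 20, 30 10))".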

Load 3D model in Unity using Resource folder and Mysql

I want to load a 3D model using the Resources folder. I created an SQL database to store the address. In this case I stored the file "deer-3ds" in the folder "Models" and also saved this information in a table named "modeladdress" in SQL.
So please help me to correct my code. I know that it's 100% wrong, but I don't know how to fix it. Thank you.
using UnityEngine;
using System.Collections;
using System;
using System.Data;
using Mono.Data.Sqlite;

public class addobject : MonoBehaviour {
    // Use this for initialization
    void Start () {
        //GameObject deer=Instantiate(Resources.Load("deer-3d.bak",typeof(GameObject)))as GameObject;
        // GameObject instance = Instantiate(Resources.Load("Models/deer-3ds", typeof(GameObject))) as GameObject;
        string conn = "URI=file:" + Application.dataPath + "/modeladdress.s3db"; //Path to database.
        IDbConnection dbconn;
        dbconn = (IDbConnection) new SqliteConnection(conn);
        dbconn.Open(); //Open connection to the database.
        IDbCommand dbcmd = dbconn.CreateCommand();
        string sqlQuery = "SELECT ordinary,foldername, filename " + "FROM modeladdress";
        dbcmd.CommandText = sqlQuery;
        IDataReader reader = dbcmd.ExecuteReader();
        while (reader.Read ()) {
            int ordinary = reader.GetInt32 (0);
            string foldername = reader.GetString (1);
            string filename = reader.GetString (2);
            string path = foldername + "/" + filename;
            //Debug.Log( "value= "+value+" name ="+name+" random ="+ rand);
            GameObject instance = Instantiate(Resources.Load(path, typeof(GameObject))) as GameObject;
            instance.SetActive (true);
        }
        reader.Close();
        reader = null;
        dbcmd.Dispose();
        dbcmd = null;
        dbconn.Close();
        dbconn = null;
    }

    // Update is called once per frame
    void Update () {
        // GameObject instance = Instantiate(Resources.Load("Models/deer-3ds", typeof(GameObject))) as GameObject;
        // instance.SetActive (true);
    }
}
First of all, you are using SQLite as your database management system, not MySQL. Second, the way you have written your query,
string sqlQuery = "SELECT ordinary,foldername, filename " + "FROM modeladdress";
will return the ordinary, foldername, and filename for every model. You need to use a WHERE clause to specify precisely which model you want. Thus, you need some way to know which model you want to query from the database before you actually execute the query, and in that case, why even query a database? You're going to have to store some unique identifier anyway, so a database solves nothing.
Now, concerning the actual code you have written, it appears to be correct (i.e. it should be returning what you want). The problem must be that your table is empty, the values that are returned are incorrect, or the object is being instantiated in an incorrect location, so you think it's not working. If you want a more concrete answer, you'll have to comment on this answer with the specific problem you are facing (i.e. what specifically is "wrong"?).
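For example, if the row is identified by its ordinary value (an assumption on my part; the question does not say which column is the key), a minimal parameterized sketch in the same Mono.Data.Sqlite style might look like this:
// Fetch a single model row by its "ordinary" value (assumed to be the key).
IDbCommand dbcmd = dbconn.CreateCommand();
dbcmd.CommandText = "SELECT foldername, filename FROM modeladdress WHERE ordinary = @id";
IDbDataParameter idParam = dbcmd.CreateParameter();
idParam.ParameterName = "@id";
idParam.Value = 1; // hypothetical model id
dbcmd.Parameters.Add(idParam);
using (IDataReader reader = dbcmd.ExecuteReader())
{
    if (reader.Read())
    {
        string path = reader.GetString(0) + "/" + reader.GetString(1);
        GameObject instance = Instantiate(Resources.Load(path, typeof(GameObject))) as GameObject;
        instance.SetActive(true);
    }
}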

SSIS write DT_NTEXT into an UTF-8 csv file

I need to write the result of an SQL query into a CSV file encoded as UTF-8 (I need this encoding because there are French letters). One of the columns is too large (more than 20,000 characters), so I can't use DT_WSTR for it. The input type is DT_TEXT, so I use a Data Conversion to change it to DT_NTEXT. But when I want to write it to the file, I get this error message:
Error 2 Validation error. The data type for "input column" is
DT_NTEXT, which is not supported with ANSI files. Use DT_TEXT instead
and convert the data to DT_NTEXT using the data conversion component
Is there a way I can write the data to my file?
Thank you
I have had this kind of issue sometimes as well. When working with data larger than 255 characters, SSIS sees it as BLOB data and will always handle it as such.
I then converted this BLOB stream data to readable text with a script component. Then other transformations should be possible.
This was the case in the SSIS that came with SQL Server 2008, but I believe this hasn't changed yet.
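The answer does not show the conversion itself, so here is an illustrative sketch only (the column name Row.LargeText and the target path are hypothetical): inside a script component, the DT_NTEXT bytes can be pulled out of the BlobColumn and decoded as UTF-16 before being written wherever they are needed.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Row.LargeText is a hypothetical DT_NTEXT input column; DT_NTEXT is stored as UTF-16.
    byte[] raw = Row.LargeText.GetBlobData(0, (int)Row.LargeText.Length);
    string text = System.Text.Encoding.Unicode.GetString(raw);

    // From here the text can be transformed or, for example, appended to a UTF-8 file.
    System.IO.File.AppendAllText(@"C:\temp\output.csv", text + System.Environment.NewLine,
        System.Text.Encoding.UTF8); // hypothetical target path
}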
I ended up doing just as Samyne says: I used a script.
First I modified my SQL stored procedure; instead of having several columns, I put all the info in one single column, as follows:
Select Column1 + '^' + Column2 + '^' + Column3 ...
Then I used this code in a script:
string fileName = Dts.Variables["SLTemplateFilePath"].Value.ToString();
using (var stream = new FileStream(fileName, FileMode.Truncate))
{
    using (var sw = new StreamWriter(stream, Encoding.UTF8))
    {
        OleDbDataAdapter oleDA = new OleDbDataAdapter();
        DataTable dt = new DataTable();
        oleDA.Fill(dt, Dts.Variables["FileData"].Value);
        foreach (DataRow row in dt.Rows)
        {
            foreach (DataColumn column in dt.Columns)
            {
                sw.WriteLine(row[column]);
            }
        }
        sw.WriteLine();
    }
}
Putting all the info in one column is optional; I just wanted to avoid handling it in the script. This way, if my stored procedure is changed, I don't need to modify the SSIS package.

dynamic SQL execution and saving the result in flat file in SSIS

I want to create an SSIS package that writes a file with data generated by executing a SQL statement. This generic package will be invoked by other packages, passing in the correct SQL as a variable.
Thus, in the generic package:
I want to execute a dynamic SELECT query, fetch a dynamic number of columns from a single database instance (the connection string does not change per call), and store the result in a flat file.
What would be an ideal way to accomplish this in SSIS?
What I tried:
The simplest solution I could find was writing a script task that opens a SQL connection, executes the SQL using SqlCommand, populates a DataTable with the fetched data, writes the contents directly to the file system using System.IO.File, and releases the connection.
I tried using an OLE DB source with the SQL supplied by a variable (with validation set to false) and directing the rows into a flat file connection. However, due to the dynamic number and names of the columns, I ran into errors.
Is there a more standard way of achieving this without using a script task?
How about this ... concatenate all field values into one field, and map AllFields to a field in a text file destination.
SELECT [f1]+',' + [f2] AS AllFields FROM [dbo].[A]
All of the "other"packages will know how to create the correct SQL. Their only contract with the "generic" package would be to eventually have only one field nameed "AllFields".
To answer your question directly, I do not think there is a "standard" way to do this. I believe the solution from Anoop would work well, and while I have not tested the idea, I wish I had investigated it before writing my own solution. You should not need a script task in that solution.
In any case, I did write my own way to generate CSV files from SQL tables. It may run up against edge cases and need polishing, but it works rather well right now. I am looping through multiple tables before this task, so the CurrentTable variable can be replaced with any variable you want.
Here is my code:
public void Main()
{
    string datetime = DateTime.Now.ToString("yyyyMMddHHmmss");
    try
    {
        string TableName = Dts.Variables["User::CurrentTable"].Value.ToString();
        string FileDelimiter = ",";
        string TextQualifier = "\"";
        string FileExtension = ".csv";

        // Use the ADO.NET connection from the SSIS package to get data from the table
        SqlConnection myADONETConnection = new SqlConnection();
        myADONETConnection = (SqlConnection)(Dts.Connections["connection manager name"].AcquireConnection(Dts.Transaction) as SqlConnection);

        // Read data from the table or view into a data table
        string query = "Select * From [" + TableName + "]";
        SqlCommand cmd = new SqlCommand(query, myADONETConnection);
        //myADONETConnection.Open();
        DataTable d_table = new DataTable();
        d_table.Load(cmd.ExecuteReader());
        //myADONETConnection.Close();

        string FileFullPath = Dts.Variables["$Project::ExcelToCsvFolder"].Value.ToString() + "\\Output\\" + TableName + FileExtension;
        StreamWriter sw = null;
        sw = new StreamWriter(FileFullPath, false);

        // Write the header row to the file
        int ColumnCount = d_table.Columns.Count;
        for (int ic = 0; ic < ColumnCount; ic++)
        {
            sw.Write(TextQualifier + d_table.Columns[ic] + TextQualifier);
            if (ic < ColumnCount - 1)
            {
                sw.Write(FileDelimiter);
            }
        }
        sw.Write(sw.NewLine);

        // Write all rows to the file
        foreach (DataRow dr in d_table.Rows)
        {
            for (int ir = 0; ir < ColumnCount; ir++)
            {
                if (!Convert.IsDBNull(dr[ir]))
                {
                    sw.Write(TextQualifier + dr[ir].ToString() + TextQualifier);
                }
                if (ir < ColumnCount - 1)
                {
                    sw.Write(FileDelimiter);
                }
            }
            sw.Write(sw.NewLine);
        }
        sw.Close();
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception exception)
    {
        // Create a log file for errors
        //using (StreamWriter sw = File.CreateText(Dts.Variables["User::LogFolder"].Value.ToString() + "\\" +
        //    "ErrorLog_" + datetime + ".log"))
        //{
        //    sw.WriteLine(exception.ToString());
        //}
        Dts.TaskResult = (int)ScriptResults.Failure;
        throw;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}

How to fetch a row from one table and insert it into another table and get new PK value

I have two similar tables on different databases.
Database1/TableA
Database2/TableA
I want to fetch a row from one table and insert it into the other table on the other server, like:
Database1/TableA
Id State Name
500 OH John [Fetch this row]
Database2/TableA
Id State Name
1 OH John [Insert and fetch PK '1']
I tried this using SqlBulkCopy and it works fine.
But the problem is that I need to get the PK from the new insert, as I need to populate another child table.
Is there any better way to achieve this? Please use C# code; no database linking or standalone SQL scripts, just C# solutions (a query used inside C# code is fine). Any working example code with a DataSet or DataRow would be a great help.
Thanks!
First you need to get the row(s) from Database1.TableA. You could, for example, use a SqlDataAdapter with a DataTable, or a SqlDataReader.
SCOPE_IDENTITY returns the last identity value inserted into an identity column in the same scope. A scope is a module: a stored procedure, trigger, function, or batch. Therefore, two statements are in the same scope if they are in the same stored procedure, function, or batch.
You can use SqlCommand.ExecuteScalar to execute the insert command and retrieve the new ID in one query.
const String sqlSelect = "SELECT COL1,COl2,Col3 FROM TableA WHERE COL1=@COL1;";
const String sqlInsert = "INSERT INTO TableA (COl2,Col3) VALUES (@Col2,@Col3);"
    + "SELECT CAST(scope_identity() AS int)";
using (var con1 = new SqlConnection(db1ConnectionString))
using (var con2 = new SqlConnection(db2ConnectionString))
{
    con1.Open();
    con2.Open();
    using (var selectCommand = new SqlCommand(sqlSelect, con1))
    {
        selectCommand.Parameters.AddWithValue("@COL1", 4711);
        using (var reader = selectCommand.ExecuteReader())
        {
            if (reader.Read())
            {
                int newID;
                using (var insertCommand = new SqlCommand(sqlInsert, con2))
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        insertCommand.Parameters.AddWithValue("@" + reader.GetName(i), reader[i]);
                    }
                    newID = (int)insertCommand.ExecuteScalar();
                }
            }
        }
    }
}