How to read Polygon data from a MySQL table

I am developing an application that needs location data to be stored in a MySQL table. In addition to point locations, I need regions (polygons) as well.
I am currently writing the polygon coordinates as follows:
oMySQLConnecion = new MySqlConnection(DatabaseConnectionString);
if (oMySQLConnecion.State == System.Data.ConnectionState.Closed || oMySQLConnecion.State == System.Data.ConnectionState.Broken)
{
oMySQLConnecion.Open();
}
if (oMySQLConnecion.State == System.Data.ConnectionState.Open)
{
string Query = @"INSERT INTO region (REGION_POLYGON) VALUES (PolygonFromText(@Parameter1))";
MySqlCommand oCommand = new MySqlCommand(Query, oMySQLConnecion);
oCommand.Parameters.AddWithValue("@Parameter1", PolygonString);
int sqlSuccess = oCommand.ExecuteNonQuery();
oMySQLConnecion.Close();
oDBStatus.Type = DBDataStatusType.SUCCESS;
oDBStatus.Message = DBMessageType.SUCCESSFULLY_DATA_UPDATED;
return oDBStatus;
}
After the execution, I see the Blob in the MySQL table.
Now I want to read the data back for testing, but it does not work the way I tried below:
if (oMySQLConnecion.State == System.Data.ConnectionState.Open)
{
string Query = @"SELECT REGION_ID,REGION_NICK_NAME,GeomFromText(REGION_POLYGON) AS POLYGON FROM region WHERE REGION_USER_ID = @Parameter1";
MySqlCommand oCommand = new MySqlCommand(Query, oMySQLConnecion);
oCommand.Parameters.AddWithValue("@Parameter1", UserID);
using (var reader = oCommand.ExecuteReader())
{
while (reader.Read())
{
R_PolygonCordinates oRec = new R_PolygonCordinates();
oRec.RegionNumber = Convert.ToInt32(reader["REGION_ID"]);
oRec.RegionNickName = reader["REGION_NICK_NAME"].ToString();
oRec.PolygonCodinates = reader["POLYGON"].ToString();
polygons.Add(oRec);
}
}
int sqlSuccess = oCommand.ExecuteNonQuery();
oMySQLConnecion.Close();
return polygons;
}
It returns an empty string.
I am not sure if I am really writing the data, since I cannot read the Blob back.
Is my reading syntax incorrect?
Note: I am using Visual Studio 2017 and the latest MySQL version with the spatial classes.
Any help is highly appreciated.
Thanks

GeomFromText() takes a WKT (the standardized "well-known text" format) value as input and returns the MySQL internal geometry type as output.
This is the inverse of what you need, which is ST_AsWKT() or ST_AsText() -- take an internal-format geometry object as input and return WKT as output.
Prior to 5.6, these functions were called AsWKT() and AsText(). In 5.7 they are all synonyms for exactly the same function, but the non-ST_* names are deprecated and will be removed in the future.
https://dev.mysql.com/doc/refman/5.7/en/gis-format-conversion-functions.html#function_st-astext
I don't know for certain what the ST_ prefix means, but I assume it's "spatial type." There's some discussion in WL#8055 that may be of interest.
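Applied to the read query in the question, the fix looks roughly like this -- a sketch reusing the question's connection, table and column names; on servers older than 5.6 use AsText() instead of ST_AsText():

string Query = @"SELECT REGION_ID, REGION_NICK_NAME, ST_AsText(REGION_POLYGON) AS POLYGON
                 FROM region WHERE REGION_USER_ID = @Parameter1";
MySqlCommand oCommand = new MySqlCommand(Query, oMySQLConnecion);
oCommand.Parameters.AddWithValue("@Parameter1", UserID);
using (var reader = oCommand.ExecuteReader())
{
    while (reader.Read())
    {
        // POLYGON now comes back as WKT text, e.g. "POLYGON((30 10,40 40,20 40,10 20,30 10))"
        string wkt = reader["POLYGON"].ToString();
    }
}
// Note: the extra ExecuteNonQuery() call after the reader in the question is not needed for a SELECT.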

Related

Load 3D model in Unity using Resource folder and Mysql

I want to load a 3D model using the Resources folder. I created an SQL database to store the address. In this case I stored the file "deer-3ds" in the folder "Models" and also saved this information in a table named "modeladdress" in SQL.
So please help me to correct my code. I know that it's 100% wrong but I don't know how to fix it. Thank you.
using UnityEngine;
using System.Collections;
using System;
using System.Data;
using Mono.Data.Sqlite;
public class addobject : MonoBehaviour {
// Use this for initialization
void Start () {
//GameObject deer=Instantiate(Resources.Load("deer-3d.bak",typeof(GameObject)))as GameObject;
// GameObject instance = Instantiate(Resources.Load("Models/deer-3ds", typeof(GameObject))) as GameObject;
string conn = "URI=file:" + Application.dataPath + "/modeladdress.s3db"; //Path to database.
IDbConnection dbconn;
dbconn = (IDbConnection) new SqliteConnection(conn);
dbconn.Open(); //Open connection to the database.
IDbCommand dbcmd = dbconn.CreateCommand();
string sqlQuery = "SELECT ordinary,foldername, filename " + "FROM modeladdress";
dbcmd.CommandText = sqlQuery;
IDataReader reader = dbcmd.ExecuteReader();
while (reader.Read ()) {
int ordinary = reader.GetInt32 (0);
string foldername = reader.GetString (1);
string filename = reader.GetString (2);
string path = foldername + "/" + filename;
//Debug.Log( "value= "+value+" name ="+name+" random ="+ rand);
GameObject instance = Instantiate(Resources.Load(path, typeof(GameObject))) as GameObject;
instance.SetActive (true);
}
reader.Close();
reader = null;
dbcmd.Dispose();
dbcmd = null;
dbconn.Close();
dbconn = null;
}
// Update is called once per frame
void Update () {
// GameObject instance = Instantiate(Resources.Load("Models/deer-3ds", typeof(GameObject))) as GameObject;
// instance.SetActive (true);
}
}
First of all, you are using SQLite as your database management system, not MySQL. Second, the way you have written your query,
string sqlQuery = "SELECT ordinary,foldername, filename " + "FROM modeladdress";
will return the ordinary, foldername, and filename for every model. You need to use a WHERE clause to specify precisely which model you want. That means you need some way to know which model to query before you actually execute the query -- and in that case, why even query a database? You're going to have to store some unique identifier anyway, so a database solves nothing.
Now, concerning the actual code you have written: it appears to be correct (i.e. it should return what you want). The problem must be that your table is empty, that the values returned are incorrect, or that the object is being instantiated in an unexpected location, making it look like nothing happened. If you want a more concrete answer, you'll have to comment on this answer with the specific problem you are facing (i.e. what specifically is "wrong"?).
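If you do keep the database, the WHERE clause would look roughly like this -- a sketch built on the question's Start() code, so dbconn and the Unity calls are assumed to be in scope; filtering on filename, and the "deer-3ds" value, are just stand-ins for whatever unique identifier you already know up front:

IDbCommand dbcmd = dbconn.CreateCommand();
dbcmd.CommandText = "SELECT ordinary, foldername, filename FROM modeladdress WHERE filename = @filename";
IDbDataParameter p = dbcmd.CreateParameter();   // provider-agnostic way to add a parameter
p.ParameterName = "@filename";
p.Value = "deer-3ds";                           // example identifier, not taken from the original post
dbcmd.Parameters.Add(p);
using (IDataReader reader = dbcmd.ExecuteReader())
{
    while (reader.Read())
    {
        string path = reader.GetString(1) + "/" + reader.GetString(2);   // foldername/filename
        GameObject instance = Instantiate(Resources.Load(path, typeof(GameObject))) as GameObject;
        instance.SetActive(true);
    }
}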

SqlBulkCopy does not store accented characters properly

I am inserting French-language text into an nvarchar column in SQL Server 2008. The French accented characters are not stored properly in the SQL DB.
string strData = "Accented chars- Les caractères accentués français ";
DataTable dtTemp = new DataTable();
dtTemp.Columns.Add("ID", typeof(string));
dtTemp.Columns.Add("Value", typeof(string));
DataRow dr = dtTemp.NewRow();
dr["ID"] = "100";
dr["Value"] = strData;
dtTemp.Rows.Add(dr);
strSQLCon = GetSQLConnectionString();
using (SqlConnection cn = new SqlConnection(strSQLCon))
{
cn.Open();
using (SqlBulkCopy copy = new SqlBulkCopy(cn))
{
copy.ColumnMappings.Add("ID", "ID");
copy.ColumnMappings.Add("Value", "Value");
copy.DestinationTableName = "MYTABLE";
copy.WriteToServer(dtTemp);
}
}
The French characters are not stored properly in the SQL Server database.
It works fine when I do a normal insert query: insert into MYTABLE values(1, 'Accented chars- Les caractères accentués français').
Please let me know why it does not work with the SqlBulkCopy class. Do any settings need to be changed, or does the C# code need to be modified, to store the non-English characters properly?
I designed this table with the collation for every column set to French_CI_AS (French culture, accent sensitive), and with every SQL string type represented.
I am building a typed dataset for this table (not the purpose of this question).
Sql bulk copy:
var ds = new FrenchCharacters.FrenchDataSet();
using (var destinationConnection = new SqlConnection(StackOverflow.Properties.Settings.Default.StackOverflowConnectionString))
{
destinationConnection.Open();
//all French characters http://en.wikipedia.org/wiki/French_orthography
string[] sArray = new string[] {
"Àà, Ââ, Ææ, Ää"
, "Çç"
, "Îî, Ïï"
, "Ôô, Œœ, Öö"
, "Ùù, Ûû, Üü"
, "Ÿÿ"
};
// open the connection
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection.ConnectionString))
{
bulkCopy.BatchSize = 500;
bulkCopy.NotifyAfter = 10000;
bulkCopy.DestinationTableName = "French";
//
// build data table to be written to the server
// data table is now strongly-typed ds.French
//
for (int i = 0; i < 100; i++)
{
foreach (string s in sArray)
ds.French.AddFrenchRow(s, s, s, s, s, s);
}
//
bulkCopy.WriteToServer(ds.French);
}
}
Result: notice there are no invalid entries, whatever the SQL char type.
I tested your code on the following table and it worked fine, at least on SQL Server 2012.
CREATE TABLE [dbo].[MYTABLE](
[ID] [varchar](20) NOT NULL,
[Value] [nvarchar](255) NOT NULL)
Thanks for your replies. The issue occurs while reading the CSV file into the data table before the bulk insert. Once I included the encoding parameter when reading the CSV file (Encoding.Default), the French text loads properly and is stored in the SQL DB without any issues.
old code:
List<string> lstData = File.ReadAllLines(stFile).ToList();
Working code:
List<string> lstData = File.ReadAllLines(stFile, Encoding.Default).ToList();
thanks
Ashok

Excluding Content From SQL Bulk Insert

I want to import my IIS logs into SQL for reporting using BULK INSERT, but the comment lines - the ones that start with a # - cause a problem because those lines do not have the same number of fields as the data lines.
If I manually delete the comments, I can perform a bulk insert.
Is there a way to perform a bulk insert while excluding lines based on a match, such as: any line that begins with a "#"?
Thanks.
The approach I generally use with BULK INSERT and irregular data is to push the incoming data into a temporary staging table with a single VARCHAR(MAX) column.
Once it's in there, I can use more flexible decision-making tools like SQL queries and string functions to decide which rows I want to select out of the staging table and bring into my main tables. This is also helpful because BULK INSERT can be maddeningly cryptic about why and how it fails on a specific file.
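A rough sketch of that staging-table flow driven from C# follows; the connection string, file path and table name here are made up for illustration, and BULK INSERT needs the file to be readable from the SQL Server machine itself:

// requires: using System.Data.SqlClient;
using (var cn = new SqlConnection(connectionString))   // connectionString is assumed to exist
{
    cn.Open();
    var sql = @"
        CREATE TABLE #staging (RawLine VARCHAR(MAX));

        -- '\0' as the field terminator keeps each whole log line in the single column
        BULK INSERT #staging FROM 'C:\logs\u_ex140101.log'
            WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\0');

        -- keep only the rows that are not comment lines
        SELECT RawLine FROM #staging WHERE RawLine NOT LIKE '#%';";

    using (var cmd = new SqlCommand(sql, cn))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            string line = reader.GetString(0);
            // split the space-delimited IIS fields here and insert them into the real table
        }
    }
}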
The only other option I can think of is using pre-upload scripting to trim comments and other lines that don't fit your tabular criteria before you do your bulk insert.
I recommend using logparser.exe instead. LogParser has some pretty neat capabilities on its own, but it can also be used to format the IIS log to be properly imported by SQL Server.
Microsoft has a tool called "PrepWebLog" (http://support.microsoft.com/kb/296093) which strips out these hash/pound characters; however, I'm running it now (using a PowerShell script over multiple files) and am finding its performance intolerably slow.
I think it'd be faster if I wrote a C# program (or maybe even a macro).
Update: PrepWebLog just crashed on me. I'd avoid it.
Update #2: I looked at PowerShell's Get-Content and Set-Content commands but didn't like the syntax and possible performance. So I wrote this little C# console app:
if (args.Length == 2)
{
string path = args[0];
string outPath = args[1];
Regex hashString = new Regex("^#.+\r\n", RegexOptions.Multiline | RegexOptions.Compiled);
foreach (string file in Directory.GetFiles(path, "*.log"))
{
string data;
using (StreamReader sr = new StreamReader(file))
{
data = sr.ReadToEnd();
}
string output = hashString.Replace(data, string.Empty);
using (StreamWriter sw = new StreamWriter(Path.Combine(outPath, new FileInfo(file).Name), false))
{
sw.Write(output);
}
}
}
else
{
Console.WriteLine("Source and Destination Log Path required or too many arguments");
}
It's pretty quick.
Following up on what PeterX wrote, I modified the application to handle large log files, since anything sufficiently large would create an out-of-memory exception. Also, since we're only interested in whether or not the first character of a line is a hash, we can just use the StartsWith() method on each line as it is read.
class Program
{
static void Main(string[] args)
{
if (args.Length == 2)
{
string path = args[0];
string outPath = args[1];
string line;
foreach (string file in Directory.GetFiles(path, "*.log"))
{
using (StreamReader sr = new StreamReader(file))
{
using (StreamWriter sw = new StreamWriter(Path.Combine(outPath, new FileInfo(file).Name), false))
{
while ((line = sr.ReadLine()) != null)
{
if(!line.StartsWith("#"))
{
sw.WriteLine(line);
}
}
}
}
}
}
else
{
Console.WriteLine("Source and Destination Log Path required or too many arguments");
}
}
}

Dapper And System.Data.OleDb DbType.Date throwing 'OleDbException : Data type mismatch in criteria expression'

Not sure if I should raise an issue regarding this, so I thought I would ask first if anybody knows a simple workaround. I am getting an error when I try to use Dapper with an OleDbConnection against MS Access 2003 (Jet 4.0) (not my choice of database!).
When running the test code below I get an exception 'OleDbException : Data type mismatch in criteria expression'
var count = 0;
using (var conn = new OleDbConnection(connString)) {
conn.Open();
var qry = conn.Query<TestTable>("select * from testtable where CreatedOn <= @CreatedOn;", new { CreatedOn = DateTime.Now });
count = qry.Count();
}
I believe, from past experience with OleDb dates, that when the DbType is set to Date, the OleDbParameter internally changes its OleDbType to DBTimeStamp instead of OleDbType.Date. I understand this is not because of Dapper, but it is what 'could' be considered a strange way of mapping things internally in the OleDbParameter class.
When dealing with this using other ORMs, raw ADO, or my own factory objects, I would clean up the command object just prior to running the command and change the OleDbType to Date.
This is not possible with Dapper as far as I can see, since the command object appears to be internal. Unfortunately I have not had time to learn the dynamic generation code, so I could be missing something simple, or I might suggest a fix and contribute rather than simply raise an issue.
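For reference, the manual cleanup described above looks roughly like this with raw ADO.NET rather than Dapper -- a sketch only; note that the Jet/ACE OLE DB provider treats parameters positionally, so the placeholder is a '?' and the parameter name is cosmetic:

using (var conn = new OleDbConnection(connString))
using (var cmd = new OleDbCommand("select * from testtable where CreatedOn <= ?;", conn))
{
    // force the parameter to OleDbType.Date instead of the timestamp type it would otherwise infer
    var p = cmd.Parameters.Add("@CreatedOn", OleDbType.Date);
    p.Value = DateTime.Now;

    conn.Open();
    var count = 0;
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read()) count++;
    }
}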
Any thoughts?
Lee
It's an old thread, but I had the same problem: Access doesn't like DateTime values with milliseconds, so you have to add an extension method like this:
public static DateTime Floor(this DateTime date, TimeSpan span)
{
long ticks = date.Ticks / span.Ticks;
return new DateTime(ticks * span.Ticks, date.Kind);
}
And use it when passing parameters:
var qry = conn.Query<TestTable>("select * from testtable where CreatedOn <= @CreatedOn;", new { CreatedOn = DateTime.Now.Floor(TimeSpan.FromSeconds(1)) });
Unfortunately, with the current Dapper version (1.42), we cannot add a custom TypeHandler for base types (see #206).
If you can modify Dapper (use the .cs file and not the DLL), merge this pull request, and then you do not have to use Floor() on each parameter:
public class DateTimeTypeHandler : SqlMapper.TypeHandler<DateTime>
{
public override DateTime Parse(object value)
{
if (value == null || value is DBNull)
{
return default(DateTime);
}
return (DateTime)value;
}
public override void SetValue(IDbDataParameter parameter, DateTime value)
{
parameter.DbType = DbType.DateTime;
parameter.Value = value.Floor(TimeSpan.FromSeconds(1));
}
}
SqlMapper.AddTypeHandler<DateTime>(new DateTimeTypeHandler());
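Assuming the pull request mentioned above is merged in, the original query can then be used unchanged, with no Floor() needed at the call site:

var qry = conn.Query<TestTable>("select * from testtable where CreatedOn <= @CreatedOn;",
                                new { CreatedOn = DateTime.Now });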

LINQ variable to list of string without using column names?

In a C# ASP.NET MVC project, I'm trying to make a List<string> from a LINQ variable.
Now this might be a pretty basic thing, but I just cannot get it to work without using the actual column names for the data in that variable. The thing is that, in the interest of making the program as dynamic as possible, I'm leaving it up to a stored procedure to get the data out. There can be any number of columns, named any which way, depending on where the data is fetched from. All I care about is getting all of their values into a List<string>, so that I can compare user-input values against them in the program.
Pointing to the columns by their names in the code means I'd have to write dozens of overloaded methods that all basically do the same thing. Below is non-functioning pseudo-code, but it should give an idea of what I mean.
// call for stored procedure
var courses = db.spFetchCourseInformation().ToList();
// if the data fails a check on a single row, it will not pass the check
bool passed = true;
foreach (var i in courses)
{
// each row should be cast into a list of string, which can then be validated
// on a row-by-row basis
List<string> courseRow = new List<string>();
courseRow = courses[i]; // yes, obviously this is wrong syntax
int matches = 0;
foreach (string k in courseRow)
{
if (validator.checkMatch(courseRow[k].ToString()))
{
matches++;
}
}
if (matches == 0)
{
passed = false;
break;
}
}
Below is an example of how I currently have to do it, because I need to use the column names:
for (int i = 0; i < courses.Count; i++)
{
int matches = 0;
if (validator.checkMatch(courses[i].Name))
matches++;
if (validator.checkMatch(courses[i].RandomOtherColumn))
matches++;
if (validator.checkMatch(courses[i].RandomThirdColumn))
matches++;
if (validator.checkMatch(courses[i].RandomFourthColumn))
matches++;
/* etc...
* etc...
* you get the point
* and one of these for each and every possible variation from the stored procedure, NOT good practice
* */
Thanks for the help!
I'm not 100% sure what problem you are trying to solve (matching user data to a particular record in the DB?), but I'm pretty sure you're going about this in slightly the wrong fashion by putting the data in a List.
It should be possible to get your user input into an IDictionary, with the key being the column name and the object being the input data field.
Then when you get the data from the SP, you can get the data back in a DataReader (a la http://msmvps.com/blogs/deborahk/archive/2009/07/09/dal-access-a-datareader-using-a-stored-procedure.aspx).
DataReaders are indexed by column name, so if you run through the keys of the input IDictionary, you can check the DataReader to see if it has matching data.
using (SqlDataReader reader = Dac.ExecuteDataReader("CustomerRetrieveAll", null))
{
while (reader.Read())
{
foreach (var key in userInputDictionary.Keys)
{
var data = reader[key];
if (data != userInputDictionary[key]) continue;
}
}
}
Still not sure about the problem you are solving, but I hope this helps!
A little creative reflection should do the trick.
var courses = db.spFetchCourseInformation();
var values = courses.SelectMany(c => c.GetType().GetProperties() // gets the properties for your object
.Select(property => property.GetValue(c, null))); // gets the value of each property
List<string> stringValues = new List<string>(
values.Select(v => v == null ? string.Empty : v.ToString()) // some of those values will likely be null
.Distinct()); // remove duplicates
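To keep the question's row-by-row requirement (every row must contain at least one value the validator accepts), the same reflection idea can also be applied per row -- a sketch assuming the courses, validator and checkMatch names from the question:

// true only if every row has at least one property value that checkMatch accepts
bool passed = courses.All(c =>
    c.GetType().GetProperties()                        // the row's "columns"
     .Select(property => property.GetValue(c, null))   // their values
     .Any(v => v != null && validator.checkMatch(v.ToString())));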