Not able to take backup of SQL Server Compact database

I am trying to take a backup of a SQL Server Compact database. I used this code, but it is not working:
var srv = new Server(@".\SQLEXPRESS");
SaveFileDialog SD = new SaveFileDialog();
SD.ShowDialog();
Backup BkpDBase = new Backup();
this.Cursor = Cursors.WaitCursor;
//this.dataGridView1.DataSource = string.Empty;
try
{
    string fileName = SD.FileName;
    BkpDBase.Action = BackupActionType.Database;
    BkpDBase.Database = "TapeDatabase.sdf";
    BackupDeviceItem bkpDevice = new BackupDeviceItem(fileName, DeviceType.File);
    BkpDBase.Devices.Add(bkpDevice);
    BkpDBase.SqlBackup(srv);
}
catch (Exception ex)
{
    MessageBox.Show(ex.ToString());
}

With SQL Server Compact there is no server-side backup: the database is a single .sdf file, so just use File.Copy.
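A minimal sketch of that approach, reusing the SaveFileDialog and the TapeDatabase.sdf name from the question (close any open connections first so the file is not locked):
SaveFileDialog sd = new SaveFileDialog { Filter = "SQL CE database|*.sdf" };
if (sd.ShowDialog() == DialogResult.OK)
{
    // The whole database lives in this single file, so copying it is the backup.
    File.Copy("TapeDatabase.sdf", sd.FileName, true);
}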

Related

How to call a stored procedure in Entity Framework Core with input and output parameters using MySQL

I am using ASP.NET Core 2.2 with Entity Framework Core 2.2.6 and Pomelo.EntityFrameworkCore.MySql 2.2.0 for connectivity with MySQL. I have a stored procedure which takes 3 input parameters and 1 output parameter. I am able to call it in MySQL Workbench like:
CALL GetTechniciansByTrade('Automobile', 1, 10, @total);
SELECT @total;
Now I want to call this using Entity Framework Core; the code I am currently using is:
var outputParameter = new MySqlParameter("@PageCount", MySqlDbType.Int32);
outputParameter.Direction = System.Data.ParameterDirection.Output;
var results = await _context.GetTechnicians.FromSql(
    "CALL GetTechniciansByTrade(@MyTrade, @PageIndex, @PageSize, @PageCount OUT)",
    new MySqlParameter("@MyTrade", Trade),
    new MySqlParameter("@PageIndex", PageIndex),
    new MySqlParameter("@PageSize", PageSize),
    outputParameter).ToListAsync();
int PageCount = (int)outputParameter.Value;
The exception I am currently getting is:
Only ParameterDirection.Input is supported when CommandType is Text (parameter name: @PageCount)
Can you try the things below?
Use EXEC instead of CALL:
var results = await _context.GetTechnicians.FromSql(
    "EXEC GetTechniciansByTrade(@MyTrade, @PageIndex, @PageSize, @PageCount OUTPUT)",
    new MySqlParameter("@MyTrade", Trade),
    new MySqlParameter("@PageIndex", PageIndex),
    new MySqlParameter("@PageSize", PageSize),
    outputParameter).ToListAsync();
Or SELECT the PageCount inside the stored procedure instead of using an output parameter.
I got this information from this GitHub issue.
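As a hedged sketch of that second suggestion: if the procedure returns the count as a final result set instead of filling the OUT parameter, the value can be read with plain ADO.NET via NextResult(), with a command set up like the one in the solution below:
using (var reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        // read the technician rows here
    }
    // Advance to the procedure's final SELECT, which carries the count.
    if (reader.NextResult() && reader.Read())
    {
        int pageCount = reader.GetInt32(0);
    }
}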
I found the solution using @matt-g's suggestion, based on this question. I had to use ADO.NET for this:
var technicians = new List<TechnicianModel>();
using (MySqlConnection lconn = new MySqlConnection(_context.Database.GetDbConnection().ConnectionString))
{
    lconn.Open();
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = lconn;
        cmd.CommandText = "GetTechniciansByTrade"; // the name of the stored procedure
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@Trade", Trade);
        cmd.Parameters.AddWithValue("@PageIndex", PageIndex);
        cmd.Parameters.AddWithValue("@PageSize", PageSize);
        cmd.Parameters.Add("@PageCount", MySqlDbType.Int32); // Add, not AddWithValue: the type must not become the value
        cmd.Parameters["@PageCount"].Direction = ParameterDirection.Output; // from System.Data
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                technicians.Add(new TechnicianModel()
                {
                    City = reader["City"].ToString(),
                    ExperienceYears = reader["ExperienceYears"] != DBNull.Value ? Convert.ToInt32(reader["ExperienceYears"]) : 0,
                    Id = Guid.Parse(reader["Id"].ToString()),
                    Name = reader["Name"].ToString(),
                    Qualification = reader["Qualification"].ToString(),
                    Town = reader["Town"].ToString()
                });
            }
        }
        object obj = cmd.Parameters["@PageCount"].Value;
        var lParam = (int)obj; // more useful datatype
    }
}

Parsing issue with JSON data from SQL Server 2017 to MongoDB

I am working on a C# utility to migrate data from SQL Server 2017 to MongoDB. These are the steps I am following:
1) Get the data from SQL Server in JSON format (FOR JSON AUTO)
2) Parse it into a BSON document
3) Insert it into MongoDB
But I am getting an error while reading the JSON data from SQL. My JSON data is a combination of root attributes as well as nested objects, so it is dynamic data that I want to push as-is to MongoDB:
string jsonData = string.Empty;
foreach (var userId in userIdList)
{
    using (SqlConnection con = new SqlConnection("Data Source=;Initial Catalog=;Integrated Security=True"))
    {
        using (SqlCommand cmd = new SqlCommand("Usp_GetUserdata", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@userId", SqlDbType.Int).Value = userId;
            con.Open();
            var reader = cmd.ExecuteReader();
            var jsonResult = new StringBuilder();
            if (!reader.HasRows)
            {
                jsonResult.Append("[]");
            }
            else
            {
                while (reader.Read())
                {
                    jsonResult.Append(reader.GetValue(0));
                    jsonData = reader.GetValue(0).ToString();
                    File.WriteAllText(@"c:\a.txt", jsonResult.ToString());
                    File.WriteAllText(@"c:\a.txt", jsonData);
                    // Strings are immutable, so the Trim results must be assigned back.
                    jsonData = jsonData.TrimEnd(']').TrimStart('[');
                    // Create a client connection to our MongoDB database.
                    var client = new MongoClient(MongoDBConnectionString);
                    // Create a session object that is used when leveraging transactions.
                    var session = client.StartSession();
                    // Get the collection that holds the employee documents.
                    var employeeCollection = session.Client.GetDatabase("mongodev").GetCollection<BsonDocument>("EmpData");
                    // Begin transaction
                    session.StartTransaction();
                    try
                    {
                        // Deserialize the JSON string straight into a BsonDocument.
                        var document = BsonSerializer.Deserialize<BsonDocument>(jsonData);
                        employeeCollection.InsertOne(document);
                        //BsonArray pipeline = BsonSerializer.Deserialize<BsonArray>(jsonData);
                        //var documents = pipeline.Select(val => val.AsBsonDocument);
                        //employeeCollection.InsertMany(documents);
                        session.CommitTransaction();
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e);
                        session.AbortTransaction();
                        throw;
                    }
                }
            }
        }
    }
}
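As a hedged aside on step 1: FOR JSON AUTO streams large results split across several rows, so parsing each row on its own can fail on a fragment. A minimal sketch, reusing reader and employeeCollection from the code above (System.Linq assumed), that accumulates every row first and inserts the whole array in one go:
var sb = new StringBuilder();
while (reader.Read())
{
    sb.Append(reader.GetValue(0)); // accumulate all JSON fragments first
}
string json = sb.Length > 0 ? sb.ToString() : "[]";
// FOR JSON AUTO yields a JSON array, so deserialize it as a BsonArray
// and insert each element as its own document.
var array = BsonSerializer.Deserialize<BsonArray>(json);
var documents = array.Select(v => v.AsBsonDocument).ToList();
if (documents.Count > 0)
{
    employeeCollection.InsertMany(documents);
}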

How to add shapefile data to PostGIS (Postgres) using C#

I am trying to add shapefile data to PostGIS using C#:
string path = browse_path.Text;
ProcessStartInfo startInfo = new ProcessStartInfo("CMD.exe");
Process p = new Process();
startInfo.RedirectStandardInput = true;
startInfo.UseShellExecute = false;
startInfo.RedirectStandardOutput = true;
startInfo.RedirectStandardError = true;
p = Process.Start(startInfo);
string chgdir = @"chdir " + @"C:\Program Files\PostgreSQL\9.4\bin\";
p.StandardInput.WriteLine(chgdir);
string pass = @"set PGPASSWORD=postgres";
p.StandardInput.WriteLine(pass);
string cmd = @"shp2pgsql -I -s 4326 " + path + " public.states | psql -U postgres -d postgres";
p.StandardInput.WriteLine(cmd);
p.WaitForExit();
p.Close();
Even after waiting almost 7-8 minutes it is not working, and my .shp file is only 160 KB. The command works fine if I run it in cmd directly rather than from code.
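One likely cause, offered as a hedged aside: the snippet redirects standard output and error but never reads them, and cmd.exe is never told to exit, so WaitForExit can block once the output pipe buffer fills. A sketch of the usual pattern for the tail of the code above:
// Drain the redirected streams asynchronously so the pipe buffers never fill up.
p.OutputDataReceived += (s, e) => { if (e.Data != null) Console.WriteLine(e.Data); };
p.ErrorDataReceived += (s, e) => { if (e.Data != null) Console.Error.WriteLine(e.Data); };
p.BeginOutputReadLine();
p.BeginErrorReadLine();
p.StandardInput.WriteLine(chgdir);
p.StandardInput.WriteLine(pass);
p.StandardInput.WriteLine(cmd);
p.StandardInput.Close(); // closing stdin lets cmd.exe terminate
p.WaitForExit();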
This is a function I wrote to import shapefiles to PG. It uses the NuGet packages CliWrap and Npgsql, and I just copied shp2pgsql and its dependencies to a project folder 'Tools' so it can be run on a machine that doesn't have PostgreSQL installed. It's a bit messy and you might need to add error handling, but it worked for my needs; a sample call follows the function.
public async static Task<bool> OutputSHPtoPSGLAsync(string shpfile, string host, string user, string pwd, string db, string schema = "public", bool dropIfExists = true, string table = "[SHPNAME]")
{
    FileInfo shp = new FileInfo(shpfile);
    if (!shp.Exists) return false;
    if (table == "[SHPNAME]") table = Path.GetFileNameWithoutExtension(shpfile).ToLower();
    string args = string.Format("{0} {1}.{2}", shpfile, schema, table);
    Command cli = Cli.Wrap(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"tools\shp2pgsql.exe")).WithArguments(args);
    ExecutionResult eo = await cli.ExecuteAsync();
    string sql = eo.StandardOutput;
    if (dropIfExists) sql = sql.Replace("CREATE TABLE", string.Format("DROP TABLE IF EXISTS \"{0}\".\"{1}\";\r\nCREATE TABLE", schema, table));
    string constring = string.Format("Host={0};Username={1};Password={2};Database={3}", host, user, pwd, db);
    using (NpgsqlConnection connection = new NpgsqlConnection(constring))
    {
        connection.Open();
        new NpgsqlCommand(sql, connection).ExecuteNonQuery();
    }
    return true;
}
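A hypothetical call, with a made-up path and credentials:
bool ok = await OutputSHPtoPSGLAsync(@"C:\data\states.shp", "localhost", "postgres", "postgres", "gisdb");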
I was looking at NetTopologySuite, which has type definitions compatible with Npgsql and PostGIS, but it's all still pre-release and I couldn't be bothered working it out.

Headers are also inserted into the database while uploading CSV file data

Here the headers are also being inserted into the database. I am uploading a CSV file with comma-separated data:
string Feedback = string.Empty;
string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;
using (MySqlConnection conn = new MySqlConnection(connString))
{
    var copy = new MySqlBulkLoader(conn);
    conn.Open();
    try
    {
        copy.TableName = "BulkImportDetails";
        copy.FileName = fileName;
        copy.FieldTerminator = ",";
        copy.LineTerminator = @"\n";
        copy.Load();
        Feedback = "Upload complete";
    }
    catch (Exception ex)
    {
        Feedback = ex.Message;
    }
    finally { conn.Close(); }
}
return Feedback;
Use the NumberOfLinesToSkip property to skip the first line, like so:
copy.NumberOfLinesToSkip = 1;
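In the code above it slots in alongside the other loader settings:
copy.TableName = "BulkImportDetails";
copy.FileName = fileName;
copy.FieldTerminator = ",";
copy.LineTerminator = @"\n";
copy.NumberOfLinesToSkip = 1; // skip the CSV header row
copy.Load();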
The use of this property is clearly shown in the documentation for MySqlBulkLoader. Make a habit of reading the documentation to resolve your queries before you post a question here.

Unable to load huge files using LOAD DATA LOCAL INFILE into MySQL

I tried to load a huge data file, e.g. a 10 MB file, into the db using LOAD DATA LOCAL INFILE in MySQL from a Windows service. It failed, and the Windows service stopped without any exception. I tried with files of smaller size, such as 2 MB, and it succeeded.
Is there any way I can load huge files into the MySQL db? And what may be the reason for the Windows service stopping?
Here is my code:
public int UpdateDataBase(string query, string filename, Logger swLog)
{
    int status = 0;
    using (MySqlConnection myconnection = new MySqlConnection(connectionCdrBank))
    {
        MySqlTransaction transaction = null;
        try
        {
            // The connection timeout is set via the connection string (Connection Timeout=1000).
            myconnection.Open();
            transaction = myconnection.BeginTransaction();
            MySqlCommand mycommand = new MySqlCommand("ClearTempTable", myconnection);
            mycommand.CommandType = CommandType.StoredProcedure;
            status = mycommand.ExecuteNonQuery();
            MySqlCommand mycommand1 = new MySqlCommand(query, myconnection);
            mycommand1.CommandTimeout = 1000;
            status = mycommand1.ExecuteNonQuery();
            MySqlCommand mycommand2 = new MySqlCommand("InsertToCdrDetailsTempTable", myconnection);
            mycommand2.CommandType = CommandType.StoredProcedure;
            mycommand2.CommandTimeout = 1000;
            status = Convert.ToInt32(mycommand2.ExecuteScalar());
            transaction.Commit();
            myconnection.Close();
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
            if (myconnection.State == ConnectionState.Open && transaction != null)
                transaction.Rollback();
            status = (ex.Message.IndexOf("Duplicate entry") != -1) ? -1000 : 0;
            swLog.WriteErrorToLog("");
            swLog.WriteErrorToLog("Error while saving: " + ex.Message);
            mail.SentMail("Error while saving:", ex.Message);
            swLog.CloseLogger();
        }
    }
    return status;
}
where the query is:
LOAD DATA LOCAL INFILE 'a.txt' INTO TABLE tbl_cdrload FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
I'm loading files of up to 100 MB into MySQL BLOB fields, and that is just my own artificial limit, so 10 MB files should not be a problem.
Did you set max_allowed_packet correctly?
What do you use to load the files: your own code or some tool? Which MySQL connector?
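For reference, a hedged C# sketch for checking the server's limit before loading (connection string is illustrative):
using (var conn = new MySqlConnection("Server=localhost;Uid=root;Pwd=secret;Database=test"))
{
    conn.Open();
    using (var cmd = new MySqlCommand("SHOW VARIABLES LIKE 'max_allowed_packet'", conn))
    using (var reader = cmd.ExecuteReader())
    {
        if (reader.Read())
        {
            // Value is in bytes; raise it in my.ini (e.g. max_allowed_packet=100M) if it is too small.
            Console.WriteLine("max_allowed_packet = " + reader.GetString(1));
        }
    }
}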