How to add shapefile data to PostGIS (Postgres) using C#

I am trying to add shapefile data to PostGIS using C#:
string path = browse_path.Text;
ProcessStartInfo startInfo = new ProcessStartInfo("CMD.exe");
Process p = new Process();
startInfo.RedirectStandardInput = true;
startInfo.UseShellExecute = false;
startInfo.RedirectStandardOutput = true;
startInfo.RedirectStandardError = true;
p = Process.Start(startInfo);
string chgdir = @"chdir " + @"C:\Program Files\PostgreSQL\9.4\bin\";
p.StandardInput.WriteLine(chgdir);
string pass = @"set PGPASSWORD=postgres";
p.StandardInput.WriteLine(pass);
string cmd = @"shp2pgsql -I -s 4326 " + path + " public.states | psql -U postgres -d postgres";
p.StandardInput.WriteLine(cmd);
p.WaitForExit();
p.Close();
Even after waiting almost 7-8 minutes it is not working, and my shapefile is only 160 KB. The command works fine if I run it in cmd directly rather than from the code.
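One thing worth checking in the code above: StandardOutput and StandardError are redirected but never read, and cmd.exe is never told to exit, so the pipe buffers can fill up and WaitForExit blocks. A minimal sketch that drains both streams and lets the shell terminate might look like this (it reuses the question's own path and credentials):
ProcessStartInfo startInfo = new ProcessStartInfo("CMD.exe")
{
    RedirectStandardInput = true,
    RedirectStandardOutput = true,
    RedirectStandardError = true,
    UseShellExecute = false,
    WorkingDirectory = @"C:\Program Files\PostgreSQL\9.4\bin"
};
using (Process p = Process.Start(startInfo))
{
    // drain both output streams asynchronously so the pipe buffers never fill up
    p.OutputDataReceived += (s, e) => { if (e.Data != null) Console.WriteLine(e.Data); };
    p.ErrorDataReceived += (s, e) => { if (e.Data != null) Console.WriteLine(e.Data); };
    p.BeginOutputReadLine();
    p.BeginErrorReadLine();
    p.StandardInput.WriteLine("set PGPASSWORD=postgres");
    p.StandardInput.WriteLine("shp2pgsql -I -s 4326 " + path + " public.states | psql -U postgres -d postgres");
    p.StandardInput.WriteLine("exit"); // let cmd.exe terminate so WaitForExit can return
    p.WaitForExit();
}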

This is a function I wrote to import shapefiles to PG. It uses the NuGet packages CliWrap and Npgsql, and I just copied shp2pgsql and its dependencies to a project folder 'Tools' so it can run on a machine that doesn't have PostgreSQL installed. It's a bit messy and you might need to add error handling, but it worked for my needs.
public async static Task<bool> OutputSHPtoPSGLAsync(string shpfile, string host, string user, string pwd, string db, string schema = "public", bool dropIfExists = true, string table = "[SHPNAME]")
{
    FileInfo shp = new FileInfo(shpfile);
    if (!shp.Exists) return false;
    if (table == "[SHPNAME]") table = Path.GetFileNameWithoutExtension(shpfile).ToLower();
    string args = string.Format("{0} {1}.{2}", shpfile, schema, table);
    Command cli = Cli.Wrap(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"tools\shp2pgsql.exe")).WithArguments(args);
    // ExecuteBufferedAsync (from the CliWrap.Buffered namespace) captures the SQL that shp2pgsql writes to stdout
    BufferedCommandResult eo = await cli.ExecuteBufferedAsync();
    string sql = eo.StandardOutput;
    if (dropIfExists) sql = sql.Replace("CREATE TABLE", string.Format("DROP TABLE IF EXISTS \"{0}\".\"{1}\";\r\nCREATE TABLE", schema, table));
    string constring = string.Format("Host={0};Username={1};Password={2};Database={3}", host, user, pwd, db);
    using (NpgsqlConnection connection = new NpgsqlConnection(constring))
    {
        connection.Open();
        new NpgsqlCommand(sql, connection).ExecuteNonQuery();
    }
    return true;
}
I was looking at NetTopologySuite, which has type definitions compatible with Npgsql and PostGIS, but it's all still pre-release and I couldn't be bothered working it out.
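For reference, a rough sketch of that direction, assuming the Npgsql.NetTopologySuite and NetTopologySuite.IO.ShapeFile packages and an existing public.states table with a geom column (the exact API has shifted between pre-release versions, so treat this as an outline rather than working code):
// enable PostGIS <-> NetTopologySuite geometry mapping for all Npgsql connections
NpgsqlConnection.GlobalTypeMapper.UseNetTopologySuite();

using (var shpReader = new ShapefileDataReader(shpfile, new GeometryFactory()))
using (var connection = new NpgsqlConnection(constring))
{
    connection.Open();
    while (shpReader.Read())
    {
        var geom = shpReader.Geometry; // geometry of the current shapefile record
        using (var cmd = new NpgsqlCommand("INSERT INTO public.states (geom) VALUES (@g)", connection))
        {
            cmd.Parameters.AddWithValue("g", geom); // written as a PostGIS geometry by the plugin
            cmd.ExecuteNonQuery();
        }
    }
}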

Related

How to call a stored procedure in Entity Framework Core with input and output parameters using MySQL

I am using ASP.NET Core 2.2 with Entity Framework Core 2.2.6 and Pomelo.EntityFrameworkCore.MySql 2.2.0 for connectivity with MySQL. I have a stored procedure which takes 3 input parameters and 1 output parameter. I am able to call it in MySQL Workbench like this:
CALL GetTechniciansByTrade('Automobile', 1, 10, @total);
SELECT @total;
Now I want to call this using Entity Framework Core. The code I am currently using is:
var outputParameter = new MySqlParameter("@PageCount", MySqlDbType.Int32);
outputParameter.Direction = System.Data.ParameterDirection.Output;
var results = await _context.GetTechnicians.FromSql("Call GetTechniciansByTrade(@MyTrade, @PageIndex, @PageSize, @PageCount OUT)",
    new MySqlParameter("@MyTrade", Trade),
    new MySqlParameter("@PageIndex", PageIndex),
    new MySqlParameter("@PageSize", PageSize),
    outputParameter).ToListAsync();
int PageCount = (int)outputParameter.Value;
The exception I am currently getting is:
Only ParameterDirection.Input is supported when CommandType is Text (parameter name: @PageCount)
Can you try the things below?
Use EXEC instead of CALL:
var results = await _context.GetTechnicians.FromSql("EXEC GetTechniciansByTrade(@MyTrade, @PageIndex, @PageSize, @PageCount OUTPUT)"
Select PageCount in the stored procedure.
I got this information from this GitHub issue.
I found the solution using @matt-g's suggestion, based on this question.
I had to use ADO.NET for this:
var technicians = new List<TechnicianModel>();
using (MySqlConnection lconn = new MySqlConnection(_context.Database.GetDbConnection().ConnectionString))
{
    lconn.Open();
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = lconn;
        cmd.CommandText = "GetTechniciansByTrade"; // The name of the stored proc
        cmd.CommandType = CommandType.StoredProcedure; // It is a stored proc
        cmd.Parameters.AddWithValue("@Trade", Trade);
        cmd.Parameters.AddWithValue("@PageIndex", PageIndex);
        cmd.Parameters.AddWithValue("@PageSize", PageSize);
        cmd.Parameters.Add("@PageCount", MySqlDbType.Int32); // Add (not AddWithValue) so the type is set rather than passed as a value
        cmd.Parameters["@PageCount"].Direction = ParameterDirection.Output; // from System.Data
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                technicians.Add(new TechnicianModel()
                {
                    City = reader["City"].ToString(),
                    ExperienceYears = reader["ExperienceYears"] != null ? Convert.ToInt32(reader["ExperienceYears"]) : 0,
                    Id = Guid.Parse(reader["Id"].ToString()),
                    Name = reader["Name"].ToString(),
                    Qualification = reader["Qualification"].ToString(),
                    Town = reader["Town"].ToString()
                });
            }
        }
        Object obj = cmd.Parameters["@PageCount"].Value;
        var lParam = (Int32)obj; // more useful datatype
    }
}

Best approach to populate a DynamoDB table

Please keep in mind this is an open question; I am not looking for a specific answer, just approaches and routes I can take.
Essentially I am getting a CSV file from my AWS S3 bucket. I am able to get it successfully using:
AmazonS3 s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
S3Object object = s3Client.getObject(
new GetObjectRequest(bucketName, key));
Now I want to populate a DynamoDB table using this JSON file.
I was confused, as I found all sorts of stuff online.
Here is one suggestion - this approach, however, only reads the file; it does not insert anything into the DynamoDB table.
Here is another suggestion - this approach is a lot closer to what I am looking for; it populates a table from a JSON file.
However, I was wondering: is there a generic way to read any JSON file and populate a DynamoDB table based on it? Also, which approach is best for my case?
Since I originally asked the question I have done more work.
What I have done so far:
I have a CSV file sitting in S3 that looks like this:
name,position,points,assists,rebounds
Lebron James,SF,41,12,11
Kyrie Irving,PG,41,7,5
Stephen Curry,PG,29,8,4
Klay Thompson,SG,31,5,5
I am able to successfully pick it up as an S3Object by doing the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
Now I want to insert this into my DynamoDB table, so I am attempting the following:
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
// after this point I have tried many JSON parsers etc. and did table.put(item) etc. but nothing has worked. I would appreciate some help
For CSV parsing you can use a plain reader, as your file looks quite simple:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
        new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
String line = "";
String cvsSplitBy = ",";
try (BufferedReader br = new BufferedReader(
        new InputStreamReader(objectData, "UTF-8"))) {
    br.readLine(); // skip the header row (name,position,points,assists,rebounds)
    while ((line = br.readLine()) != null) {
        // use comma as separator
        String[] elements = line.split(cvsSplitBy);
        try {
            table.putItem(new Item()
                    .withPrimaryKey("name", elements[0])
                    .withString("position", elements[1])
                    .withInt("points", Integer.parseInt(elements[2])));
                    // ..... add the remaining columns (assists, rebounds) the same way
            System.out.println("PutItem succeeded: " + elements[0]);
        } catch (Exception e) {
            System.err.println("Unable to add user: " + elements[0]);
            System.err.println(e.getMessage());
            break;
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
Depending on the complexity of your CSV, you can use third-party libraries like Apache Commons CSV or OpenCSV.
I'll leave the original answer for parsing JSON:
I would use the Jackson library and, following your code, do the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
        new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
JsonParser parser = new JsonFactory()
        .createParser(objectData);
JsonNode rootNode = new ObjectMapper().readTree(parser);
Iterator<JsonNode> iter = rootNode.iterator();
ObjectNode currentNode;
while (iter.hasNext()) {
    currentNode = (ObjectNode) iter.next();
    String lastName = currentNode.path("lastName").asText();
    String firstName = currentNode.path("firstName").asText();
    int minutes = currentNode.path("minutes").asInt();
    // read all attributes from your JSON file
    try {
        table.putItem(new Item()
                .withPrimaryKey("lastName", lastName, "firstName", firstName)
                .withInt("minutes", minutes));
        System.out.println("PutItem succeeded: " + lastName + " " + firstName);
    } catch (Exception e) {
        System.err.println("Unable to add user: " + lastName + " " + firstName);
        System.err.println(e.getMessage());
        break;
    }
}
parser.close();
Inserting the records into your table will depend on your schema; I just used an arbitrary example. Either way, this gives you the reading of your file and a way to insert into the DynamoDB table.
Since you asked about different approaches, another possibility is to set up an AWS Data Pipeline.

Checking null cells within a MySQL table

I just wonder if there is a way to check a MySQL table (from ASP.NET) to see whether its cells have values or not.
I've used this to find out whether the database has a row/record there:
string ConnectionString = @"Server=MYSQL5011.Smarterasp.net;Database=db_9d6c52_ahmed;Uid=9d6c52_ahmed;Pwd=******;";
MySqlConnection GetConnection = new MySqlConnection(ConnectionString);
GetConnection.Open();
string VoiceorScreenSearch = "Select User_Voice, User_Screen From User where User_Stat=@UserStat";
MySqlCommand Comand = new MySqlCommand(VoiceorScreenSearch, GetConnection);
Comand.Parameters.AddWithValue(@"UserStat", key);
MySqlDataReader ReadData = Comand.ExecuteReader();
if (ReadData.HasRows)
{
hasrowsornot = true;
}
But I need it to return whether Cell[1] is null or has data. My cells' datatype is BLOB.
Any tips on doing this would be helpful.
Thanks
You can try it like this:
if (!ReadData.IsDBNull(yourFieldOrdinal)) {
    var value = ReadData.GetString(yourFieldOrdinal);
    // some code
}
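Applied to the query above, a minimal sketch (assuming User_Voice is the BLOB cell you want to test) could look like this:
while (ReadData.Read())
{
    int voiceOrdinal = ReadData.GetOrdinal("User_Voice"); // column index of the BLOB field
    if (ReadData.IsDBNull(voiceOrdinal))
    {
        // the cell is NULL - no voice data stored for this row
    }
    else
    {
        byte[] voiceBlob = (byte[])ReadData[voiceOrdinal]; // BLOB columns come back as byte[]
    }
}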

Not able to take a backup of a SQL Server Compact database

I am trying to take a backup of a SQL Server Compact database. I used this code but it is not working:
var srv = new Server(@".\SQLEXPRESS");
SaveFileDialog SD = new SaveFileDialog();
SD.ShowDialog();
Backup BkpDBase = new Backup();
this.Cursor = this.Cursor = Cursors.WaitCursor;
//this.dataGridView1.DataSource = string.Empty;
try
{
string fileName = SD.FileName;
BkpDBase.Action = BackupActionType.Database;
BkpDBase.Database = "TapeDatabase.sdf";
BackupDeviceItem bkpDevice = new BackupDeviceItem(fileName, DeviceType.File);
BkpDBase.Devices.Add(bkpDevice);
BkpDBase.SqlBackup(srv);
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
With SQL Server Compact, just use File.Copy.
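A SQL Server Compact database is a single .sdf file, so copying that file is the backup. A minimal sketch (the source path is a placeholder; make sure all connections to the .sdf are closed first):
string source = @"C:\Data\TapeDatabase.sdf"; // placeholder path to the SQL CE database file
SaveFileDialog sd = new SaveFileDialog { Filter = "SQL CE database|*.sdf" };
if (sd.ShowDialog() == DialogResult.OK)
{
    // a plain file copy is the whole backup for SQL Server Compact
    File.Copy(source, sd.FileName, true);
}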

Restore MySQL from the command line using a Java environment

I am using Java and MySQL for a program, and I am using a script file in order to restore a database.
From Java I am executing a command (under bin): mysql -u root -proot test < c:\test.mysql
It does not run, while if I run it on the command line it executes properly and restores the database.
Does anybody know why this happens?
What is the problem, and why does it not run when I run it from the Java environment?
EXACT SYNTAX:
I am using Process p = Runtime.getRuntime().exec(FilePath)
where the FilePath variable has the value: mysql -u root -proot test < c:\test.mysql
I am using a Windows environment, while if I run FilePath on the command line, it gives the perfect result.
Thanks a lot for any help.
I had the same problem!
Actually the only thing that I could make work (on Windows, I haven't tested other platforms) is using batch files.
Here is the code:
public class MysqlDatabase {
    private int BUFFER = 10485760;
    private String host, port, user, password, db;

    public MysqlDatabase(String host, String port, String user, String password, String db) {
        this.host = host;
        this.port = port;
        this.user = user;
        this.password = password;
        this.db = db;
    }

    public boolean restoreDatabase(String filepath) throws Exception {
        String comando = "mysql " + db + " --host=" + host + " --port=" + port
                + " --user=" + user + " --password=" + password
                + " < " + filepath;
        File f = new File("restore.bat");
        FileOutputStream fos = new FileOutputStream(f);
        fos.write(comando.getBytes());
        fos.close();
        Process run = Runtime.getRuntime().exec("cmd /C start restore.bat ");
        return true;
    }

    public String getFull() throws Exception {
        Process run = Runtime.getRuntime().exec(
                "mysqldump --host=" + host + " --port=" + port
                + " --user=" + user + " --password=" + password
                + " --opt "
                + "" + db);
        InputStream in = run.getInputStream();
        BufferedReader br = new BufferedReader(new InputStreamReader(in));
        StringBuilder temp = new StringBuilder();
        int count;
        char[] cbuf = new char[BUFFER];
        while ((count = br.read(cbuf, 0, BUFFER)) != -1) {
            temp.append(cbuf, 0, count);
        }
        br.close();
        in.close();
        return temp.toString();
    }
}
I think we need some more information. As long as the paths are set up the same, if it will run from the command line, it should run the same from Runtime.exec(). What errors do you see?
Try setting the command up in a script so you can echo the paths and the command output to a file to look at later. On UNIX that would look like:
LOGFILE=my.log
echo $PATH > $LOGFILE
env >> $LOGFILE
mysql -u root -proot test< c:\test.mysql >> $LOGFILE 2>&1
It looks like you're using Windows, so I don't know how to set up the command file exactly this way; what's important is to make sure you're sending the error output of the mysql command to the file.