Unable to load huge files into MySQL using LOAD DATA LOCAL INFILE

I tried to load a large data file (e.g. a 10 MB file) into the DB using LOAD DATA LOCAL INFILE in MySQL from a Windows service. It failed and the Windows service stopped without any exception. I tried with smaller files, such as 2 MB, and it succeeded.
Is there any way I can load huge files into the MySQL DB? And what may be the reason for the Windows service stopping?
Here is my code:
public int UpdateDataBase(string query, string filename, Logger swLog)
{
    int status = 0;
    using (MySqlConnection myconnection = new MySqlConnection(connectionCdrBank))
    {
        MySqlTransaction transaction = null;
        try
        {
            // Connection timeout is set via the connection string; ConnectionTimeout is read-only
            myconnection.Open();
            transaction = myconnection.BeginTransaction();

            // Clear the staging table
            MySqlCommand mycommand = new MySqlCommand("ClearTempTable", myconnection);
            mycommand.CommandType = CommandType.StoredProcedure;
            status = mycommand.ExecuteNonQuery();

            // Run the LOAD DATA LOCAL INFILE statement
            MySqlCommand mycommand1 = new MySqlCommand(query, myconnection);
            mycommand1.CommandTimeout = 1000;
            status = mycommand1.ExecuteNonQuery();

            // Move the loaded rows from the staging table into the detail table
            MySqlCommand mycommand2 = new MySqlCommand("InsertToCdrDetailsTempTable", myconnection);
            mycommand2.CommandType = CommandType.StoredProcedure;
            mycommand2.CommandTimeout = 1000;
            status = Convert.ToInt32(mycommand2.ExecuteScalar());

            transaction.Commit();
            myconnection.Close();
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
            if (transaction != null && myconnection.State == ConnectionState.Open)
                transaction.Rollback();
            status = (ex.Message.IndexOf("Duplicate entry") != -1) ? -1000 : 0;
            swLog.WriteErrorToLog("");
            swLog.WriteErrorToLog("Error while saving: " + ex.Message);
            mail.SentMail("Error while saving:", ex.Message);
            swLog.CloseLogger();
        }
    }
    return status;
}
where query is:
LOAD DATA LOCAL INFILE 'a.txt' INTO TABLE tbl_cdrload FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

I'm loading files of up to 100 MB into MySQL BLOB fields, and that is just my own artificial limit, so 10 MB files should not be a problem.
Did you set max_allowed_packet correctly?
What do you use to load the files? Your own code or some tool? Which MySQL connector?
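If it does turn out to be the packet size, you can check the server's current value from the same connection before issuing the LOAD DATA statement. A minimal sketch, reusing connectionCdrBank from the question (raising the value on the fly needs the SUPER privilege; otherwise set it in my.ini/my.cnf and restart the server):
using (MySqlConnection conn = new MySqlConnection(connectionCdrBank))
{
    conn.Open();
    // Read the current max_allowed_packet from the server
    MySqlCommand check = new MySqlCommand("SHOW VARIABLES LIKE 'max_allowed_packet'", conn);
    using (MySqlDataReader reader = check.ExecuteReader())
    {
        if (reader.Read())
            Console.WriteLine("max_allowed_packet = " + reader.GetString(1));
    }
    // Raising it at runtime (requires SUPER); 64 MB here is an arbitrary example
    // new MySqlCommand("SET GLOBAL max_allowed_packet = 67108864", conn).ExecuteNonQuery();
}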

Related

Xamarin MySQL command shows "Index was outside the bounds of the array"

I'm trying to insert data into a MySQL database. I'm using the MySQL connector, but after I confirm the insertion with my code, the debugger returns "Index was outside the bounds of the array".
Here is my insertion code:
if (co.OpenConnection() == true)
{
    string query = "INSERT INTO inventory (flItem) VALUES('" + ItemName.Text + "');";
    MySqlCommand cmd = new MySqlCommand(query, co.connection);
    //MySqlDataReader reader = cmd.ExecuteReader();
    cmd.ExecuteReader();
    await DisplayAlert("TEST", cmd.CommandText, "OK");
    await Navigation.PushAsync(new AppShell());
}
Thanks for any help in advance.
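As an aside, the usual shape for an INSERT from C# is ExecuteNonQuery with parameters rather than ExecuteReader with concatenated text. A minimal sketch, reusing the names from the snippet above (co, ItemName), without claiming it fixes the index error:
if (co.OpenConnection() == true)
{
    // Parameterized insert; the value is passed separately from the SQL text
    string query = "INSERT INTO inventory (flItem) VALUES(@item);";
    using (MySqlCommand cmd = new MySqlCommand(query, co.connection))
    {
        cmd.Parameters.AddWithValue("@item", ItemName.Text);
        cmd.ExecuteNonQuery();
    }
}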

How to Add a shapefile data to postgis(postgres) using c#

I am trying to add shapefile data to PostGIS using C#:
string path = browse_path.Text;
ProcessStartInfo startInfo = new ProcessStartInfo("CMD.exe");
Process p = new Process();
startInfo.RedirectStandardInput = true;
startInfo.UseShellExecute = false;
startInfo.RedirectStandardOutput = true;
startInfo.RedirectStandardError = true;
p = Process.Start(startInfo);
string chgdir = @"chdir " + @"C:\Program Files\PostgreSQL\9.4\bin\";
p.StandardInput.WriteLine(chgdir);
string pass = @"set PGPASSWORD=postgres";
p.StandardInput.WriteLine(pass);
string cmd = @"shp2pgsql -I -s 4326 " + path + " public.states | psql -U postgres -d postgres";
p.StandardInput.WriteLine(cmd);
p.WaitForExit();
p.Close();
Even after waiting almost 7-8 minutes it is not working, and my .shp file is only 160 KB. The command works fine if I run it in cmd directly rather than through the code.
This is a function I wrote to import shapefiles to PG. It uses the NuGet packages CliWrap and Npgsql, and I just copied shp2pgsql and its dependencies to a project folder 'Tools' so it can be run on a machine that doesn't have PostgreSQL installed. It's a bit messy and you might need to add error handling, but it worked for my needs.
public async static Task<bool> OutputSHPtoPSGLAsync(string shpfile, string host, string user, string pwd, string db, string schema = "public", bool dropIfExists = true, string table = "[SHPNAME]")
{
    FileInfo shp = new FileInfo(shpfile);
    if (!shp.Exists) return false;
    if (table == "[SHPNAME]") table = Path.GetFileNameWithoutExtension(shpfile).ToLower();
    string args = string.Format("{0} {1}.{2}", shpfile, schema, table);
    Command cli = Cli.Wrap(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"tools\shp2pgsql.exe")).WithArguments(args);
    ExecutionResult eo = await cli.ExecuteAsync();
    string sql = eo.StandardOutput;
    if (dropIfExists) sql = sql.Replace("CREATE TABLE", string.Format("DROP TABLE IF EXISTS \"{0}\".\"{1}\";\r\nCREATE TABLE", schema, table));
    string constring = string.Format("Host={0};Username={1};Password={2};Database={3}", host, user, pwd, db);
    using (NpgsqlConnection connection = new NpgsqlConnection(constring))
    {
        connection.Open();
        new NpgsqlCommand(sql, connection).ExecuteNonQuery();
    }
    return true;
}
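For what it's worth, a hypothetical call might look like this (the shapefile path and connection details are placeholders):
// Import states.shp into public.states on a local PostgreSQL/PostGIS instance
bool ok = await OutputSHPtoPSGLAsync(@"C:\data\states.shp", "localhost", "postgres", "postgres", "postgres");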
I was looking at NetTopologySuite, which has type definitions compatible with Npgsql and PostGIS, but it's all still pre-release and I couldn't be bothered working it out.

Best approach to populate DynamoDb table

Please keep in mind this is an open question; I am not looking for a specific answer, just approaches and routes I can take.
Essentially I am getting a CSV file from my AWS S3 bucket. I am able to get it successfully using:
AmazonS3 s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
S3Object object = s3Client.getObject(
new GetObjectRequest(bucketName, key));
Now I want to populate a DynamoDB table using this JSON file.
I was confused, as I found all sorts of stuff online.
Here is one suggestion. However, this approach only reads the file; it does not insert anything into the DynamoDB table.
Here is another suggestion. This approach is a lot closer to what I am looking for; it populates a table from a JSON file.
However, I was wondering: is there a generic way to read any JSON file and populate a DynamoDB table based on it? Also, which approach is best for my case?
Since I originally asked the question I have done more work.
What I have done so far:
I have a CSV file sitting in S3 that looks like this:
name,position,points,assists,rebounds
Lebron James,SF,41,12,11
Kyrie Irving,PG,41,7,5
Stephen Curry,PG,29,8,4
Klay Thompson,SG,31,5,5
I am able to successfully pick it up as an S3Object doing the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
Now I want to insert this into my DynamoDB table, so I am attempting the following.
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
// after this point I have tried many JSON parsers etc. and did table.put(item) etc., but nothing has worked. I would appreciate kind help
For CSV parsing, you can use a plain reader, as your file looks quite simple:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
String line = "";
String cvsSplitBy = ",";
try (BufferedReader br = new BufferedReader(
        new InputStreamReader(objectData, "UTF-8"))) {
    while ((line = br.readLine()) != null) {
        // use comma as separator
        String[] elements = line.split(cvsSplitBy);
        try {
            table.putItem(new Item()
                    .withPrimaryKey("name", elements[0])
                    .withString("position", elements[1])
                    .withInt("points", Integer.parseInt(elements[2]))
                    .....);
            System.out.println("PutItem succeeded: " + elements[0]);
        } catch (Exception e) {
            System.err.println("Unable to add user: " + elements[0]);
            System.err.println(e.getMessage());
            break;
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
Depending on the complexity of your CSV, you can use third-party libraries like Apache Commons CSV or OpenCSV.
I leave the original answer for parsing JSON below.
I would use the Jackson library and, following your code, do the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
//DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");
JsonParser parser = new JsonFactory().createParser(objectData);
JsonNode rootNode = new ObjectMapper().readTree(parser);

Iterator<JsonNode> iter = rootNode.iterator();
ObjectNode currentNode;
while (iter.hasNext()) {
    currentNode = (ObjectNode) iter.next();
    String lastName = currentNode.path("lastName").asText();
    String firstName = currentNode.path("firstName").asText();
    int minutes = currentNode.path("minutes").asInt();
    // read all attributes from your JSON file
    try {
        table.putItem(new Item()
                .withPrimaryKey("lastName", lastName, "firstName", firstName)
                .withInt("minutes", minutes));
        System.out.println("PutItem succeeded: " + lastName + " " + firstName);
    } catch (Exception e) {
        System.err.println("Unable to add user: " + lastName + " " + firstName);
        System.err.println(e.getMessage());
        break;
    }
}
parser.close();
How you insert the records into your table will depend on your schema; I just put in an arbitrary example, but this covers reading your file and the way to insert into the DynamoDB table.
Since you asked about different approaches, another possibilityity is to set up an AWS Data Pipeline.

Headers are also inserted into the database while uploading CSV file data

Here the headers are also being inserted into the database. I am uploading a CSV file with comma-separated data:
string Feedback = string.Empty;
string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;
using (MySqlConnection conn = new MySqlConnection(connString))
{
    var copy = new MySqlBulkLoader(conn);
    conn.Open();
    try
    {
        copy.TableName = "BulkImportDetails";
        copy.FileName = fileName;
        copy.FieldTerminator = ",";
        copy.LineTerminator = @"\n";
        copy.Load();
        Feedback = "Upload complete";
    }
    catch (Exception ex)
    {
        Feedback = ex.Message;
    }
    finally { conn.Close(); }
}
return Feedback;
Use the NumberOfLinesToSkip property to skip the first line, like so:
copy.NumberOfLinesToSkip = 1;
The use of this property is clearly shown in the documentation for MySqlBulkLoader. You should make a habit of reading the documentation to resolve your queries before you post a question here.
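For reference, in the context of the snippet above the property would sit alongside the other loader settings, for example:
var copy = new MySqlBulkLoader(conn)
{
    TableName = "BulkImportDetails",
    FileName = fileName,
    FieldTerminator = ",",
    LineTerminator = @"\n",
    NumberOfLinesToSkip = 1   // skip the CSV header row
};
copy.Load();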

Not able to take a backup of a SQL Server Compact database

I am trying to take a backup of a SQL Server Compact database. I used this code but it is not working:
var srv = new Server(@".\SQLEXPRESS");
SaveFileDialog SD = new SaveFileDialog();
SD.ShowDialog();
Backup BkpDBase = new Backup();
this.Cursor = Cursors.WaitCursor;
//this.dataGridView1.DataSource = string.Empty;
try
{
    string fileName = SD.FileName;
    BkpDBase.Action = BackupActionType.Database;
    BkpDBase.Database = "TapeDatabase.sdf";
    BackupDeviceItem bkpDevice = new BackupDeviceItem(fileName, DeviceType.File);
    BkpDBase.Devices.Add(bkpDevice);
    BkpDBase.SqlBackup(srv);
}
catch (Exception ex)
{
    MessageBox.Show(ex.ToString());
}
With SQL Server Compact, just use File.Copy.
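Since a SQL Server Compact database is a single .sdf file, the backup can be as simple as copying that file. A minimal sketch, reusing the SaveFileDialog from the question (the source path is an assumption):
// The .sdf path is a placeholder; point it at the actual database file
string source = @"C:\Data\TapeDatabase.sdf";
string destination = SD.FileName;      // target chosen in the SaveFileDialog
File.Copy(source, destination, true);  // true = overwrite an existing backup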