Parsing issue with JSON data from SQL Server 2017 to MongoDB

I am working on a C# utility to migrate data from SQL Server 2017 to MongoDB. Below are the steps I am following:
1) Getting data from SQL Server in JSON format (FOR JSON AUTO)
2) Parsing it into a BSON document
3) Inserting it into MongoDB
But I am getting an error while reading the JSON data from SQL. My JSON data is a combination of root attributes as well as nested objects, so it is dynamic data that I want to push as-is to MongoDB.
string jsonData = string.Empty;
foreach (var userId in userIdList)
{
    using (SqlConnection con = new SqlConnection("Data Source=;Initial Catalog=;Integrated Security=True"))
    {
        using (SqlCommand cmd = new SqlCommand("Usp_GetUserdata", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@userId", SqlDbType.Int).Value = userId;
            con.Open();
            var reader = cmd.ExecuteReader();
            var jsonResult = new StringBuilder();
            if (!reader.HasRows)
            {
                jsonResult.Append("[]");
            }
            else
            {
                // FOR JSON splits long results across several rows, so every
                // row must be appended before the JSON can be parsed.
                while (reader.Read())
                {
                    jsonResult.Append(reader.GetValue(0).ToString());
                }
            }
            File.WriteAllText(@"c:\a.txt", jsonResult.ToString());
            // FOR JSON AUTO wraps the result in [ ]; strip the brackets to get a
            // single document. Trim* returns a new string, so assign the result.
            jsonData = jsonResult.ToString().TrimStart('[').TrimEnd(']');
            //Create client connection to our MongoDB database
            var client = new MongoClient(MongoDBConnectionString);
            //Create a session object that is used when leveraging transactions
            var session = client.StartSession();
            //Create the collection object that represents the "EmpData" collection
            var employeeCollection = session.Client.GetDatabase("mongodev").GetCollection<BsonDocument>("EmpData");
            //Begin transaction
            session.StartTransaction();
            try
            {
                // BsonSerializer.Deserialize takes the raw JSON string directly;
                // no intermediate Newtonsoft deserialization is needed.
                var document = BsonSerializer.Deserialize<BsonDocument>(jsonData);
                // Pass the session so the insert participates in the transaction.
                employeeCollection.InsertOne(session, document);
                session.CommitTransaction();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                session.AbortTransaction();
                throw;
            }
        }
    }
}
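If the whole FOR JSON array should be pushed instead of a single trimmed document, the commented-out BsonArray route from the snippet above can be completed along these lines (a minimal sketch, assuming jsonResult holds the complete JSON text and employeeCollection and session are the objects created above):

// Deserialize the whole FOR JSON array and insert every element,
// instead of trimming the [ ] and parsing a single document.
BsonArray pipeline = BsonSerializer.Deserialize<BsonArray>(jsonResult.ToString());
var documents = pipeline.Select(val => val.AsBsonDocument).ToList();
if (documents.Count > 0)
{
    employeeCollection.InsertMany(session, documents);
}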

Related

How to call a stored procedure in Entity Framework Core with input and output parameters using MySQL

I am using ASP.NET Core 2.2 with Entity Framework Core 2.2.6 and Pomelo.EntityFrameworkCore.MySql 2.2.0 for connectivity with MySQL. I have a stored procedure which takes 3 input parameters and 1 output parameter. I am able to call it in MySQL Workbench like:
CALL GetTechniciansByTrade('Automobile', 1, 10, @total);
SELECT @total;
Now I want to call this using Entity Framework Core; the code I am currently using is:
var outputParameter = new MySqlParameter("@PageCount", MySqlDbType.Int32);
outputParameter.Direction = System.Data.ParameterDirection.Output;
var results = await _context.GetTechnicians.FromSql("CALL GetTechniciansByTrade(@MyTrade, @PageIndex, @PageSize, @PageCount OUT)",
    new MySqlParameter("@MyTrade", Trade),
    new MySqlParameter("@PageIndex", PageIndex),
    new MySqlParameter("@PageSize", PageSize),
    outputParameter).ToListAsync();
int PageCount = (int)outputParameter.Value;
The exception I am currently getting is:
Only ParameterDirection.Input is supported when CommandType is Text (parameter name: @PageCount)
Can you try the below things?
Use EXEC instead of CALL:
var results = await _context.GetTechnicians.FromSql("EXEC GetTechniciansByTrade @MyTrade, @PageIndex, @PageSize, @PageCount OUTPUT"
Select PageCount in the stored procedure.
I got this information from this GitHub issue.
I found the solution using @matt-g's suggestion, based on this question.
I had to use ADO.NET for this, as follows:
var technicians = new List<TechnicianModel>();
using (MySqlConnection lconn = new MySqlConnection(_context.Database.GetDbConnection().ConnectionString))
{
    lconn.Open();
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = lconn;
        cmd.CommandText = "GetTechniciansByTrade"; // The name of the stored proc
        cmd.CommandType = CommandType.StoredProcedure; // It is a stored proc
        cmd.Parameters.AddWithValue("@Trade", Trade);
        cmd.Parameters.AddWithValue("@PageIndex", PageIndex);
        cmd.Parameters.AddWithValue("@PageSize", PageSize);
        // Add, not AddWithValue: an output parameter has no input value to set.
        cmd.Parameters.Add("@PageCount", MySqlDbType.Int32);
        cmd.Parameters["@PageCount"].Direction = ParameterDirection.Output; // from System.Data
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                technicians.Add(new TechnicianModel()
                {
                    City = reader["City"].ToString(),
                    ExperienceYears = reader["ExperienceYears"] != null ? Convert.ToInt32(reader["ExperienceYears"]) : 0,
                    Id = Guid.Parse(reader["Id"].ToString()),
                    Name = reader["Name"].ToString(),
                    Qualification = reader["Qualification"].ToString(),
                    Town = reader["Town"].ToString()
                });
            }
        }
        Object obj = cmd.Parameters["@PageCount"].Value;
        var lParam = (Int32)obj; // more useful datatype
    }
}
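Note that the output parameter is read only after the reader's using block ends; output parameter values generally become available only once the result set has been fully consumed and the reader closed.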

Insert data into SQLite from a JSON array - UWP

I have stored JSON array values in a class and am deserializing them using the code below. How can I insert that JSON data into SQLite?
[{"ID":1,"name":"Shyam","class":"a"},{"ID":2,"name":"Bran","class":"b"}]
using Newtonsoft.Json;
using SQLitePCL;
var StudentJSON = await response.Content.ReadAsStringAsync();
var jsonData = JsonConvert.DeserializeObject<List<StudentClass>>(StudentJSON);
using (var connection = new SQLiteConnection(Windows.Storage.ApplicationData.Current.LocalFolder.Path + "\\Student_DB.sqlite"))
{
    using (var statement = connection.Prepare(@"INSERT INTO Student (ID, name, class) VALUES (?, ?, ?);"))
    {
        // Inserts data.
    }
}
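To fill in the "// Inserts data." placeholder, a minimal sketch using the same SQLitePCL prepared-statement API (assuming StudentClass exposes ID, Name, and Class properties; adjust to your actual property names):

foreach (var student in jsonData)
{
    // Bind parameters by 1-based index, run the insert, then reset for the next row.
    statement.Bind(1, student.ID);
    statement.Bind(2, student.Name);
    statement.Bind(3, student.Class);
    statement.Step();
    statement.Reset();
    statement.ClearBindings();
}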
Using 'SQLite.Net' you can insert data into a SQLite table as follows:
// Insert the new student record in the Student table.
string dbPath = Path.Combine(Windows.Storage.ApplicationData.Current.LocalFolder.Path, "StudentDb.sqlite");
public void Insert(Student std)
{
using (SQLite.Net.SQLiteConnection conn = new SQLite.Net.SQLiteConnection(new SQLite.Net.Platform.WinRT.SQLitePlatformWinRT(), dbPath))
{
conn.RunInTransaction(() =>
{
conn.Insert(std);
});
}
}
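As a follow-up, the whole deserialized list can be inserted in one call; a sketch assuming jsonData is the List<StudentClass> from the question and the class is mapped to the Student table:

using (var conn = new SQLite.Net.SQLiteConnection(new SQLite.Net.Platform.WinRT.SQLitePlatformWinRT(), dbPath))
{
    // InsertAll wraps the individual inserts in a single transaction.
    conn.InsertAll(jsonData);
}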

Best approach to populate a DynamoDB table

Please keep in mind this is an open question and I am not looking for a specific answer, just approaches and routes I can take.
Essentially I am getting a CSV file from my AWS S3 bucket. I am able to get it successfully using:
AmazonS3 s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
S3Object object = s3Client.getObject(
        new GetObjectRequest(bucketName, key));
Now I want to populate a DynamoDB table using this JSON file. I was confused, as I found all sorts of stuff online.
Here is one suggestion - this approach, however, only reads the file; it does not insert anything into the DynamoDB table.
Here is another suggestion - this approach is a lot closer to what I am looking for; it populates a table from a JSON file.
However, I was wondering: is there a generic way to read any JSON file and populate a DynamoDB table based on it? Also, for my case, which approach is best?
Since I originally asked the question, I have done more work.
What I have done so far:
I have a CSV file sitting in S3 that looks like this:
name,position,points,assists,rebounds
Lebron James,SF,41,12,11
Kyrie Irving,PG,41,7,5
Stephen Curry,PG,29,8,4
Klay Thompson,SG,31,5,5
I am able to successfully pick it up as an S3Object by doing the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
        new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
Now I want to insert this into my DynamoDB table, so I am attempting the following:
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
Table table = dynamoDB.getTable("MyTable");
// After this point I have tried many JSON parsers etc. and did table.putItem(item),
// but nothing has worked. I would appreciate kind help.
For CSV parsing, you can use a plain reader, as your file looks quite simple:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
        new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
Table table = dynamoDB.getTable("MyTable");
String line = "";
String csvSplitBy = ",";
try (BufferedReader br = new BufferedReader(
        new InputStreamReader(objectData, "UTF-8"))) {
    br.readLine(); // skip the header row (name,position,points,...)
    while ((line = br.readLine()) != null) {
        // use comma as separator
        String[] elements = line.split(csvSplitBy);
        try {
            table.putItem(new Item()
                    .withPrimaryKey("name", elements[0])
                    .withString("position", elements[1])
                    .withInt("points", Integer.parseInt(elements[2]))
                    .....);
            System.out.println("PutItem succeeded: " + elements[0]);
        } catch (Exception e) {
            System.err.println("Unable to add user: " + elements[0]);
            System.err.println(e.getMessage());
            break;
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
Depending on the complexity of your CSV, you can use third-party libraries like Apache Commons CSV or OpenCSV.
I leave the original answer for parsing JSON below.
I would use the Jackson library and, following your code, do the following:
AmazonS3 s3client = new AmazonS3Client(/**new ProfileCredentialsProvider()*/);
S3Object object = s3client.getObject(
        new GetObjectRequest("lambda-function-bucket-blah-blah", "nba.json"));
InputStream objectData = object.getObjectContent();
AmazonDynamoDBClient dbClient = new AmazonDynamoDBClient();
dbClient.setRegion(Region.getRegion(Regions.US_BLAH_1));
DynamoDB dynamoDB = new DynamoDB(dbClient);
Table table = dynamoDB.getTable("MyTable");
JsonParser parser = new JsonFactory().createParser(objectData);
JsonNode rootNode = new ObjectMapper().readTree(parser);
Iterator<JsonNode> iter = rootNode.iterator();
ObjectNode currentNode;
while (iter.hasNext()) {
    currentNode = (ObjectNode) iter.next();
    String lastName = currentNode.path("lastName").asText();
    String firstName = currentNode.path("firstName").asText();
    int minutes = currentNode.path("minutes").asInt();
    // read all attributes from your JSON file
    try {
        table.putItem(new Item()
                .withPrimaryKey("lastName", lastName, "firstName", firstName)
                .withInt("minutes", minutes));
        System.out.println("PutItem succeeded: " + lastName + " " + firstName);
    } catch (Exception e) {
        System.err.println("Unable to add user: " + lastName + " " + firstName);
        System.err.println(e.getMessage());
        break;
    }
}
parser.close();
Inserting the records into your table will depend on your schema; I just put an arbitrary example, but either way this covers reading your file and inserting into the DynamoDB table.
As you asked about different approaches, another possibility is to set up an AWS Data Pipeline.

Headers are also being inserted into the database when uploading CSV file data

Here the headers are also being inserted into the database when uploading a CSV file with comma-separated data:
string Feedback = string.Empty;
string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;
using (MySqlConnection conn = new MySqlConnection(connString))
{
    var copy = new MySqlBulkLoader(conn);
    conn.Open();
    try
    {
        copy.TableName = "BulkImportDetails";
        copy.FileName = fileName;
        copy.FieldTerminator = ",";
        copy.LineTerminator = @"\n";
        copy.Load();
        Feedback = "Upload complete";
    }
    catch (Exception ex)
    {
        Feedback = ex.Message;
    }
    finally { conn.Close(); }
}
return Feedback;
Use the NumberOfLinesToSkip property to skip the first line, like so:
copy.NumberOfLinesToSkip = 1;
The use of this property is clearly shown in the documentation for MySqlBulkLoader. Make a habit of reading the documentation to resolve your queries before posting a question here.
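In context, the loader setup from the question then looks like this (object-initializer form; behavior is identical to setting the properties one by one):

var copy = new MySqlBulkLoader(conn)
{
    TableName = "BulkImportDetails",
    FileName = fileName,
    FieldTerminator = ",",
    LineTerminator = @"\n",
    NumberOfLinesToSkip = 1 // skip the header row
};
copy.Load();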

Convert SqlCommand Output to List<MyType>?

I am using an ADO.NET SqlCommand with a single SqlDbType.Structured parameter to send a table-valued parameter to a sproc. The sproc returns many rows, which I need to get into a strongly-typed List<MyType>. What is the best way to convert the result set (whether a DataTable from a DataAdapter or DataReader bits) into List<MyType>?
Thanks.
You can use LINQ with a DataReader:
var list = reader.Cast<IDataRecord>()
.Select(dr => new YourType { Name = dr.GetString(0), ... })
.ToList();
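For context, here is a minimal sketch of that projection inside the full command setup; the sproc name MySproc, the @Items parameter, and the myDataTable TVP source are placeholder assumptions, not from the question:

var list = new List<YourType>();
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("MySproc", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // Structured = table-valued parameter; myDataTable matches the table type's columns.
    cmd.Parameters.Add(new SqlParameter("@Items", SqlDbType.Structured) { Value = myDataTable });
    con.Open();
    using (var reader = cmd.ExecuteReader())
    {
        list = reader.Cast<IDataRecord>()
                     .Select(dr => new YourType { Name = dr.GetString(0) })
                     .ToList();
    }
}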
The most efficient way is using a DataReader:
var items = new LinkedList<MyClass>();
using (var connection = GetConnection())
{
    using (var cmd = connection.CreateCommand())
    {
        cmd.CommandText = "... your SQL statement ...";
        // ... add parameters
        connection.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // accessing values via a number index is most efficient
            // gets the index of the column with name "PrimaryKey"
            var ndxPrimaryKey = reader.GetOrdinal("PrimaryKey");
            var ndxColumn1 = reader.GetOrdinal("Column1");
            var ndxColumn2 = reader.GetOrdinal("Column2");
            while (reader.Read())
            {
                var item = new MyClass();
                // returns the value of column "PrimaryKey" typed as nullable Guid
                item.PrimaryKey = reader.GetValue(ndxPrimaryKey) as Guid?;
                item.Column1 = reader.GetValue(ndxColumn1) as string;
                item.Column2 = reader.GetValue(ndxColumn2) as int?;
                items.AddLast(item);
            }
        }
        connection.Close();
    }
}
return items;
I think you can use Dapper to map a query to a class.
For more information, see my answer in this link.
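A minimal sketch of that Dapper approach (the sproc name MySproc, the table type name dbo.MyTableType, and myDataTable are placeholder assumptions):

using (var connection = new SqlConnection(connectionString))
{
    // Query<T> maps each row to a MyType by matching column names to properties.
    var list = connection.Query<MyType>(
        "MySproc",
        new { Items = myDataTable.AsTableValuedParameter("dbo.MyTableType") },
        commandType: CommandType.StoredProcedure).ToList();
}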