I am working on a C# utility to migrate data from SQL Server 2017 to MongoDB. Below are the steps I am following:
1) Getting data from SQL Server in JSON format (FOR JSON AUTO)
2) Parsing it into a BSON document
3) Inserting it into MongoDB
But I am getting an error while reading the JSON data from SQL Server.
My JSON data is a combination of root attributes as well as nested objects, so it is dynamic data that I want to push as-is to MongoDB.
string jsonData = string.Empty;
foreach (var userId in userIdList)
{
    using (SqlConnection con = new SqlConnection("Data Source=;Initial Catalog=;Integrated Security=True"))
    {
        using (SqlCommand cmd = new SqlCommand("Usp_GetUserdata", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@userId", SqlDbType.Int).Value = userId;
            con.Open();
            var reader = cmd.ExecuteReader();
            var jsonResult = new StringBuilder();
            if (!reader.HasRows)
            {
                jsonResult.Append("[]");
            }
            else
            {
                // FOR JSON streams long output across multiple rows,
                // so concatenate every row before parsing
                while (reader.Read())
                {
                    jsonResult.Append(reader.GetValue(0));
                }
                jsonData = jsonResult.ToString();
                File.WriteAllText(@"c:\a.txt", jsonData); // debug dump
                // FOR JSON AUTO wraps the result in [ ]; strip the brackets
                // so a single object can be parsed as one document
                jsonData = jsonData.TrimStart('[').TrimEnd(']');
                // Create client connection to our MongoDB database
                var client = new MongoClient(MongoDBConnectionString);
                // Create a session object that is used when leveraging transactions
                var session = client.StartSession();
                // Create the collection object that represents the "EmpData" collection
                var employeeCollection = session.Client.GetDatabase("mongodev").GetCollection<BsonDocument>("EmpData");
                // Begin transaction
                session.StartTransaction();
                try
                {
                    var document = BsonSerializer.Deserialize<BsonDocument>(jsonData);
                    employeeCollection.InsertOne(document);
                    session.CommitTransaction();
                }
                catch (Exception e)
                {
                    Console.WriteLine(e);
                    session.AbortTransaction();
                    throw;
                }
            }
        }
    }
}
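If the stored procedure returns a JSON array with several documents, an alternative is to deserialize the whole array and insert every element in one call instead of trimming the brackets. A minimal sketch, assuming jsonResult still holds the untrimmed [ ... ] array and the same employeeCollection:

// Parse the whole FOR JSON array and insert each element as its own document
BsonArray pipeline = BsonSerializer.Deserialize<BsonArray>(jsonResult.ToString());
var documents = pipeline.Select(val => val.AsBsonDocument);
employeeCollection.InsertMany(documents);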
How do I format my JSON data and/or change my function so that it gets stored as columns in Azure Table storage?
I am sending a json string to the IoT hub:
{"ts":"2017-03-31T02:14:36.426Z","timeToConnect":"78","batLevel":"83.52","vbat":"3.94"}
I run the sample function (in the Azure Function App module) to transfer the data from the IoT hub into my storage account:
'use strict';
// This function is triggered each time a message is received in the IoT Hub.
// The message payload is persisted in an Azure Storage Table.
var moment = require('moment');

module.exports = function (context, iotHubMessage) {
    context.log('Message received: ' + JSON.stringify(iotHubMessage));
    context.bindings.deviceData = {
        "partitionKey": moment.utc().format('YYYYMMDD'),
        "rowKey": moment.utc().format('hhmmss') + process.hrtime()[1] + '',
        "message": JSON.stringify(iotHubMessage)
    };
    context.done();
};
But in my storage table, the payload shows up as a single string rather than getting split into columns (as seen in Storage Explorer).
How do I get it into columns for ts, timeToConnect, batLevel, and vbat?
In case anyone is looking for a solution in C#:
private static async Task ProcessMessage(string message, DateTime enqueuedTime)
{
    var deviceData = JsonConvert.DeserializeObject<JObject>(message);

    var dynamicTableEntity = new DynamicTableEntity();
    dynamicTableEntity.RowKey = enqueuedTime.ToString("yyyy-MM-dd HH:mm:ss.fff");

    foreach (KeyValuePair<string, JToken> keyValuePair in deviceData)
    {
        if (keyValuePair.Key.Equals("MyPartitionKey"))
        {
            dynamicTableEntity.PartitionKey = keyValuePair.Value.ToString();
        }
        else if (keyValuePair.Key.Equals("Timestamp"))
        {
            // A property named "Timestamp" has to be stored in a differently named
            // column, because the "Timestamp" column is filled automatically when
            // a row is added to table storage.
            dynamicTableEntity.Properties.Add("MyTimestamp", EntityProperty.CreateEntityPropertyFromObject(keyValuePair.Value));
        }
        else
        {
            dynamicTableEntity.Properties.Add(keyValuePair.Key, EntityProperty.CreateEntityPropertyFromObject(keyValuePair.Value));
        }
    }

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("myStorageConnectionString");
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
    CloudTable table = tableClient.GetTableReference("myTableName");
    table.CreateIfNotExists();

    var tableOperation = TableOperation.Insert(dynamicTableEntity);
    await table.ExecuteAsync(tableOperation);
}
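For reference, a hypothetical call site; the JSON mirrors the message from the question, with the MyPartitionKey field this code expects added. The enqueued time would normally come from the message's system properties:

string message = "{\"MyPartitionKey\":\"device01\",\"ts\":\"2017-03-31T02:14:36.426Z\",\"batLevel\":\"83.52\"}";
await ProcessMessage(message, DateTime.UtcNow);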
How do I get it into columns for ts, timeToConnect, batLevel, and
vbat?
To get these attributes as separate columns in the table, you would need to flatten the object and store each property separately (currently you are converting the entire object into a string and storing that string).
Please try the following code:
var moment = require('moment');

module.exports = function (context, iotHubMessage) {
    context.log('Message received: ' + JSON.stringify(iotHubMessage));
    var deviceData = {
        "partitionKey": moment.utc().format('YYYYMMDD'),
        "rowKey": moment.utc().format('hhmmss') + process.hrtime()[1] + ''
    };
    // copy each attribute of the message into its own column
    Object.keys(iotHubMessage).forEach(function (key) {
        deviceData[key] = iotHubMessage[key];
    });
    context.bindings.deviceData = deviceData;
    context.done();
};
Please note that I have not tried to execute this code, so it may contain some errors.
I have a web API returning 117k JSON objects.
Edit: The API calls MySQL to fetch 117k rows of data, puts them into an IEnumerable, and returns them as JSON.
All I see is
[{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},... the entire page...
I wanted to ask what is happening and how you would handle a large JSON transfer. I would prefer to get it all in one go to avoid querying back and forth (latency).
The function call is this:
public IEnumerable<Score> Get(int id)
{
    string mConnectionString = System.Configuration.ConfigurationManager.AppSettings["mysqlConnectionString"];
    List<Score> returnedRows = new List<Score>();
    if (String.IsNullOrEmpty(mConnectionString))
    {
        return returnedRows;
    }
    try
    {
        // prepare the dump query (parameterized to avoid SQL injection)
        string query = "SELECT * FROM score WHERE id = @id;";
        using (MySqlConnection mConn = new MySqlConnection(mConnectionString))
        {
            using (MySqlCommand dumpCmd = new MySqlCommand(query, mConn))
            {
                dumpCmd.Parameters.AddWithValue("@id", id);
                mConn.Open();
                using (MySqlDataReader mReader = dumpCmd.ExecuteReader())
                {
                    if (mReader.HasRows)
                    {
                        while (mReader.Read())
                        {
                            // there are 20+ columns; at least the primary keys are not null
                            string[] rowCols = new string[mReader.FieldCount];
                            for (int i = 0; i < rowCols.Length; ++i)
                            {
                                // guard against NULL columns, which GetString cannot handle
                                rowCols[i] = mReader.IsDBNull(i) ? null : mReader.GetString(i);
                            }
                            returnedRows.Add(new Score(rowCols));
                        }
                        return returnedRows;
                    }
                    // should return a 404 because nothing was found
                }
            }
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        return returnedRows;
    }
    return returnedRows;
}
Either mReader.GetString(i) is returning null or you have no data in the columns.
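One quick way to verify is to log each column with an explicit NULL check before building the Score; a minimal sketch against the reader from the question:

for (int i = 0; i < mReader.FieldCount; ++i)
{
    // IsDBNull distinguishes NULL columns from genuinely empty strings
    string value = mReader.IsDBNull(i) ? "<NULL>" : mReader.GetString(i);
    Console.WriteLine(mReader.GetName(i) + " = " + value);
}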
I am trying to update my SQL database using a data adapter in a while loop. The first loop is fine, but after I update the database, in the second loop the row comes back blank from the select query. Does the adapter lock the database?
while (true)
{
    using (MySqlConnection dbConnection = new MySqlConnection(connectionString))
    {
        using (MySqlCommand selectCommand = dbConnection.CreateCommand())
        {
            selectCommand.CommandText = "SELECT * FROM Status";
            using (MySqlDataAdapter adapter = new MySqlDataAdapter(selectCommand))
            {
                using (MySqlCommandBuilder builder = new MySqlCommandBuilder(adapter))
                {
                    using (DataTable table = new DataTable())
                    {
                        adapter.Fill(table);
                        // count is assumed to be declared and incremented outside this snippet
                        table.Rows[0][1] = "data-" + count.ToString();
                        adapter.Update(table);
                    }
                }
            }
        }
    }
    System.Threading.Thread.Sleep(1000);
}
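One way to check whether anything is being held between iterations is to read the row back on a fresh connection right after the update; a minimal sketch, assuming the same Status table and connection string:

using (MySqlConnection check = new MySqlConnection(connectionString))
using (MySqlCommand cmd = new MySqlCommand("SELECT * FROM Status", check))
{
    check.Open();
    using (MySqlDataReader r = cmd.ExecuteReader())
    {
        while (r.Read())
        {
            // dump the column that was just updated
            Console.WriteLine(r.GetValue(1));
        }
    }
}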
I am using an ADO.NET SqlCommand with a single SqlDbType.Structured parameter to send a table-valued parameter to a sproc. The sproc returns many rows, which I need to get into a strongly-typed List<T>. What is the best way to convert the result set (whether a DataTable from a DataAdapter or DataReader bits) into a List<T>?
Thanks.
You can use LINQ with a DataReader:
var list = reader.Cast<IDataRecord>()
.Select(dr => new YourType { Name = dr.GetString(0), ... })
.ToList();
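For context, a sketch of the surrounding call with the table-valued parameter from the question; the sproc name dbo.GetByIds, the table type dbo.IdList, and YourType's single column are assumptions:

var ids = new DataTable();
ids.Columns.Add("Id", typeof(int));
ids.Rows.Add(1);
ids.Rows.Add(2);

using (var connection = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetByIds", connection) { CommandType = CommandType.StoredProcedure })
{
    var p = cmd.Parameters.Add("@Ids", SqlDbType.Structured);
    p.TypeName = "dbo.IdList"; // the table type defined on the server
    p.Value = ids;

    connection.Open();
    using (var reader = cmd.ExecuteReader())
    {
        var list = reader.Cast<IDataRecord>()
                         .Select(dr => new YourType { Name = dr.GetString(0) })
                         .ToList();
    }
}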
The most efficient way is using a DataReader:
var items = new LinkedList<MyClass>();
using (var connection = GetConnection())
{
    using (var cmd = connection.CreateCommand())
    {
        cmd.CommandText = "... your SQL statement ...";
        // ... add parameters
        connection.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // accessing values via number index is most efficient;
            // GetOrdinal gets the index of the column with the given name
            var ndxPrimaryKey = reader.GetOrdinal("PrimaryKey");
            var ndxColumn1 = reader.GetOrdinal("Column1");
            var ndxColumn2 = reader.GetOrdinal("Column2");
            while (reader.Read())
            {
                var item = new MyClass();
                // returns value of column "PrimaryKey" typed as nullable Guid
                item.PrimaryKey = reader.GetValue(ndxPrimaryKey) as Guid?;
                item.Column1 = reader.GetValue(ndxColumn1) as string;
                item.Column2 = reader.GetValue(ndxColumn2) as int?;
                items.AddLast(item);
            }
        }
        connection.Close();
    }
}
return items;
I think you can use Dapper to map a query result to a class.
For more information, see my answer in this link.
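For what it's worth, a minimal sketch of the Dapper approach; the Score type, connection string, and SQL are assumptions, and Dapper maps columns to properties by name:

using Dapper;

using (var connection = new SqlConnection(connectionString))
{
    // Query<T> materializes each row into a Score instance
    List<Score> scores = connection
        .Query<Score>("SELECT * FROM score WHERE id = @id", new { id = 42 })
        .ToList();
}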