I'm creating an application that lets you manage various data. The application is designed to run on a network with multiple users, so I decided to rely on the DataTable.
I have written my own class to manage MySQL database operations, but I still haven't found a streamlined way to send the DataTable to the MySQL database.
Currently I do this:
Dim SQLStm As String 'variable for the SQL query
Dim SQLManager As New ER.DB.ERMysql
For Each Riga As DataRow In Datatable.Rows
    'example query, parameterized to avoid broken quoting and SQL injection
    SQLStm = "INSERT INTO test(Name, Phone) VALUES(@Name, @Phone)"
    Try
        Dim CMD As New MySqlCommand
        CMD.Connection = connection
        CMD.CommandText = SQLStm
        CMD.Parameters.AddWithValue("@Name", Riga("Name"))
        CMD.Parameters.AddWithValue("@Phone", Riga("Phone"))
        CMD.ExecuteNonQuery()
    Catch ex As Exception
        'log the error rather than swallowing it silently
    End Try
Next
End Sub
That is, I scan all the rows and send them to the database one at a time. Is there a better way to accomplish this?
Thanks to all
I would say the best way to do this would be via XML. Convert the DataTable to XML format and pass it to a procedure that accepts XML. This saves the process from running once per row of the DataTable; it is all done in one go. The way you are doing it now will not scale well for large data sets, but XML scales far better.
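As a rough sketch of the client side only (the procedure name `import_test` and its parameter are hypothetical; the server-side procedure that parses the XML is up to you):

```vb
' Minimal sketch: serialize the DataTable to an XML string and hand it
' to a server-side procedure in a single call. Make sure TableName is
' set, or WriteXml will throw.
Dim sw As New IO.StringWriter()
Datatable.TableName = "test"
Datatable.WriteXml(sw, XmlWriteMode.IgnoreSchema)
Dim xmlPayload As String = sw.ToString()

Dim cmd As New MySqlCommand("CALL import_test(@xml)", connection)
cmd.Parameters.AddWithValue("@xml", xmlPayload)
cmd.ExecuteNonQuery()
```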
Instead of performing a DB insert for each row, build the whole query string first and then run it as one large INSERT command. It's much faster.
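A hedged sketch of that approach (the parameter names are illustrative; MySQL's multi-row VALUES syntax is assumed):

```vb
' Build one multi-row INSERT instead of one command per row.
Dim values As New List(Of String)
Dim cmd As New MySqlCommand()
cmd.Connection = connection
Dim i As Integer = 0
For Each row As DataRow In Datatable.Rows
    values.Add(String.Format("(@n{0}, @p{0})", i))
    cmd.Parameters.AddWithValue("@n" & i, row("Name"))
    cmd.Parameters.AddWithValue("@p" & i, row("Phone"))
    i += 1
Next
cmd.CommandText = "INSERT INTO test(Name, Phone) VALUES " &
    String.Join(", ", values.ToArray())
cmd.ExecuteNonQuery()
```

For very large DataTables you would split this into batches of, say, a few hundred rows per command to stay under the server's max packet size.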
I am using an Access database to generate reports on a relatively big dataset. As I don't want to wait forever, and Access also has certain size limitations, I execute a complex SQL query on my server.
Now I would like to use the result of this complex SQL query inside Access to do some final processing and then display it in a form.
I know how to connect Access directly to database tables, and I know how to work from there. But here I have a gap: I have a SQL query returning a few thousand rows and receive the result as a recordset. How do I make Access accept this within its own architecture?
Dim SQL As String
Dim conn As ADODB.Connection
Dim rs As ADODB.Recordset
'SQL query
SQL = " SELECT something quite complex and long"
Set conn = New ADODB.Connection
conn.ConnectionString = "my great server"
conn.Open
Set rs = New ADODB.Recordset
rs.Open SQL, conn
'--- here is my personal gap ---
rs.Close
conn.Close
If your "very complex" SQL code is a stored procedure, you can execute it and return its results using a pass-through query. See here: Two ways to execute a Stored procedure in VBA, Which one is better?
If it's a View you can simply link to the view, just like a table.
create a pass-through query (or a normal one) with your complex request and save it
create an Access make-table query that basically contains
SELECT * INTO myLocalTable FROM myComplexPTQ
done
If you use a normal query rather than a pass-through and you find it slow, I suggest you have a look at http://allenbrowne.com/QueryPerfIssue.html
I design (PHP) db editors in the gaming world as a pastime. I took up the challenge of doing the same thing, but this time with VB (originally VB6, but MySQL is tricky there), so I'm working with 2008.
One of the tables has a massive amount of info, and I only need to display 4 or 5 of its 100+ fields as a search result.
I know how to populate the grid with the whole table, but I do not know how to do it with specific fields only, short of going a very long way around it.
This is my first move from VB6 to VB.NET. Sadly I'm not too impressed; it looks like they (M$) have deviated from "BASIC" and gone for the C++ engine format, which is super ugly (but I digress).
Try
conn.Open()
da = New MySqlDataAdapter(sqlQRY, conn)
Dim cb As MySqlCommandBuilder = New MySqlCommandBuilder(da)
da.Fill(ds, "big_table")
DataGridView1.DataSource = ds
DataGridView1.DataMember = "big_table"
Catch ex As Common.DbException
MsgBox(ex.ToString)
Finally
conn.Close()
End Try
The above works fine, but I do not need all 100+ fields displayed. I just want player name, level, whether they are online, and a few other fields to show; from there I can select a row and process the data elsewhere in the program.
Hope that made sense :-)
edit:
clarification: I need to know how, at run time, to create the datagrid to accept the results of my query so it does NOT display the whole record.
Is it possible you are selecting all fields in your query with a *? Just declare sqlQRY to list only the columns you need, like:
SELECT Player_Name, Level, Online_Status, A_Few_Other_Fields FROM Players;
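Plugged into the code from the question, it would look something like this (the column and table names are taken from the answer above and are assumptions about the actual schema):

```vb
' Select only the columns the grid should show; the DataGridView then
' auto-generates just these columns instead of all 100+.
Dim sqlQRY As String = "SELECT Player_Name, Level, Online_Status FROM Players"
da = New MySqlDataAdapter(sqlQRY, conn)
da.Fill(ds, "big_table")
DataGridView1.DataSource = ds
DataGridView1.DataMember = "big_table"
```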
I have a program written in VB.NET that as part of its function requires the use of a number of database classes.
At the moment the classes are programmed specifically to use objects originating from System.Data.SqlClient and classes such as SqlConnection, SqlCommand, SqlParameter and SqlDataAdapter are used.
My aim is to use the analogous classes from Mysql.Data.MySqlClient (obtained via the Connector/Net download on the MySQL site). These for example would be: MySqlConnection, MySqlCommand, MySqlParameter and MySqlDataAdapter.
Is there some way in the code that I could specify an abstract version of the classes (something like AbstractSqlCommand, AbstractSqlParameter) and pick the correct implementation, SqlCommand or MySqlCommand, based on some config variable?
Dim command As New AbsSqlCommand(sql, connection)
For Each p As AbsSqlParameter In param
command.Parameters.Add(p)
Next
Dim timeout As Integer = 3000
command.CommandTimeout = timeout
Try
connection.Open()
Catch
Throw New Exception("Connection failed")
End Try
Dim Adapter As New AbsSqlDataAdapter(command)
Adapter.Fill(table)
Return table
So in the case above, some kind of global or configuration variable could be used to decide whether AbsSqlCommand actually behaves as a MySqlCommand or a SqlCommand [MSSQL], without having to recode every instantiation of these objects for the particular database platform.
This is really a broad question that will be best answered by a full article like this, but look at
System.Data.Common Namespace
The classes in System.Data.Common are intended to give developers a way to write ADO.NET code that will work against all .NET Framework data providers.
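A hedged sketch of that approach using DbProviderFactories (the provider invariant names shown are the standard ones for SqlClient and Connector/Net; the connection string and query are placeholders):

```vb
Imports System.Data.Common

' Pick the provider from configuration instead of hard-coding SqlCommand
' vs MySqlCommand. "MySql.Data.MySqlClient" requires Connector/Net to be
' registered as a DbProviderFactory (machine.config or app.config).
Dim providerName As String = "System.Data.SqlClient" ' or "MySql.Data.MySqlClient"
Dim factory As DbProviderFactory = DbProviderFactories.GetFactory(providerName)

Dim table As New DataTable()
Using connection As DbConnection = factory.CreateConnection()
    connection.ConnectionString = "..." ' read from config
    connection.Open()

    Using command As DbCommand = connection.CreateCommand()
        command.CommandText = "SELECT ... WHERE id = @id"

        ' Parameters are created through the command, not with New SqlParameter
        Dim p As DbParameter = command.CreateParameter()
        p.ParameterName = "@id"
        p.Value = 1
        command.Parameters.Add(p)

        Dim adapter As DbDataAdapter = factory.CreateDataAdapter()
        adapter.SelectCommand = command
        adapter.Fill(table)
    End Using
End Using
```

The code only ever touches the abstract DbConnection/DbCommand/DbParameter/DbDataAdapter types, so switching databases is just a change to the providerName string.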
Well, you could have two LINQ to SQL instances (there is no constraint on the number of instances and classes therein), but I have no experience using LINQ to SQL with MySQL, so I don't know how well it works. I'd be inclined to set up a test project, add a LINQ to SQL set of data classes, and try to connect to a MySQL database to see what happens.
I want to create a .NET application that provides an environment, perhaps a multiline text box, where the user can paste a predefined SP and execute it. After execution, this SP should be created in the DB.
Any ideas are invited..
I assume you want to do this for an internal support application or something like that, not for the end user, right?
Anyway, you need to be more specific, but the way you create a procedure doesn't differ from the way you run an INSERT statement, for example.
Simple example:
SqlConnection objConnection = new SqlConnection(your_connection_string);
SqlCommand objCommand = new SqlCommand(tbProcedureCode.Text, objConnection);
objCommand.CommandType = CommandType.Text;
objConnection.Open();
objCommand.ExecuteNonQuery();
objConnection.Close();
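Since the rest of this thread is VB.NET, here is the same idea as a VB sketch (tbProcedureCode and the connection string are assumptions carried over from the answer above):

```vb
' Execute whatever CREATE PROCEDURE text the user pasted into the textbox.
' Using blocks dispose of the connection and command even if the SQL fails.
Using objConnection As New SqlConnection(yourConnectionString)
    Using objCommand As New SqlCommand(tbProcedureCode.Text, objConnection)
        objCommand.CommandType = CommandType.Text
        objConnection.Open()
        objCommand.ExecuteNonQuery()
    End Using
End Using
```

Note that executing arbitrary pasted SQL this way gives the user full DDL rights on the connection's database, which is why it should stay an internal tool.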
I am running a SSIS package to load say a million rows from a flat file, which uses a script task for complex transformations and a SQL Server table destination. I am trying to figure out the best way (well, ANY way at this stage) to write out to a different table the row count (probably in multiples of 1000 to be more efficient) DURING the data flow processing. This is so that I can determine the percentage of progress throughout a task that might take a few minutes, simply by querying the table periodically.
I can't seem to add any SQL task into the flow, so I'm guessing the only way is to connect to the SQL database inside the .NET script. This seems painful, and I'm not even sure it is possible. Is there a more elegant way? I've seen references to the "Rows Read" performance counter, but I'm not sure where to access this in SSIS, and still not sure how to write it to a SQL table during Data Flow processing.
Any suggestions appreciated.
Glenn
There are two easy options here:
Option 1: use the built-in logging in SSIS and watch the OnProgress event. This can be configured to log to several different outputs, including a relational database and flat files.
See more Here
Option 2: you could add an SSIS script component that fires off notifications to an external system, such as a database table.
I recently solved this in a slightly different manner, which I find superior to using scripting and opening separate connections in code to DBs:
In the source query or a transform shape, add an incremental row count
In a Conditional Split, use a modulo expression (%) to branch whenever the count is a multiple of, for example, 1000; this could be made configurable or based on the source data (for example 0.0% to 100.0% of the data)
Create a log connection manager and use a destination. Control the batching sizes so that rows are immediately committed to the target table.
Why not write a .NET application and you can integrate into that to get information as to where the SSIS package is at.
Basically everything that is sent to the console you can get, and there are event handlers you can attach to to get information while it is processing the package.
Here is a link that may help you to go with this approach:
http://www.programminghelp.com/database/sqlserver/sql-server-integration-services-calling-ssis-package-in-c/
OK, had some success at last.... added a call to the following sub in the script component:
Sub UpdateLoadLog(ByVal Load_ID As Int32, ByVal Row_Count As Int32, ByVal Row_Percent As Int32, ByVal connstr As String)
    Dim Sql As String
    Sql = "update myTable set rows_processed = " & Row_Count & ", rows_processed_percent = " & Row_Percent & " where load_id = " & Load_ID & " and load_log_type = 'SSIS'"

    'Using blocks dispose of the connection and command even if the update fails
    Using dbconn As New OleDbConnection(connstr)
        dbconn.Open()
        Using dbcomm As New OleDbCommand(Sql, dbconn)
            dbcomm.ExecuteNonQuery()
        End Using
    End Using
End Sub
This gets executed every 1000 rows and successfully updates the table. The row already exists, as it is created in the control flow at the start of the package and updated again in the control flow at the very end with the final row count and 100%.
Thanks for all your suggestions guys.
Is the application consuming the row count a .NET application? When it comes to sharing information between applications there are many accepted practices; maybe you should take a look at them. For your particular case, if it is a .NET application that consumes this row count to calculate progress, you could store the information somewhere other than a DB table: the file system, a web service, Windows environment variables, a log (like the Windows event log), etc. are some that come to mind. I think updating a Windows environment variable with the row count from within your script component would be a good enough solution, just like using a global variable to share data between two functions inside a program. :)
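As a sketch of the environment-variable idea (the variable name is arbitrary; note that process-level variables are not visible to other processes, so a user- or machine-level target is needed, and machine-level typically requires elevated rights):

```vb
' Inside the script component: publish progress where another process can read it.
Environment.SetEnvironmentVariable("SSIS_ROWS_PROCESSED",
    rowCount.ToString(), EnvironmentVariableTarget.User)

' In the consuming .NET application:
Dim processed As String = Environment.GetEnvironmentVariable(
    "SSIS_ROWS_PROCESSED", EnvironmentVariableTarget.User)
```

Be aware that user-level environment variables are written to the registry on every update, so updating only every 1000 rows, as above, is a sensible throttle.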