I am using Entity Framework 4.1 and SQL Express. I have been trying to create models inside the edmx file and, from that, create the tables inside a .mdf file. However, I am unable to get that to work.
However, I am able to get "Update Model from Database" to work, so there doesn't seem to be a problem with the connection string.
What am I doing wrong?
I tend to copy the generated SQL script and execute it myself in SQL Server (Enterprise Manager 2008 in my case); that gives you better feedback and more control.
I haven't really bothered setting it up to execute automatically, because EF sometimes makes mistakes in its scripting (e.g. trying to drop every FK twice: once at the beginning, and then again just before the containing table is dropped).
Also, if you've made a lot of changes or dropped some tables, the script sometimes isn't 100% compatible with the existing database when it comes to deleting things. In that case I just drop all FKs and tables myself (not just the ones the script tells me to) and then execute the script.
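For reference, a minimal T-SQL sketch of that clean-up step, assuming SQL Server 2008: it generates the DROP statements for every foreign key and then every user table, and you copy the generated output and run it before executing the EF script.

    -- Generate DROP statements for every foreign key constraint.
    SELECT 'ALTER TABLE [' + OBJECT_SCHEMA_NAME(parent_object_id) + '].['
           + OBJECT_NAME(parent_object_id) + '] DROP CONSTRAINT [' + name + '];'
    FROM sys.foreign_keys;

    -- Then generate DROP statements for every user table.
    SELECT 'DROP TABLE [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '];'
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE';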
But that's just how I like to do it.
I have an Access database used by a VB6 application, and the whole thing is shared between two computers over a local network, one running Windows 8 and the other Windows 7. There is no internet involved in any way, nor should there be; in fact that is a requirement.
Sorry in advance: I have tried researching this online, but time is short and there is a lot of confusing material out there.
I am creating a WPF app connected to a MySQL DB. I have copied the Access file and imported the contents of the DB into MySQL. Things are a real mess in the imported DB, so I am fixing it up.
What I am confused about is how I am going to make this work on the other machine. Do I:
1. install MySQL there and go through the whole process manually, repeating all the steps and changes I made;
2. make a document/script containing all the changes I have made and run the data through it - and is there even a way to apply that as a whole in a single go;
3. connect both databases together (I don't even know if this is possible)?
Yes: in place of a simple "file share" of the Access file, you are now going to run some kind of SQL server system - in this case MySQL, but it could be PostgreSQL or any other kind of "server" database.
That instance of "sql server" thus has to be installed and set up, and you must ensure that the "box" running that instance of MySQL also allows external connections (the computer's default firewall settings often prevent this).
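As a rough sketch of the MySQL side of that setup (the user, password and database names below are placeholders), you typically create an account that is allowed to connect from other machines and grant it rights on the application database, in addition to opening port 3306 in the firewall:

    -- Allow connections from any host on the network ('%') for this account,
    -- and give it access to the application database.
    CREATE USER 'appuser'@'%' IDENTIFIED BY 'choose-a-password';
    GRANT ALL PRIVILEGES ON myappdb.* TO 'appuser'@'%';
    FLUSH PRIVILEGES;
    -- Also check that bind-address in my.ini / my.cnf is not limited to 127.0.0.1.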
At that point, 2 or 10 different computers on that same network can simply connect to the SQL server. The code of course is going to be VERY similar. You almost certainly used the OleDb provider with Access. However, you can use the ODBC provider, or even the provider from MySQL. Switching providers means you change the connection object, data reader object etc.; however, the "base" .NET types such as DataRow, DataTable or DataSet can remain as before (so you only change the provider). If you have a lot of code based on OleDb, then you could well consider continuing to use that OleDb provider code in .NET, and thus you only change the connection strings to point to MySQL.
If you don't have a lot of code, then by all means adopt the MySQL provider for .NET. But as noted, continuing to use an OleDb provider for MySQL would mean the least amount of code has to change.
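To give a feel for how small that change can be, here are purely illustrative connection strings (the file path, server address, driver name and credentials are made up and depend on what you actually have installed) - the old Access/OleDb string versus strings pointing at MySQL via ODBC or via the Connector/NET provider:

    Access (OleDb):        Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\app.mdb;
    MySQL (ODBC):          Driver={MySQL ODBC 8.0 Unicode Driver};Server=192.168.1.10;Database=myappdb;Uid=appuser;Pwd=secret;
    MySQL (Connector/NET): Server=192.168.1.10;Database=myappdb;Uid=appuser;Pwd=secret;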
As for the MS Access data migration? Well, it's not clear what tools you are using and how you are doing that now. But once you transfer the data to the MySQL server (assuming you have installed and set up MySQL to run on one computer), it is a simple matter to point the .NET connection(s) in your code to MySQL instead of Access. As a result, most if not all of your code logic for working with the tables can remain as before - but as noted, you have to swap out the provider parts in .NET.
Now, if you're REALLY lucky and the .NET code used the ODBC provider? Then all you have to do is change your connection strings. Of course, "some" SQL syntax in your code may have to be tweaked, because - like Oracle, MS SQL Server and PostgreSQL - MySQL has some features and syntax that are different, especially in regard to date/time calculations, DateDiff() etc. But the general SQL you have in your .NET code should continue to run mostly unchanged against MySQL tables.
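As a small illustration of the kind of tweak involved (the table and column names are invented), the Access/SQL Server style of DateDiff has a different shape in MySQL:

    -- Access:      SELECT DateDiff("d", OrderDate, ShipDate) FROM Orders;
    -- SQL Server:  SELECT DATEDIFF(day, OrderDate, ShipDate) FROM Orders;
    -- MySQL: DATEDIFF takes two arguments and returns days; use TIMESTAMPDIFF for other units.
    SELECT DATEDIFF(ShipDate, OrderDate)            AS days_between,
           TIMESTAMPDIFF(HOUR, OrderDate, ShipDate) AS hours_between
    FROM Orders;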
As for how to migrate the data? I think a really good tool for this is, of course, MS Access itself. What you do is get MySQL up and running, then use MS Access to open that database and add linked tables from Access to the MySQL tables.
At that point, you can run append queries from Access to move/send the data to MySQL (an example is sketched below). How much work this is really depends on how many tables, and how many related tables, are in that database. The more complex the database and the greater the number of related tables in Access, the greater the challenge of moving that data up to MySQL.
Transferring Excel data or a small (or even a big) table is a breeze (again, use MS Access and link to the tables on the SQL server). Where things can become messy is when you have, say, 25 tables that are all related, many with cascade deletes and enforced parent-to-child relationships. So the more tables, and especially the more related data tables, the more work such a data migration task will become.
I think MS Access is a really good tool here, since once you set up a connection to MySQL you can execute a TransferDatabase command in Access to send a table up to MySQL, and all the columns and the data types for those columns will be created for you automatically. So not only can Access transfer the data; MORE valuable still, it has the ability to create the target tables on MySQL for you - and that will save you a large amount of time building and setting up the tables on MySQL.
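For example, an append query run inside Access could look like the following (the linked-table and column names are invented; your linked MySQL table will appear under whatever name Access gave the ODBC link):

    INSERT INTO Customers_mysql ( CustomerID, CustomerName, City )
    SELECT CustomerID, CustomerName, City
    FROM Customers;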
I'm kind of new to this sort of problem. I'm developing a web app and changing the DB design, trying to improve it and add new tables.
Until a few days ago we had not published the app, so what I would do was dump all the tables on the server and import my local version. But now we've passed version 1 and users are starting to use it.
So I can't dump the server anymore, yet I still need to update the design of the server DB when I want to publish a new version. What are the best practices here?
I'd like to know how I can manage the differences between local and server in MySQL.
I need to preserve the data on the server and just change the design; the data in the local DB is only for testing.
Before this, all my other apps were small and I would change a single table or column by hand, but I can't keep track of all the changes now, since I might revert many of them later, and coordinating all the team members on this is impossible.
Assuming you are not using a framework that provides a database migration tool, you need to keep track of the changes manually.
Create a folder sql_upgrades (or whatever name you like) in your code repository.
Whenever a team member updates the SQL schema, he creates a file in this folder with the corresponding ALTER statements, and possibly UPDATE, CREATE TABLE etc. So basically the file contains all the statements used to update the dev database.
Name the files so that they're easy to manage and so that statements for the same feature are grouped together. I suggest something like YYYYMMDD-description.sql, e.g. 20150825-queries-for-feature-foobar.sql (see the example file below).
When you push to production, execute the files to upgrade your SQL schema in production. Only execute the files that have been created since your last deployment, and execute them in the order they were created.
Should you need to rollback a file, check the queries it contains, and write queries to undo what was done (drop added columns, re-create dropped columns, etc.). Note that this is "non-trivial", as many changes cannot be rolled back fully (e.g. you can recreate a dropped column, but you will have lost the data inside).
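To make this concrete, a single upgrade file following the naming scheme above might look something like this (the feature, table and column names are invented for illustration):

    -- 20150825-queries-for-feature-foobar.sql
    -- Everything needed to bring the production schema up to date for "foobar".
    ALTER TABLE users ADD COLUMN last_login DATETIME NULL;

    CREATE TABLE foobar_settings (
        id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        user_id INT UNSIGNED NOT NULL,
        value   VARCHAR(255) NOT NULL,
        CONSTRAINT fk_foobar_user FOREIGN KEY (user_id) REFERENCES users (id)
    ) ENGINE=InnoDB;

    -- Data fix that belongs to the same deployment.
    UPDATE users SET last_login = NOW() WHERE last_login IS NULL;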
Many web frameworks (such as Ruby on Rails) have tools that will do exactly this process for you. They usually work together with the ORM provided by the framework. Keeping track of the changes manually in SQL works just as well.
I have a .sql script that contains INSERTs and CREATE TABLE statements. I used the "Create EER Model From Script" option.
It created the tables but I can't see the data inside these tables.
I went to the query menu and tried to run a query, but it gave me an error about not being able to connect to localhost.
Am I doing it right?
As documented under Create EER Model from SQL Script:
Clicking this action item launches the Reverse Engineer SQL Script wizard. This is a multi-stage wizard that enables you to select the script you want to create your model from.
For further information, see Section 7.7.9.1, “Reverse Engineering Using a Create Script”.
Following that link:
However, if you are working with a script that also contains DML statements you need not remove them; they will be ignored.
Instead, you want the Manage Data Import/Export option under Server Administration (within the Workspace section of the Home window).
You are confusing things here. Creating a model from a script is a process where metadata is examined and a model is created that you can then use to modify your schema structure, further design your DB objects and so on. Modeling is a design process for the structure of your schema/DB, so it only deals with metadata. It's also used for documentation (e.g. in teams).
On the other hand, there's normal SQL work with existing DB objects and/or actually creating/deleting/modifying DB objects. To do the latter you must have an understanding of the design of the schema (which you could get by using the modeling part of MySQL Workbench, but not only by that). This is also the place to load a script and run it to insert data and the like.
The error you mentioned regarding the connection is yet another problem, and you need to solve it first to be able to access your server at all. And yes, you have to install a server somewhere first. MySQL Workbench is a tool for working visually with your server(s), as opposed to the MySQL command-line client, which is a pure text interface (but still also a client application for your MySQL servers).
If you are on Windows and want a MySQL server installed locally (e.g. for testing), your best option is to download the MySQL Installer, which greatly simplifies installing any of the tools from the MySQL family (server, client tools, connectors, documentation and more).
We are possibly looking at switching our tables for views in EF 4.3.1.
We are using database-first via the edmx file, so it generates our entities and DbContext.
Has anyone got any tips for remapping our entities from tables to views?
Is this prone to disaster? We've had trouble in the past with updating the edmx file via the designer, where the underlying changes weren't reflected somewhere deep within the code and we ended up with missing columns.
Or will views act very similarly to tables in the EF world?
The designer handles views in a completely different way. First of all, all views used by EF through the designer are read-only unless you map stored procedures or custom SQL commands to the insert, update and delete operations for each entity you want to modify.
Normally, if you have an updatable view, you can simply modify the SSDL part of the EDMX and cheat it into pretending that the view is actually a table (sketched below), but this has two consequences:
You must modify the EDMX directly as XML.
You must not use Update Model from Database any more, because it always deletes the whole SSDL part and creates a new one without your changes - so you must maintain your EDMX manually, or buy some extension for VS which allows you to update only selected tables.
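For illustration only, one common variant of that hand edit looks roughly like the following (the entity set, schema and column names are placeholders, and the exact attributes EF generates can vary by version): you strip the DefiningQuery from the view's EntitySet in the SSDL and describe it as if it were a table.

    <!-- Before: generated for a view (read-only) -->
    <EntitySet Name="CustomerView" EntityType="MyModel.Store.CustomerView"
               store:Type="Views" store:Schema="dbo" store:Name="CustomerView">
      <DefiningQuery>
        SELECT [CustomerView].[Id] AS [Id], [CustomerView].[Name] AS [Name]
        FROM [dbo].[CustomerView] AS [CustomerView]
      </DefiningQuery>
    </EntitySet>

    <!-- After: hand-edited so EF treats the view as an updatable table -->
    <EntitySet Name="CustomerView" EntityType="MyModel.Store.CustomerView"
               store:Type="Tables" Schema="dbo" Table="CustomerView" />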
I have a database (mdb file) that I am currently busy with. I would like to know if it is possible to generate MySQL code that would be used to create this database?
There are a couple of tools you can look at to try to do the conversion.
DataPump
Microsoft DTS (now called SQL Server Integration Services)
Another option might be to generate the MySQL code from the Access DB's metadata, which you can read via JDBC, ODBC, ADO.NET or any other database access technology with metadata support. For this option you need to write a piece of code (a script), so it will only make sense if your Access database has a lot of tables with a lot of columns, or if you are planning to do this task several times.
Of course, using one of the mentioned tools will be faster if it works.
You can certainly write DDL to create and populate a MySQL database from the work that you've already done in Microsoft Access. Just put it in a text file that you execute with MySQL in batch mode and you're all set.
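A minimal sketch of what that text file could contain (the database, table and column names are invented), with the batch invocation shown as a comment:

    -- create_mydb.sql : DDL plus seed data translated from the Access .mdb
    CREATE DATABASE IF NOT EXISTS mydb;
    USE mydb;

    CREATE TABLE customers (
        id   INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
        name VARCHAR(100) NOT NULL
    ) ENGINE=InnoDB;

    INSERT INTO customers (name) VALUES ('Acme Ltd'), ('Contoso');

    -- Run it from the command line in batch mode, e.g.:
    --   mysql -u root -p < create_mydb.sql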
If you intend to keep going with developing both, you'll want to think about how you'll keep the two in synch.