My project (WPF, .NET 4.5, EF6) has to target different DBMSs; so far that's MSSQL, Oracle, MySQL, and Firebird.
I started by creating DBMS-dependent scripts that generate the database(s), and then used Entity Framework's database-first approach to create the models. By having a default edmx based on MSSQL and creating separate SSDL files for the other providers (the differing SSDL files can be configured in the connection strings), it all worked pretty well for all DBMSs.
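For reference, this is roughly how a provider-specific SSDL can be selected in the EF metadata portion of the connection string (server, catalog, and file names here are placeholders, not from my actual setup):

```xml
<connectionStrings>
  <!-- Default model: the MSSQL SSDL embedded as Model.ssdl -->
  <add name="MyContext"
       providerName="System.Data.EntityClient"
       connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=MyDb;Integrated Security=True&quot;" />
  <!-- Same CSDL and MSL, but a MySQL-specific SSDL -->
  <add name="MyContextMySql"
       providerName="System.Data.EntityClient"
       connectionString="metadata=res://*/Model.csdl|res://*/Model.MySql.ssdl|res://*/Model.msl;provider=MySql.Data.MySqlClient;provider connection string=&quot;server=localhost;database=mydb;uid=app;pwd=secret&quot;" />
</connectionStrings>
```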
The problem I see now is maintaining installations/updates for four or more DBMSs across many more customers. We won't send admins to install updates at our customers' sites; we need something like a generic setup/update routine for all of them. It's possible, but you would have to maintain different versions of SQL scripts for each DBMS, plus a setup tool that handles those scripts and knows which to execute for which database (depending on DBMS and current version).
When looking for alternatives, I came across EF Code First Migrations and tried switching to that approach. My attempts so far are based on MSSQL and MySQL.
It all works well as long as I stick to either MSSQL or MySQL. Creating migrations and applying them to existing or non-existing databases all works fine.
But I'm stuck on bringing both systems together. Applying MSSQL-based migrations to MySQL, for example, seems to be impossible. The database gets created, but it's not possible to connect due to type mismatches and so on.
My guess is that the __MigrationHistory table contains a model that was created for MSSQL and is now incompatible with the MySQL provider. Just a theory, but maybe someone knows better.
Does anyone know a solution for this? Is there any way to target different DBMSs with EF?
I can't believe I'm the only one with this problem, but it's really hard to find any information about it.
Any help is appreciated, even pointers to approaches other than using EF this way, or to not using EF at all.
I've had the same situation, but I found a simple workaround: create separate projects for each database system, each with migrations enabled. In your case, two database contexts should inherit from one base context where all the model builders and DbSets are kept.
Example model:
// MyModel.Base.csproj
public class Person { /*..*/ }

public class BaseDbContext : DbContext
{
    public DbSet<Person> People { get; set; }
    /*...*/
}

// MyModel.Sql.csproj
public class SqlDbContext : BaseDbContext { }

// MyModel.MySql.csproj
public class MySqlDbContext : BaseDbContext { }
Maybe having two sets of migration settings isn't such a great idea, but it does allow you to use database-specific scripts in your migrations.
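To sketch what the per-project migration setup might look like (the class names and namespaces are assumptions, and the SQL generator registration assumes the MySQL Connector/NET EF provider is installed):

```csharp
using System.Data.Entity.Migrations;

// MyModel.Sql.csproj: its own migrations configuration, so the scaffolded
// migrations and the __MigrationHistory model stay specific to SQL Server.
internal sealed class SqlConfiguration : DbMigrationsConfiguration<SqlDbContext>
{
    public SqlConfiguration()
    {
        AutomaticMigrationsEnabled = false;
        MigrationsNamespace = "MyModel.Sql.Migrations";
    }
}

// MyModel.MySql.csproj: a parallel configuration targeting MySQL.
internal sealed class MySqlConfiguration : DbMigrationsConfiguration<MySqlDbContext>
{
    public MySqlConfiguration()
    {
        AutomaticMigrationsEnabled = false;
        MigrationsNamespace = "MyModel.MySql.Migrations";
        // Route generated DDL through the MySQL-specific SQL generator.
        SetSqlGenerator("MySql.Data.MySqlClient",
            new MySql.Data.Entity.MySqlMigrationSqlGenerator());
    }
}
```

With this layout, Add-Migration and Update-Database are run once per project, so each provider gets migrations generated in its own SQL dialect.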
I'm still kind of new to Grails (and Groovy), so apologies if this question seems dumb.
I'm trying to access a SQL database, and it seems that I could use SQL commands in the Controller (taken from this StackOverflow question):
import groovy.sql.Sql

class MyFancySqlController {
    def dataSource // the Spring bean "dataSource" is auto-injected

    def list = {
        def db = new Sql(dataSource) // create a groovy.sql.Sql instance with the Grails app's DB
        def result = db.rows("SELECT foo, bar FROM my_view") // perform the query
        [result: result] // return the results as the model
    }
}
I know that if I were to create a domain class with some variables, it would create a database table in SQL:
package projecttracker2

class ListProject {
    String name
    String description
    Date dueDate

    static constraints = {
    }
}
but this would create a table named "list_project". If I didn't do that and just created the SQL table outside of Grails, and given that this follow-up question says you can disconnect the domain class from your database, what purpose do domain classes serve? I'm looking to do SQL queries to insert, update, delete, etc., so I'm wondering what's the best way to do this.
Domain classes are used to model your domain of knowledge within your application. This is not only the structure of the data but also the basis of interaction of those models within your domain of knowledge.
That said, there is no reason why you can't create a Grails project without any domain classes and use your own SQL statements to create, read, update, and delete data in your database. I have worked on projects where there were no domain classes and everything was modeled using DTOs (data transfer objects) and services for accessing an already existing database and tables.
Of course by not using Domain classes you lose the integration with GORM, but that doesn't seem like an issue for your case (nor was it in the case I outlined above).
That's part of the beauty of Grails. You don't have to use all of it, you can use only the parts that make sense for your project.
In one of my projects I needed to dump the contents of a MySQL database into a Lucene index. Creating the whole domain class structure for such a one-off operation would have been overkill, so the Groovy SQL API did just fine.
So, my answer is no, you DON'T have to use the domain classes if you don't want to.
I agree with what @joshua-moore has said; plus, domain classes can drastically simplify your project if you use them properly.
I agree with both answers, but for your particular case I would suggest having a domain model for the underlying table.
Reasons:
You mentioned all the CRUD operations in your requirement. With a domain class it is convenient to let GORM handle the boilerplate code for any of the CRUD operations.
When using raw SQL, you have to handle transactions manually for update operations, if transactions are a requirement. With GORM and Hibernate, that is handled automatically.
Code will be DRY. You do not have to create a SQL instance every time you need an operation done.
You can conveniently create domain classes for existing tables using db-reverse-engineer plugin
You get a level of abstraction with domain classes. In the future, if there is a plan to replace the MySQL db with Oracle or a NoSQL db, then all that's needed is a change of driver (in most cases; with MongoDB there will be a bit of churn involved, but far less than replacing SQL queries).
Auditing can be easily achieved if domain classes are used.
These features (add/update/delete) can easily be exposed as a service, if required.
Data validation is easier in domain classes.
Better support for avoiding SQL injection as compared to plain vanilla queries.
I learned Play by following the tutorial on their website for building a little blogging engine.
It uses JPA, and in its bootstrap job it calls Fixtures.deleteModels() (or something like that).
It basically nukes all the tables every time it runs, and I lose all the data.
I've deployed a production system like this (sans the nuke statement).
Now I need to deploy a large update to the production system. Lots of classes have changed, been added, or been removed. In my local testing, without nuking the tables on every run, I ran into sync issues. When I would try to write to or read from the tables, Play would throw errors. I opened up MySQL, and sure enough the tables had only been partially modified, and modified incorrectly in some cases. Even with the DDL mode set to "create" in my config, JPA can't seem to "figure out" how to reconcile the changes and modify my schema accordingly.
So I have to put back the bootstrap statement that nukes all my tables.
So I started looking into database evolutions within Play and read an article on the play framework website about database evolutions. The article talked about version scripts, but it said, "If you work with JPA, Hibernate can handle database evolutions for you automatically. Evolutions are useful if you don’t use JPA".
So if JPA is supposed to be taking care of this for me, how do I deploy large updates to a large Play app? So far JPA has not been able to make the schema changes correctly, and the app throws errors. I'm not interested in losing all my data, so the dev fix, Fixtures.deleteModels(), can't really be used in prod.
Thanks in advance,
Josh
No, JPA should not take care of it for you. It's not a magic tool. If you decide to rename the table "client" to "customer", the column "street" to "line1", and to switch the values of the customer type column from 1, 2, 3 to "bronze", "silver", "gold", there is no way for JPA to read your mind and figure out all the changes to make automagically.
To migrate from one schema to another, you use the same tools as if you weren't using JPA: SQL scripts, or more advanced schema and data migration tools, or even custom migration JDBC code.
Have a look at Flyway. You can trigger database migrations from your code or from Maven.
There is a Hibernate property, hibernate.hbm2ddl.auto=update, which will update your schema.
I would STRONGLY suggest not using this setting in production, as it introduces a whole new level of problems if something goes wrong.
It's perfectly fine for development though.
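For reference, in Play 1.x this Hibernate setting is exposed through the jpa.ddl key in conf/application.conf (the %prod prefix is Play's per-environment override syntax; treat this fragment as a sketch):

```properties
# conf/application.conf
# "update" maps to hibernate.hbm2ddl.auto=update -- fine for development
jpa.ddl=update
# In production, don't let Hibernate touch the schema; migrate explicitly
%prod.jpa.ddl=none
```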
When a JPA container starts (say, EclipseLink or any other), it expects to find a database which matches the @Entity classes you've defined in your code. If the database has been migrated already, everything will work smoothly; otherwise, it will probably fail.
So, long story short, you need to perform database migrations (or evolutions, if you prefer) before the JPA container starts. Apparently, Play performs migrations for you before it kicks off the database manager you configured. So, in theory, regardless of the ORM you are using, Play decides when it's time for the ORM to start its work. Conceptually, it should work.
For a good presentation about this subject, please have a look at the second video at: http://flywaydb.org/documentation/videos.html
I was watching some videos and tutorials for EF 4.1, and I do not see any benefit to Code First (except perhaps when the DB is very small, 3-4 tables, and I am too lazy to create the DB first).
Mostly, the best approach so far is to create the database in some sort of database editor, which is surely faster than editing in the entity model, and EF picks up every relationship and creates the associations correctly. I know there are challenges with naming conventions etc., but I find it very confusing to manage Code First, as everything looks like code and there is too much to code anyway.
What is it that Code First can do that DB first cannot?
Code First cannot do anything that DB first cannot. At the end of the day they both use the Entity Framework.
The main advantages of using Code First are:
Development speed - you do not have to worry about creating a DB, you just start coding. This is good for developers coming from a programming background without much DBA experience. It also supports automatic database updates, so whenever your model changes, the DB is updated automatically as well.
POCOs - the code is a lot cleaner; you do not end up with loads of auto-generated code. You have full control over each of your classes.
Simplicity - you do not have an edmx model to update or maintain.
For more info see Code-first vs Model/Database-first
and here Code-First or Database-First, how to choose?
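As a rough illustration of the code-first style the answer above describes (the Blog/Post classes are invented for this example, not taken from the question):

```csharp
using System.Collections.Generic;
using System.Data.Entity;

// Plain POCOs: no generated code, no .edmx file to maintain.
public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
    public virtual ICollection<Post> Posts { get; set; }
}

public class Post
{
    public int Id { get; set; }
    public string Body { get; set; }
    public virtual Blog Blog { get; set; }
}

// EF infers the tables, keys, and one-to-many relationship by convention.
public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }
}
```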
Coming from a data-centric approach, I will always find it strange that people like to create in a code-first way. When I design my database, I am already thinking about what each of the tables is, as if they were classes anyway: how they link together and how the data will flow. I can imagine the whole system through the database.
I have always been taught that you work from the ground up: get your foundations right and everything else will follow. I create lots and lots of different systems for lots of different companies, and the speed at which I do it rests on the fact that once I have a strong database model, I then run my custom code generator that creates the views/stored procedures as well as my controller/business layer/data layer for me. Put all of these together, and all I have to do is create the front end.
If I had to create the whole system code-first to generate the database, as well as all of the other items, then I would imagine it taking a lot longer. I am not saying that I am right by any means, and I am sure there are probably faster and more experienced ways of developing systems, but so far I haven't found one.
Thanks for letting me speak and I hope my views have helped a little.
Migrations were enabled in Entity Framework 4.3 for Code First, so you can easily push changes from the model to the database seamlessly. Reference 1
detailed video: Complete Reference Video
Well, it depends on your project. I'll try to synthesize some ideas:
You have total control over the entity classes. They are no longer generated; you don't have to update T4 templates or use partial classes…
The EDMX model will disappear in EF7 in favor of the code-first model. Keep that in mind if you plan to migrate to EF7, or have projects starting in the near future that could use it.
It is easier to merge when multiple devs are working on the model.
+/- Annotations and mapping have to be done manually. I would say the code-first approach seems lighter (less bloat), and we can keep things simple (a visual model could hide undesired complexity). It is open to the Fluent API.
You can still visualize the model via Power Tools, but the model is read-only. Any change to the model has to be done manually (even though the initial entities can be generated from scratch). You don't have partial models (diagrams), but your models should be small enough.
It seems database first is better integrated with stored procedures and function results (some improvements have been made in EF6).
First a bit about the environment:
We use a program called Clearview to manage service relationships with our customers, including call center and field service work. In order to better support clients and our field technicians, we also developed a website providing access to the service records in Clearview, plus reporting. Over time, our need to customize behavior and add new features led to more and more things being tied to this website and its database.
At this point we're dealing with things like a Company being defined partly in the Clearview database and partly in the website database. For good measure, we're also starting to tie the scripting for our phone system into the same website, which will require talking to the phone system's own database as well.
All of this is set up and working... BUT we don't have a good data layer to work with it all. We moved to LINQ to SQL and now have two DBMLs that we can use, along with some custom classes I wrote before I'd ever heard of LINQ, along with some of the old-style ADO datasets. So yeah, basically things are a mess.
What I want is a data layer that provides a single front end for our applications, and on the back end manages everything into the correct database.
I had heard something about Entity Framework allowing classes to be built from multiple sources, but it turns out there can only be one database. So the question is, how could I proceed with this?
I'm currently thinking of getting the LINQ to SQL classes set up for each database, then manually writing LINQ-compatible front ends that tie them together. That seems like a lot of work, and given LINQ to SQL's limitations (such as not being able to refresh), I'm not sure it's a good idea.
Could I do something with Entity Framework that would turn out better? Should I look into another tool? Am I crazy?
The Entity Framework does give a certain measure of database independence, insofar as you can build an entity model from one database and then connect it to a different database by using a different entity connection string. However, as you say, it's still just one database, and, moreover, it's limited to databases which support the Entity Framework. Many do, but not all of them. You could use multiple entity models within a single application in order to combine multiple databases using the Entity Framework. There is some information on this on the ADO.NET team blog. However, the Entity Framework support for doing this is, at best, at an early stage.
My approach to this problem is to abstract my use of the Entity Framework behind the Repository pattern. The most immediate benefit of this, for me, is to make unit testing very simple; instead of trying to mock my Entity model, I simply substitute a mock repository which returns IQueryables. But the same pattern is also really good for combining multiple data sources, or data sources for which there is no Entity Framework provider, such as a non-data-services-aware Web service.
So I'm not going to say, "Don't use the Entity Framework." I like it, and use it, myself. In view of recent news from Microsoft, I believe it is a better choice than LINQ to SQL. But it will not, by itself, solve the problem you describe. Use the Repository pattern.
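A minimal sketch of the Repository idea described above (the interface, entity, and context names are illustrative only, not from the original answer):

```csharp
using System.Linq;

// The application codes against this abstraction, not against EF directly.
public interface ICompanyRepository
{
    IQueryable<Company> Companies { get; }
}

// Production implementation backed by an Entity Framework context.
public class EfCompanyRepository : ICompanyRepository
{
    private readonly WebsiteEntities _context = new WebsiteEntities();

    public IQueryable<Company> Companies
    {
        get { return _context.Companies; }
    }
}

// Test double: returns an in-memory IQueryable, no database required.
public class FakeCompanyRepository : ICompanyRepository
{
    public IQueryable<Company> Companies
    {
        get { return new[] { new Company { Name = "Acme" } }.AsQueryable(); }
    }
}
```

The same pattern leaves room for a second implementation that reads from the phone system's database or a web service, behind the same interface.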
If you want to use tools like LINQ to SQL or EF and don't want to manage multiple DBMLs (or whatever the equivalent is called in EF or other tools), you could create views in your website database that reference back to the Clearview or phone system's DB.
This lets you decouple your website from their database structure. I believe LINQ to SQL and EF can both use a view as the source for an entity. If they can't, look at NHibernate.
This will also let you have composite entities pulled from the various data sources. There are some limitations on updating views in SQL Server; however, you can define your own INSTEAD OF trigger(s) on the view, which can then perform the actual insert/update/delete statements.
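A sketch of that view-plus-trigger approach in T-SQL (all database, table, and column names here are hypothetical):

```sql
-- A view in the website DB that joins local data with the Clearview DB.
CREATE VIEW dbo.CompanyComposite AS
SELECT w.CompanyId, w.WebNotes, c.CompanyName
FROM dbo.WebCompany AS w
JOIN Clearview.dbo.Company AS c ON c.CompanyId = w.CompanyId;
GO

-- SQL Server can't update a multi-table view directly, so an INSTEAD OF
-- trigger routes each part of the write to the correct underlying table.
CREATE TRIGGER dbo.CompanyComposite_Update
ON dbo.CompanyComposite
INSTEAD OF UPDATE
AS
BEGIN
    UPDATE w SET WebNotes = i.WebNotes
    FROM dbo.WebCompany AS w
    JOIN inserted AS i ON i.CompanyId = w.CompanyId;

    UPDATE c SET CompanyName = i.CompanyName
    FROM Clearview.dbo.Company AS c
    JOIN inserted AS i ON i.CompanyId = c.CompanyId;
END;
```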
L2S works with views perfectly in my project. You only need one small trick:
1. Add the secondary DB's table to the current DB as a view.
2. In the designer, add a primary key attribute to an id field on the view.
3. Only now, add an association to whatever other table you want in the original DB.
Now you should see the view available for navigation.
I've just started using Linq to SQL, and I'm wondering if anyone has any best practices they can share for managing dbml files.
How do you keep them up to date with the database?
Do you have a single dbml file for the entire database, or is it split into multiple logical units?
How does managing this file work in a team environment?
Any other tips and tricks welcome.
Have you looked at SqlMetal? It's officially supported, although not promoted much. You can use it to build DBMLs from the command line - we've used it as part of a DB's continuous integration updates (make sure you have really good code separation if you do this, though - partial classes are a saviour - as the DBML will get overwritten).
If I recall correctly, it doesn't have quite the same features as the model designer in Visual Studio (I think it handles pluralisation differently). There's a good post about it on Ben Hall's blog.
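A typical SqlMetal invocation for this kind of CI step might look like the following (server, database, and namespace names are placeholders; run from a Visual Studio command prompt):

```shell
:: Regenerate the DBML from the live database.
sqlmetal /server:localhost /database:Northwind /pluralize /dbml:Northwind.dbml

:: Generate the C# designer classes from the DBML.
sqlmetal /code:Northwind.designer.cs /namespace:MyApp.Data Northwind.dbml
```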
The fact that the L2S designer doesn't support syncing with the database structure is a huge limitation in my mind. However, there is an add-in available that provides some re-sync capability:
http://www.huagati.com/dbmltools/
Unfortunately, it's no longer free.
Since you asked for other tips and tricks for managing DBML...
When DBML files are refreshed from the database, there are certain schema settings which they don't pick up on, such as default column values, forcing you to manually change the setting. This can lead to lost hours every time you refresh the DBML without realizing or remembering where you need to make manual adjustments, and your code starts failing.
To guard against this, one trick is to write a unit test which uses reflection to check the LINQ metadata for those (manual) settings. If the test fails, it gives a descriptive error message, instructing the user to make the proper change to the column properties. It's not a perfect solution, and it might not be convenient if you have many manual settings, but it can help avoid some major pain for yourself and your team.
Here's an example of an NUnit test that checks that a column is set to auto-generate from the DB.
[Test]
public void TestMetaData()
{
    // Look up the LINQ to SQL ColumnAttribute on the property via reflection.
    Type type = typeof(MyObj);
    PropertyInfo prop = type.GetProperty("UpdatedOn");
    IEnumerable<ColumnAttribute> info =
        prop.GetCustomAttributes(typeof(ColumnAttribute), true).Cast<ColumnAttribute>();
    Assert.IsTrue(
        info.Any(x => x.IsDbGenerated),
        "The DBML file needs to have MyObj.UpdatedOn AutoGenerated == true set. This must be done manually if the DBML for this table gets refreshed from the database."
    );
}
PLINQO is a set of code generation templates for generating LINQ to SQL. It supports syncing with the database and splitting entities into multiple classes, along with many other features that make LINQ to SQL easier to use.
Check out the PLINQO site at http://www.plinqo.com as well as the intro videos.
Here is a link that provides good information about LINQ to SQL best practices:
http://www.a2zmenu.com/LINQ/LINQ%20to%20SQL%20Best%20Practice.aspx