I'm using Entity Framework 4.1 for a project and couldn't figure out where the DB file is stored inside my project.
I think I've read that Entity Framework code first will still store your data in a SQL Express database at a default location, but I couldn't find out where that is.
What I did is:
Created an Entity Framework DB project (Project A) in my solution; this project has an initializer to generate sample data for testing.
I also created a separate project (Project B) to hold my Entity Framework code-first data for testing by another application.
Then I created another WinForms project (Project C) in the same solution, which accesses the DbContext from Project A.
I would assume that the DB should be somewhere in Project C, and that the test projects in the solution shouldn't make a difference?
With a default installation of SQL Server Express and no connection string the database will be created in the DATA directory of the installation, for example something like: C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA (for 2008 R2 version). The name of the database is namespace.contextname, for example: MyNamespace.MyContext.mdf (and .ldf). Under this name you can also find them in SQL Server Management Studio.
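To illustrate, here is a minimal sketch of an EF 4.1 code-first context; the namespace, context, entity, and connection string names below are made up. With no connection string configured anywhere, the database ends up in that DATA directory under the fully qualified context name; passing a name or connection string to the base constructor is one way to point it somewhere else.

using System.Data.Entity;

namespace MyNamespace
{
    public class MyContext : DbContext
    {
        // No connection string anywhere -> SQL Express creates
        // MyNamespace.MyContext.mdf/.ldf in its DATA directory.
        public MyContext() { }

        // Alternative: point at a named connection string (or a full
        // connection string) from app.config / web.config instead.
        // public MyContext() : base("name=MyContextDb") { }

        public DbSet<Customer> Customers { get; set; }
    }

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}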
I know this is an ancient thread covering a different scenario, but google directed me here when I was trying to figure this out for a UWP app, so if anybody follows in my footsteps, here's my solution.
If you're using Entity Framework with SQLite in a Windows app, your database file will likely be created in the "working directory" of your app. In this case that's:
C:\Users\<Username>\AppData\Local\Packages\{Package-Name}\LocalState\
You can find your {Package-Name} by looking in the app manifest; it'll probably be a long string of numbers and letters by default.
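If you'd rather not hunt for that folder, here is a small sketch (EF Core with the SQLite provider; BloggingContext, Blog, and app.db are placeholder names) that pins the database file to that LocalState folder explicitly:

using System.IO;
using Microsoft.EntityFrameworkCore;
using Windows.Storage;

public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // ApplicationData.Current.LocalFolder is the LocalState folder described above,
        // so dbPath resolves to ...\Packages\{Package-Name}\LocalState\app.db
        string dbPath = Path.Combine(ApplicationData.Current.LocalFolder.Path, "app.db");
        optionsBuilder.UseSqlite($"Data Source={dbPath}");
    }
}

public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
}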
Been tasked with moving a code-first database from MSSQL to MySQL. After a few hours of kung fu, I was able to get the ASP.NET Core project to properly deploy all migrations to MySQL. Now I need to migrate the data inside the existing MSSQL tables. I saw posts mentioning MySQL Migration Toolkit, but that appears to be old. I also attempted to do it with MySQL Workbench and DBLoad's Data Loader but haven't had any luck.
Table structure is pretty simple, with incremental integer keys plus the usual crap from the ASP.NET Core Identity framework (GUIDs). I just need to keep that consistent during the migration. What is the best way to migrate the data now that the table structure is set up in MySQL? Any recommendations would be greatly appreciated!
Update: Some more details...
I attempted a direct migration of the database from MSSQL to MySQL using MySQL Workbench and DBLoad, but it failed on the ASP.NET Identity tables big time, plus other issues. The .NET Core API took a huge dump in multiple places, so that idea is out.
From that point, I migrated the API controller over to MySQL and then had to fix a myriad of issues related to MSSQL foreign keys being too long, among other things.
So at this point, the controller works on MySQL. I just need to dump all of the data into MySQL and keep the FKs consistent.
I have had a few thoughts, such as CSV export/import, and/or trying a few other things. Any recommendations?
You can use the Data Export wizard built into SQL Server Management Studio (Tasks -> Export Data),
or use an SSIS package to migrate the data.
I tried MySQL Workbench, DBLoad from DBLoad.com, etc. to migrate the data directly, and they all failed.
So I ended up finding a "solution"...
First off, I modified the ASP.NET Core project with:
// In Startup.ConfigureServices, swap the SQL Server provider for the MySQL provider:
//services.AddDbContext<ApplicationDbContext>(options =>
//    options.UseSqlServer(
//        Configuration.GetConnectionString("DefaultConnection")));
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseMySql(
        Configuration.GetConnectionString("DefaultConnection")));
Then I went to the Package Manager Console and ran: update-database
This created all of the tables in MySQL.
Then I opened Microsoft SQL Server Management Studio, right-clicked the database and chose Tasks > Generate Scripts, and saved everything to one file with the Advanced option Schema and Data selected.
Then I opened the DB script in Notepad++ and applied the following edits using Replace with Extended search mode enabled:
GO -> blank
[dbo]. -> blank
[ -> blank
] -> blank
)\r\n -> );\r\n
\r\n' -> '
DateTime2 -> DATETIME
After the edits were made in Notepad++, I removed all of the SET, ALTER and CREATE related stuff from the text file and then copied the INSERT lines into MySQL Workbench table by table, in an order that ensured the foreign-key parent tables were populated before any table that referenced them. Thank goodness there were no stored procedures to deal with! What a pain in the tucus!
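For anyone who would rather script that last step than paste INSERT statements by hand, here is a rough alternative sketch (not the approach described above, just an idea): a small console app that copies rows table by table in foreign-key order, inserting the identity and GUID key values as-is so everything stays linked. It assumes the Microsoft.Data.SqlClient and MySqlConnector packages; the connection strings and table list are placeholders.

using System;
using Microsoft.Data.SqlClient;
using MySqlConnector;

class CopyTables
{
    static void Main()
    {
        // Placeholder connection strings.
        const string mssql = "Server=.;Database=SourceDb;Integrated Security=true;TrustServerCertificate=true";
        const string mysql = "server=localhost;database=TargetDb;user=appuser;password=secret";

        // Parent tables first so foreign keys already exist when child rows are inserted.
        string[] tables = { "AspNetRoles", "AspNetUsers", "AspNetUserRoles", "Orders", "OrderItems" };

        using var source = new SqlConnection(mssql);
        using var target = new MySqlConnection(mysql);
        source.Open();
        target.Open();

        foreach (var table in tables)
        {
            using var readCmd = new SqlCommand($"SELECT * FROM [{table}]", source);
            using var reader = readCmd.ExecuteReader();

            while (reader.Read())
            {
                var columns = new string[reader.FieldCount];
                var names = new string[reader.FieldCount];
                using var insert = new MySqlCommand { Connection = target };

                for (int i = 0; i < reader.FieldCount; i++)
                {
                    columns[i] = reader.GetName(i);
                    names[i] = "@p" + i;

                    // Identity ints and Identity-framework GUIDs are copied verbatim,
                    // so the existing key relationships survive the move.
                    object value = reader.GetValue(i);
                    insert.Parameters.AddWithValue(names[i], value is Guid g ? g.ToString() : value);
                }

                insert.CommandText =
                    $"INSERT INTO `{table}` ({string.Join(",", columns)}) VALUES ({string.Join(",", names)})";
                insert.ExecuteNonQuery();
            }
        }

        Console.WriteLine("Done.");
    }
}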
Also, on a side note, the app is hosted on Azure. I spent a couple of hours fighting the API not connecting to the database. This was not apparent at first because the app was misleadingly throwing a 404 error to Postman. When I first attempted to wire up the API controllers to the MySQL DB, I entered the database connection string into Azure's App Service Configuration. It didn't work at all, even though running the app locally worked fine. I ended up finding another post on this here site that said to remove the database connection string from the App Service > Configuration window. Worked like a champ after that. All of the data, with its auto-incremented keys, linked up without issue.
I am very pleased with the results and hope I never have to go through this process again! It is always a nice feeling to know an app now runs on a completely open source infrastructure. Hope this helps someone. Good luck.
I have two projects in a Visual Studio 2010 solution:
1- A .dll class library called DATA, which has a .edmx Entity Framework model that was created using the wizard, based on the schema of a MySQL database.
2- A Windows Forms project, which is the user interface and just has the minimum needed to send and receive data to the .dll, which in turn does the same against the database. This project has a reference to the first one.
The thing is, if I set the connection string in the app.config file of the .dll, then when I run my app it tells me it can't find the connection. So I have to set the connection string in the FORMS project, which I don't like, since I'm trying to keep the two at least somewhat independent of each other.
The same happens with the MySQL references... I need to add them to the FORMS project to make the app work... but this is not what I had in mind when I decided to make two separate projects, each one with its own responsibilities.
What am I thinking/doing wrong?
I am somewhat new to SSIS.
I have to deliver a 'generic' SSIS package that the client will make multiple copies of, deploying and scheduling each copy for a different source database. I have a single SSIS configuration table in a separate common database, and I would like to use this single configuration table for all connections. However, the challenge is with the configuration filter. When the client makes a copy of my package, it will have the same configuration filter as all the others. I would like to give the client an option to change the configuration filter before deploying, because for this new copy the source database can be different. I do not find an option to control this.
Is there a way to change the configuration filter from outside the package (without editing the executable .dtsx file)? Or is there a better approach that I can follow? I would prefer not to use XML configuration files, the primary reason being that my packages are deployed to SQL Server.
Any help would be greatly appreciated.
-Shahul
Your preferred solution does not align well with the way that SSIS package configurations are typically used. See Jamie Thomson's answer to a similar question on the MSDN forums.
I have created a package with the same requirements for my company. It loads data from different sources into different destinations based on individual configurations for the instances. It is used as an internal ETL.
We have adapters that connect to different sources and pass data to a common staging table in XML format, and the ETL package loads this data into different tables depending on a number of settings.
In other words, multiple SSIS package instances can be executed with different configurations. You are on the right track: it can be achieved by using SQL Server to hold the configurations and an XML config file to hold the connection info for the database that holds those configurations. When an instance of the package executes, it loads the default values configured with the package, but it needs to update all variables to reflect the purpose of the new instance.
I have created a Windows app to configure these instances and their settings in the database, to make it really easy for the client or a consultant to configure them without actually opening the package.
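As a rough illustration of what such a configuration app does under the hood (only a sketch; it assumes the default [SSIS Configurations] table that the SQL Server configuration type creates, and the filter, package path, and connection strings below are made up):

using Microsoft.Data.SqlClient;

class UpdateSsisConfig
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=CommonConfigDb;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        // Point one package copy's source connection at a different database by
        // rewriting its row in the shared configuration table.
        using var command = new SqlCommand(
            @"UPDATE [SSIS Configurations]
              SET ConfiguredValue = @value
              WHERE ConfigurationFilter = @filter
                AND PackagePath = @path", connection);

        command.Parameters.AddWithValue("@filter", "CustomerA_Extract");
        command.Parameters.AddWithValue("@path",
            @"\Package.Connections[Source].Properties[ConnectionString]");
        command.Parameters.AddWithValue("@value",
            "Data Source=CustomerA-Server;Initial Catalog=SourceDb;Provider=SQLNCLI11.1;Integrated Security=SSPI;");

        command.ExecuteNonQuery();
    }
}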
I'm developing a web app that is the front end to a database. I was asked to handle all database reads and writes through stored procs; I have about 80 of them so far.
I wonder what's the best way to deploy all these procs to the production environment once I'm done with development.
EDIT:
I already have a bunch of .sql files. I was wondering if there is a way to create a kind of installer that runs all of these files at once.
You could use a tool like Red Gate SQL Compare, but you can also export the procedures using SQL Server Management Studio (Script As > Create) and then import them into the production database. However, the Red Gate tool will be able to inspect, merge, and do other more complicated tasks.
You can also use something like scriptdb if you need more flexibility.
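If you'd rather roll a tiny "installer" yourself, here is a rough sketch of a console runner that executes every .sql file in a folder against the target database, splitting each file on the GO separators that Management Studio emits (the connection string, folder path, and the Microsoft.Data.SqlClient package are assumptions):

using System;
using System.IO;
using System.Text.RegularExpressions;
using Microsoft.Data.SqlClient;

class DeployProcs
{
    static void Main()
    {
        const string connectionString =
            "Server=prod-sql;Database=MyAppDb;Integrated Security=true;TrustServerCertificate=true";
        const string scriptFolder = @"C:\deploy\procs";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        foreach (var file in Directory.GetFiles(scriptFolder, "*.sql"))
        {
            string script = File.ReadAllText(file);

            // ADO.NET doesn't understand the GO batch separator, so split each file into batches.
            var batches = Regex.Split(script, @"^\s*GO\s*$",
                                      RegexOptions.Multiline | RegexOptions.IgnoreCase);

            foreach (var batch in batches)
            {
                if (string.IsNullOrWhiteSpace(batch)) continue;
                using var command = new SqlCommand(batch, connection);
                command.ExecuteNonQuery();
            }

            Console.WriteLine($"Applied {Path.GetFileName(file)}");
        }
    }
}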
Visual Studio has a Database Project which can generate a schema (has nice support for diff'ing/cherry-picking between the project and a data source) that can be command-line deployed with VSDBCMD.
One thing I like is that it uses a consistent layout and generates many ".sql" files. VSDBCMD does a schema compare (of the project output) and generates a TSQL script file on-the-fly that is then run to apply the appropriate changes to the target. VSDBCMD (and/or the VS Database Project) can be run from a staging system so long as it can connect to the SQL Server instance.
"It works well enough here", but I can't vouch for it over other tools. It is more than sufficient to create/update stored procedures and comes "for free" with certain Visual Studio versions, if that is already a sunk-cost.
Happy coding
I used xSQL Executor. It lets you select all the SQL scripts that you want to run at once, and then you can create a command that you can execute from the command line. I created a batch file with this command and handed it, along with the stored procedure scripts and the xSQL Executor executable, to the DBAs.
I get this error no matter what version of SubSonic I use. When I query the database for data, it errors out, saying it cannot connect to the database.
However, it is able to generate the .cs classes (ActiveRecord, Context, etc.) when told to do so.
Any help is appreciated.
Thanks folks...
My guess is that you have your SubSonic-generated classes in a separate project from your main application (another project in the same solution), and your main application project references the project containing the SubSonic-generated classes.
If this is the case, your main application project must also contain the connection string in a config file, similar to what your other project has. You might also need to copy over some of the other SubSonic-related items from your other project's config file as well.