I'm trying to add a text field to my EDMX file, which I have set to generate DDL for MySQL, but the only option I have is to add a string with the maximum length set to Max. This reports an error when executing the SQL statements against the database: a max length of 4000 is not supported.
I also tried it the other way around, updating the field in the database and then updating the EDMX file based on the database, but that sets the field back to a string field with the maximum length set to None.
Am I overlooking something? Has anyone used this field type?
Right now I have a kind of workaround to keep my database TEXT field mapped to the string property in the EDMX model: I generate the database script from the EDMX file, manually change the type of the TEXT columns from nvarchar(1000) to TEXT, execute it against the database, and then validate the mappings in the EDMX file.
Hopefully someone will come up with a better solution, because this is definitely not a cool workaround.
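For illustration, the manual edit amounts to something like this (the table and column names here are made up; only the column type changes):

-- As generated from the EDMX model:
CREATE TABLE Articles (Id INT NOT NULL PRIMARY KEY, Body nvarchar(1000) NOT NULL);
-- Hand-edited version that actually gets executed:
CREATE TABLE Articles (Id INT NOT NULL PRIMARY KEY, Body TEXT NOT NULL);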
Update
This bug is fixed in MySQL Connector/Net version 6.4.4.
I'm not familiar with Entity Framework; however, a TEXT field in MySQL does not have a length as part of its definition. Only CHAR/VARCHAR does.
There is a maximum length of data that can be stored in a TEXT column, which is 64 KB.
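A minimal example of the difference (made-up table, just to show the declarations):

CREATE TABLE example (
    short_text VARCHAR(255), -- length is part of the definition
    long_text TEXT           -- no length; holds up to 64 KB of data
);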
Related
I am using EF to update a field in my MySQL DB and ran across the issue of attempting to save data that is not allowable due to collation. For example, ^âÂêÊîÎôÔûÛŵŷ contains characters outside the column's latin1 character set.
Running an update/insert with the above example, I get the exception:
The database update did not take place due to..Incorrect string value
I know what the problem is, but I don't want to keep the characters. The data is usually provided via the UI, which would often control what is passed in; however, the system is also callable via an API that allows callers to send whatever data they like. In the above case, I would like to drop those characters or just replace them with a question mark, and basically ignore them.
This system already exists in an older language, and the rule to (silently...) ignore them already exists; I need the error not to be raised and for it to save what it can. I have seen how I can modify the statements for this, or how I can modify the string data coming in, but I have thousands of these. Is there another method to achieve this?
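One blunt option not mentioned above (it relaxes validation for every statement on the connection, so treat it as a sketch rather than a recommendation): with MySQL's strict mode turned off, the "Incorrect string value" error is downgraded to a warning and the server stores what it can, typically substituting the unconvertible characters. Something like:

SET SESSION sql_mode = ''; -- or remove only STRICT_TRANS_TABLES / STRICT_ALL_TABLES from the current mode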
I have a column in my MySQL database that stores images as a byte array.
I am trying to update a row to insert a new image. The new image is 163 KB, and when I convert it to a byte[], the number of elements in the array is 167092. When I run the stored procedure that does the update, I get the error "Data too long for column 'x' at row 1". I already have an existing image in the database that has 8844 byte[] elements when converted.
The column datatype is LONGBLOB. From my understanding, I should have approximately 4 GB to work with.
I have tried updating my my.ini file to set max_allowed_packet=16M, and I even tried 100M.
I am using the MySQL .NET Connector libraries to execute my stored procedures.
Does anyone have any ideas on how to fix this issue? I know I could store the image paths instead of storing the images directly in the database, but I would like to solve my current issue and keep storing the images in the database before changing my approach.
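For reference, the setting I changed in my.ini can also be checked and raised at runtime, roughly like this (the value below is just one of the sizes I tried; it only affects new connections):

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 104857600; -- 100M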
I've had exactly the same problem...
In my case I was passing the LONGBLOB via a TEXT parameter since I wanted to use CONCAT inside the stored procedure in order to create dynamic SQL.
The solution was simply to change TEXT into LONGTEXT. That's it :)
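Concretely, the only thing that changes is the parameter type; a rough before/after sketch (the procedure, table and column names here are made up):

-- Before: the TEXT parameter could not hold the whole LONGBLOB payload
CREATE PROCEDURE save_image(IN image_data TEXT)
    UPDATE images SET image = image_data WHERE id = 1;

-- After: same procedure, parameter widened to LONGTEXT
CREATE PROCEDURE save_image(IN image_data LONGTEXT)
    UPDATE images SET image = image_data WHERE id = 1;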
That really took some time to figure out...
Hope I could help even after almost three years.
I have an application that uses a MySQL database, but I would like to run the unit tests for the application against an HSQLDB in-memory database. The problem is that some of my persistable model objects have fields which I have annotated with columnDefinition = "TEXT" to force MySQL to cater for long string values, but HSQLDB doesn't know what TEXT means. If I change it to CLOB, then HSQLDB is fine but MySQL fails.
Is there a standard column definition that I can use for long strings that is compatible with both MySQL AND HSQLDB?
What worked for me was to just enable MySQL compatibility mode by changing the connection URL to jdbc:hsqldb:mem:testdb;sql.syntax_mys=true
You can use the same solution as offered in this post for PostgreSQL TEXT columns and HSQLDB with Hibernate:
Hibernate postgresql/hsqldb TEXT column incompatibility problem
As HSQLDB allows you to define TEXT as a TYPE or DOMAIN, this may be a solution if you find out how to execute a statement such as the one below before each test run with HSQLDB via Hibernate.
CREATE TYPE TEXT AS VARCHAR(1000000)
Update for HSQLDB 2.1 and later: these versions support a MySQL compatibility mode. In this mode, the MySQL TEXT type is supported and translated to LONGVARCHAR. LONGVARCHAR is by default a long VARCHAR, but a property (sql.longvar_is_lob) allows it to be interpreted as CLOB. See:
http://hsqldb.org/doc/2.0/guide/dbproperties-chapt.html#dpc_sql_conformance
http://hsqldb.org/doc/2.0/guide/compatibility-chapt.html#coc_compatibility_mysql
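If I read the properties documentation correctly, both switches can go straight into the connection URL, along the lines of (untested):

jdbc:hsqldb:mem:testdb;sql.syntax_mys=true;sql.longvar_is_lob=true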
Not really. MySQL has TEXT and BLOB, with various size prefixes to indicate their maximum size. HSQLDB only appears to have CLOB and various VARCHARs. Most likely you'd have to special-case your tests depending on which database you're talking to.
If your text strings are short enough, you could use VARCHARs, but those are limited to just under 64 KB in MySQL, and that's also the maximum size of a row, so the larger the VARCHAR, the less space there is for other fields.
You can also solve some issues at the JPA-vendor (Hibernate etc.) level.
With @Lob, for example, the long/large type is determined at runtime based on the vendor (LONGVARCHAR/LONGTEXT for MySQL vs. CLOB in H2).
I am developing a Ruby on Rails application that stores a lot of text in a LONGTEXT column. I noticed that when deployed to Heroku (which uses PostgreSQL) I am getting insert exceptions because the values are too large for two of the columns. Is there something special that must be done in order to get a large text column type in PostgreSQL?
These were defined as "string" datatype in the Rails migration.
If you want the longtext datatype in PostgreSQL as well, just create it. A domain will do:
CREATE DOMAIN longtext AS text;
CREATE TABLE foo(bar longtext);
In PostgreSQL the required type is text. See the Character Types section of the docs.
A new migration that updates the model's datatype to 'text' should do the job. Don't forget to restart the database. If you still have problems, take a look at your model with 'heroku console' and just enter the model name.
If the DB restart doesn't fix the problem, the only way I figured out was to reset the database with 'heroku pg:reset'. Not much fun if you already have important data in your database.
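Under the hood, such a migration boils down to something like this on the PostgreSQL side (table and column names are placeholders):

ALTER TABLE articles ALTER COLUMN body TYPE text;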
Hi, when saving to a model, my created and modified fields aren't automatically populated by CakePHP. They were automatically populated when I was using MySQL, but now they aren't. I wasn't using NOW() back when I was still using MySQL. Why is that? Also, when a field's value is not set, 'NULL' (with quotes) is inserted, causing errors because SQL Server says I can't insert a string into a field of type smallint/date etc. How do I fix this?
Thanks in advance!
I would set NULL as a keyword rather than quoting it, which I imagine is why your database thinks that it's a string.
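In other words, the generated statement should look like the first line below rather than the second (table and column names are made up):

INSERT INTO posts (amount, published_on) VALUES (NULL, NULL);     -- NULL keyword: fine for smallint/date columns
INSERT INTO posts (amount, published_on) VALUES ('NULL', 'NULL'); -- quoted string: rejected for non-text columns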
Have you double-checked the schema of the database to ensure that the created and modified fields are still DATETIME fields?
Also you say "SQL Server", and mention MySQL, so I assume that you are now using MSSQL?