I have a table of IPA characters where the IPA character is being stored as Short Text in the field (no duplicates). I cannot store i AND ɪ in this field as MS Access thinks they are the same. Is there a way to be able to save these to the DB table in MS Access?
You have the answer here:
Are unique indices on Access text fields always case insensitive?
but please note that the newer answer by miroxlav is the correct one.
Addendum:
To convert to a string:
Text = CStr([BinaryField1])
However, this will be zero-filled, thus the length of Text will always be size_of_field / 2.
To obtain the net length, first strip the zero characters:
TrueText: Replace(CStr([BinaryField1]), Chr(0), "")
or, in code:
TrueText = Replace(CStr([BinaryField1]), vbNullChar, "")
Then the net length of TrueText can be obtained.
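The same NUL padding appears whatever language reads the field; as a minimal, language-neutral sketch (Python here, with a made-up value), strip the padding before taking the length:

```python
# Simulate a zero-filled fixed-width field: 5 characters of text, padded to 10.
raw = "hello" + "\x00" * 5

# Net length: strip the NUL characters first, as Replace(..., vbNullChar, "") does in VBA.
true_text = raw.replace("\x00", "")
print(len(raw), len(true_text))  # prints "10 5"
```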
Related
Sometimes, when text is copy-pasted from a third-party website into the textarea of my form-based application, the data doesn't get inserted into the database; instead, the insert throws the error below.
Incorrect string value: '\xE2\x80\xAF(fo...' for column 'my_column_name' at row 1 Error: INSERT INTO my_table_name
I tried the query below in MySQL Workbench to solve this issue.
ALTER TABLE my_database_name.my_table CONVERT TO CHARACTER SET utf8
But I am getting the below error from the database.
Error Code: 1118. Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs
Your column's data type accepts a maximum of 65,535 bytes; you need to change the column data type to TEXT or BLOB.
One more thing: when copying content from a website or a Word document, first paste it into a plain-text editor and check whether the expected content was copied.
You can also strip the offending bytes in code, e.g. in PHP: $content = preg_replace('/\xE2\x80\xAF/', '', $content); (no square brackets, so the three bytes are matched as one sequence rather than as a character class, which would mangle other multi-byte characters).
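For reference, the bytes E2 80 AF are the UTF-8 encoding of U+202F (NARROW NO-BREAK SPACE), so in any Unicode-aware language the same cleanup is a single replace; a Python sketch with a made-up input:

```python
def clean_pasted(text: str) -> str:
    # Replace U+202F (NARROW NO-BREAK SPACE) with a plain space;
    # use "" instead if you want to drop it entirely.
    return text.replace("\u202f", " ")

print(clean_pasted("4\u202f500 items"))  # prints "4 500 items"
```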
Don't use whitespace in names: hex E2 80 AF is UTF-8 for "NARROW NO-BREAK SPACE" (U+202F).
I worry that doing ALTER TABLE my_database_name.my_table CONVERT TO CHARACTER SET utf8 without first diagnosing the problem has only made things worse.
You were probably using latin1 before? Did you have any other non-English text in the database? They may (or may not) be messed up.
We may be able to fix the mess, but we need to know more details about what you originally had and what steps led to this.
Also, what language(s) do you expect your customers to be using?
When I create an append query in MS Access 2013 with parameters, and any of those parameters' type is set to LongText, the query fails with error code 3001, "Invalid Argument". Changing the type to ShortText, however, results in a working query. Both versions run fine when double-clicked in Access itself, but the first one fails when run via the following code:
Dim db As DAO.Database
Set db = CurrentDb
Dim qdf As QueryDef
Set qdf = db.QueryDefs("NeuerFachlicherInhalt")
qdf!Inhalt = inhalte("DefaultInhalt")
qdf!Formular = inhalte("Formular")
qdf.Execute
The table I insert the parameter to has a field type of LongText and therefore I would expect this to work - what is the root cause of the issue here? And how can I pass in a long text if I am unable to specify a LongText as parameter?
I think it might be connected to the string length limitations in Access. What exactly are those limitations? Googling redirects you to questions about concatenation and the maximum string length in VBA, but I cannot find a definitive answer to the length question(s):
how long can the text for ShortText be?
how long can the text for LongText be?
how long can the text for a vba String be?
My queries in the two cases look like this, first the failing LongText version, then the working ShortText version:
PARAMETERS Inhalt LongText, Formular Short;
INSERT INTO FachlicherInhalt ( Inhalt, Formular )
SELECT [Inhalt] AS Expr1, [Formular] AS Expr2;
PARAMETERS Inhalt Text ( 255 ), Formular Short;
INSERT INTO FachlicherInhalt ( Inhalt, Formular )
SELECT [Inhalt] AS Expr1, [Formular] AS Expr2;
ShortText (simply Text prior to Access 2013) can be up to 255 characters in length.
LongText (Memo prior to Access 2013) can be up to 1 GB in length, but most Access controls can only display about 64,000 characters. (A text box on a form will start behaving oddly when editing the text, even with far fewer than those 64,000 characters.)
See the Access 2013 Documentation for further details.
A VBA variable-length String can hold up to about 2^31 characters.
See the Visual Basic for Applications Language Reference for further details.
Now for your question regarding the LongText parameter in the QueryDef object: unfortunately, DAO does not support LongText as a parameter type for a query, even though it lets you create such a parameter in query design.
You have got the following options as a workaround:
Open a recordset and add/update the record there
Use an ADO-Command-Object for that query
Hardcode your function inhalte("DefaultInhalt") into the SQL of the query
Or concatenate your own SQL string including the values (total SQL length limited to 64,000 characters!)
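The recordset workaround amounts to binding the long value directly instead of declaring it as a typed query parameter. A minimal illustration of that idea, using Python's sqlite3 in place of DAO (the table mirrors FachlicherInhalt, but everything here is a stand-in):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE FachlicherInhalt (Inhalt TEXT, Formular INTEGER)")

# Bind the long text directly; no declared parameter type is involved,
# so no 255-character ShortText limit applies to the bound value.
long_text = "x" * 100_000
conn.execute("INSERT INTO FachlicherInhalt (Inhalt, Formular) VALUES (?, ?)",
             (long_text, 1))

(stored_len,) = conn.execute("SELECT LENGTH(Inhalt) FROM FachlicherInhalt").fetchone()
print(stored_len)  # prints 100000
```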
As long as I'm reading your question correctly, I'm almost certain you can't use a LongText/Memo field as a parameter, per the information found here: Any way to have long text (memo) parameters in DAO and MS Access?
I'm importing data from a progress database.
I am getting the following error:
Progress openedge wire protocol column in table has value exceeding its max length or precision
Is there a way to specify a specific length of the select column's data in the select statement?
For example:
SELECT SUBSTRING(EMAIL, 1, 15) FROM SQL92.PROGRESSTABLE
SUBSTRING does give me the substring of a valid field value, but still fails with the above error when the dataset hits the "dirty" row.
I don't have access to the Progress database, so I can't run the progress DBTool to fix data.
The same kind of question was asked here, but the solution was never posted.
Can I make an IDataReader ignore column length definitions?
The answer is here:
ODBC Error "Column x in table y has value exceeding its max length or precision"
Use this curly brace syntax to run a native RDBMS (Progress) function and fix the data before it hits ODBC:
SELECT { fn CONVERT(SUBSTRING( EMAIL,1,15) , SQL_VARCHAR) }
FROM SQL92.PROGRESSTABLE
I can't believe people choose to use a database that so easily allows corrupt data.
As some background, you will likely encounter two kinds of errors if using SSIS against Progress:
The data type of "output column "xyz" (n)" does not match the data type "System.Decimal" of the source column "xyz"
(Could be any data type)
I guess this means that the data type has been changed automatically behind the scenes by Progress. It differs from the one saved in SSIS, which of course it doesn't like.
The short-term solution is to open the package and refresh the metadata by double-clicking the source.
The other error is:
Column xxx in table yyy has value exceeding its max length or precision.
Which means, for example, there is data that is 371 characters long in the database but the data dictionary says it's 324 characters long.
The long-term solution to both of these is to wrap everything in a construct similar to the above: cast it before it gets to the ODBC driver to obtain a consistent data type. It will of course truncate, but that's probably better than failing.
The data length for SQL data retrieved through an ODBC connection is determined by the "Width" property of the Progress database fields. This can be accessed through the Progress Data Dictionary by selecting the table you want, then choosing "Options", "Adjust Field Width" in the menu bar. For CHAR fields, the "Width" property is normally defined as twice the "Format" width.
For example: a field FOO, type CHAR, format "x(20)", by default has an SQL Width of 40, twice the original format size.
Although you can store more than 20 characters in a CHAR field formatted as "x(20)" (Progress doesn't care how much data is stored in its fields, since the "Format" phrase is only for display purposes, within the data type's size limits, of course), SQL connections enforce the Width as a hard limit, much as Oracle does. In other words, just as you can't write N + n characters into an Oracle field defined with length N, you can't retrieve data through an ODBC connection from a Progress field with format "x(N)" if the data stored in it exceeds, typically, 2N characters.
If you don't have access to the Progress database, you may contact the person responsible for the data stored in it and ask them to resize the fields in the database schema to match the full amount of data stored. The programs that store data in the database may disagree with the database field sizes. Otherwise you won't be able to retrieve data from tables with fields that store more than "Width" characters.
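If you can sample the data some other way (an export, or a partial fetch that succeeds), a quick width audit shows which fields need resizing before you request the schema change. A sketch with hypothetical widths and rows:

```python
# Declared SQL widths, e.g. a CHAR field with format x(20) defaults to width 40.
widths = {"EMAIL": 40, "NAME": 60}

rows = [
    {"EMAIL": "short@example.com", "NAME": "Alice"},
    {"EMAIL": "a" * 45, "NAME": "Bob"},  # exceeds the declared width of 40
]

def oversized(rows, widths):
    """Yield (row_index, column, actual_length) for values past their SQL width."""
    for i, row in enumerate(rows):
        for col, val in row.items():
            if isinstance(val, str) and len(val) > widths.get(col, float("inf")):
                yield i, col, len(val)

print(list(oversized(rows, widths)))  # prints [(1, 'EMAIL', 45)]
```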
I'm using Access 2000 and I have a query like this:
SELECT function(field1) AS Results FROM mytable;
I need to export the results as a text file.
The problem is:
function(field1) returns a fairly long string (more than 255 characters) that cannot be stored entirely in the Results field created by this query.
When I export this query as a text file, I can't see the whole string (it's truncated).
Is it possible to cast function(field1) so it returns a Memo type field containing the string ?
Something like this:
SELECT (MEMO)function(field1) AS Results FROM mytable;
Do you know others solutions?
There is an official microsoft support page on this problem:
ACC2000: Exported Query Expression Truncated at 255 Characters
They recommend that you append the expression data to a table that has a Memo field and export it from there. It's kind of an ugly solution, but you cannot cast expressions to types in MS Access, so it might be the best option available.
I don't know how to do quite what you're hoping (which makes sense), but a possible alternative could be to create two or three fields (or separate queries) and extract a different portion of the text into each, then concatenate them after retrieval.
pseudo: concat((chars 1-255) & (chars 256-510) & (chars 511-etc...)); in Access SQL that's something like Mid(function(field1), 1, 255), Mid(function(field1), 256, 255), etc., one per column.
Edit: it's odd that a string longer than 255 characters is stored but the field is not a Memo; what's up there? Another alternative, if you have access to the db, is to change the field type (back up the db first!).
I have a sql table that gets populated via SQLBulkCopy from Excel. The copy down is done using the Microsoft ACE drivers.
I had a problem with one particular file - when it was loaded down to sql, some of the columns (which appear empty in excel) contained an odd value.
For example, running this sql:
SELECT
CONVERT(VARBINARY(10),MyCol),
LEN(MyCol)
FROM MyTab
would return
0x, 0
i.e. converting the value in the column to varbinary shows something, but taking the length of the varchar shows no length. I realise that the value shown is the stem of a hex value, but it's weird that it gets there, and hard to detect.
Obviously I can just clear out the cells in Excel, but I really need to detect this automatically, as end users will have the same issue. It causes issues further down the line when the data gets processed, and it's quite hard to trace the problem back from its eventual symptoms to this issue in the source.
Other than the above conversion to varbinary to output in SSMS I've not come up with a way of detecting these values, either in Excel or via a SQL script to remove them.
Any ideas?
This may help you:
-- Conversion from hex string to varbinary:
DECLARE @hexstring varchar(MAX);
SET @hexstring = 'abcedf012439';
SELECT CAST('' AS XML).value('xs:hexBinary( substring(sql:variable("@hexstring"), sql:column("t.pos")) )', 'varbinary(max)')
FROM (SELECT CASE SUBSTRING(@hexstring, 1, 2) WHEN '0x' THEN 3 ELSE 0 END) AS t(pos);
GO
-- Conversion from varbinary to hex string:
DECLARE @hexbin varbinary(MAX);
SET @hexbin = 0xabcedf012439;
SELECT '0x' + CAST('' AS XML).value('xs:hexBinary(sql:variable("@hexbin") )', 'varchar(max)');
GO
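For comparison, outside of SQL Server the same two conversions are one-liners, e.g. in Python:

```python
# Hex string to bytes (strip an optional "0x" prefix first):
hexstring = "abcedf012439"
clean = hexstring[2:] if hexstring.startswith("0x") else hexstring
data = bytes.fromhex(clean)

# Bytes back to a "0x"-prefixed hex string:
print("0x" + data.hex())  # prints "0xabcedf012439"
```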
One method is to add a new column, convert the data, drop the old column, and rename the new column to the old name.
As Martin points out above, 0x is what you get when you convert an empty string, e.g.:
SELECT CONVERT(VARBINARY(10),'')
So the problem of detecting it obviously goes away.
I have to assume that there is some rubbish in the Excel cell that is being filtered out during the write-down by either the ACE driver or SQLBulkCopy. Because there was something in the field originally, the value written is empty instead of null.
To keep the data consistent, we'll need a post-process step to switch all empty values to nulls so that the next lot of scripts works.
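That post-process is a single UPDATE; sketched here with Python's sqlite3 and a hypothetical table so it can be run end to end:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTab (MyCol TEXT)")
conn.executemany("INSERT INTO MyTab VALUES (?)", [("abc",), ("",), (None,)])

# Switch the empty strings the bulk copy wrote to NULL for consistency.
conn.execute("UPDATE MyTab SET MyCol = NULL WHERE MyCol = ''")

print(conn.execute("SELECT COUNT(*) FROM MyTab WHERE MyCol IS NULL").fetchone()[0])
# prints 2
```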