I have been trying to insert a huge text-editor string into my database. The application I'm developing lets my client create and edit their website terms and conditions from the admin area of their website, so as you can imagine they get incredibly long. I have reached 18,000+ characters and I have now received an error when trying to add another block of text.
The error I am receiving is this:
ADODB.Command error '800a0d5d'
Application uses a value of the wrong type for the current operation
Which points to this part of my application, specifically the Set newParameter line:
Const adVarChar = 200
Const adParamInput = 1
Set newParameter = cmdConn.CreateParameter("@policyBody", adVarChar, adParamInput, Len(policyBody), policyBody)
cmdConn.Parameters.Append newParameter
Now this policy I am creating, currently at 18,000+ characters, is only half complete, if that; it could jump to 50,000-60,000. I tried the adLongVarChar = 201 ADO type, but this still didn't fix it.
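For reference, the long-text variant I tried looked roughly like this (a sketch; as I understand it, adLongVarChar still needs a non-zero Size argument and a matching long column type such as text or varchar(max) on the table side):

Const adLongVarChar = 201
Const adParamInput = 1
' adLongVarChar (201) is the ADO type for long text; Size must be greater
' than zero for variable-length types, so the actual string length is passed.
Set newParameter = cmdConn.CreateParameter("@policyBody", adLongVarChar, adParamInput, Len(policyBody), policyBody)
cmdConn.Parameters.Append newParameter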
Am I doing the right thing for such a large entry? If I am doing the right thing, how can I fix this issue? ...or if I'm doing the wrong thing, what is the right one?
Try to avoid putting documents in your database if you can. Sometimes it's a reasonable compromise for things like serialised objects and markup snippets.
If you don't need to query the document with SQL, the only benefit is having everything in one place: back up your db and you back up your documents as well, and you can use your db connectivity exclusively.
That said, nothing is free; carting all that content around in your database costs you.
If you can:
Have a documents table with a user-facing name for the file, an internal name (so the file name is unique in your documents directory), and a path description if there could be more than one directory.
Then just upload and download the selected document as a file on a get or set of the related database entity.
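A rough sketch of that pattern in the asker's classic ASP context (the documents directory, the naming scheme, and the documents table/columns are all hypothetical):

' Write the document to the documents directory under a unique internal name.
Dim fso, docFile, internalName
internalName = "terms_" & Session.SessionID & "_" & Timer & ".txt"  ' hypothetical uniqueness scheme
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set docFile = fso.CreateTextFile(Server.MapPath("/documents/" & internalName), True)
docFile.Write policyBody
docFile.Close
' Then store only metadata: the user-facing name, the internal name, and the path.
cmdConn.CommandText = "INSERT INTO documents (display_name, internal_name, doc_path) VALUES (?, ?, ?)"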
You'll need to deal with deployment issues (the documents directory must exist, and the account you run the mysql daemon as must be able to see it), but most of the time the issues you have keeping documents separate from the db are much easier to deal with than the head-scratchers you are running into now.
Disclaimer: new to SSIS and Active Directory
I have a need to extract all users within a particular Active Directory (AD) domain and import them into Excel. I have followed this: https://www.itnota.com/query-ldap-in-visual-studio-ssis/ in order to create my SSIS package. My SQL is:
LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=a*));Name,sAMAccountName
As you know, there is a 1,000-row limit when pulling from AD. In my query I currently have (name=a*) to test the process, and it works. I need to know how to set up a loop with variables to pull all records and import them into Excel (or whatever you experts recommend). Also, how do I know what other field names are available to pull?
Thanks in advance.
How do I see what's in Active Directory
Tool recommendations are off topic for the site, but a tool that you can download, no install required, is AD Explorer. It's an MS tool that allows you to view your domain. I highly recommend that people who need to see what's in AD use something like this, as it shows you your basic structure.
What's my domain controller?
Start -> Command Prompt
Type set | find /i "userdnsdomain" and look for USERDNSDOMAIN; put that value in the connect dialog. I save it because I don't want to enter it every time.
Search/Find and then look yourself up. Here I'm going to find my account by using my sAMAccountName.
The search results show only one user, but there could have been multiple matches since I did a contains-style search.
Double-clicking the value in the bottom results section causes the lower pane to update with the details of the search result.
This is nice because while the right side shows all the properties associated to my account, it's also updated the left pane to navigate to the CN. In my case it's CN=Users but again, it could be something else in your specific environment.
You might discover an interesting categorization for your particular domain. At a very large client, I discovered that my target users were all under a particular CN (Common Name), so I could use that in my AD query.
There are things you'll see here that you'd sure like to bring into a data flow but won't be able to. memberOf, for example, is a complex type with no equivalent among the data flow data types. I think Integer8 was also something that didn't work.
Loop the loop
The "trick" here is that we'll need to take advantage of the
The name of the AD provider has changed since I last looked at this. In VS 2017, I see the OLE DB Provider name as "OLE DB Provider for Microsoft Directory Service"
Put in your query and you should get results back. Let that happen so the metadata is set.
An ADO.NET source does not support parameterization the way the OLE DB source does. However, you can apply an Expression on the Data Flow task, which surfaces the component's properties, and that's what we'll do.
Click out of the Data Flow and back into the Control Flow, right-click the Data Flow Task, and select Properties. In that Properties window, find Expressions and click the ellipses (...). Up pops the Property Expressions Editor.
Find the ADO.NET source's query under Property and, in the Expression column, click the ellipses.
Here, we'll use your same source query just to prove we're doing the right things:
"LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=" + "a" + "*));Name,sAMAccountName"
We're doing string building here so the problem we're left to solve is how we can substitute something for the "a" in the above query.
The laziest route would be to
Create an SSIS variable of type String called CurrentLetter and initialize it to "a"
Update the expression we just created to be "LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=" + @[User::CurrentLetter] + "*));Name,sAMAccountName"
Add a Foreach Loop Container (FELC) to your Control Flow.
Configure the FELC with an enumerator of "Foreach Item Enumerator"
Click the Columns...
Click Add (this results in Column 0 with data type String), then click OK
Fill the collection with each letter of the alphabet
In the Variable Mappings tab, assign Variable User::CurrentLetter to Index 0
Click OK
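With CurrentLetter = "b", for example, the expression evaluates to
LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=b*));Name,sAMAccountName
so each pass of the loop pulls one letter's slice of users, keeping each query under the 1,000-row limit (assuming no single starting letter covers more than 1,000 users).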
Old blog posts on the matter because I like clicks
https://billfellows.blogspot.com/2011/04/active-directory-ssis-data-source.html
http://billfellows.blogspot.com/2013/11/biml-active-directory-ssis-data-source.html
Sorry to darken your day with my troubles, but SSIS has broken me! I am new to SSIS and I just seem to be misunderstanding it.
For background: I have a few versions of a basic package that includes a Foreach Loop container and a Data Flow with a few Derived Columns that imports CSV files into a SQL Server Staging table. It is very straightforward and does include an Execute SQL task and a File Move but those work fine. The issues are with the Foreach loop and the Data Flow.
I have one version of this package (let’s call it “A”) that seemed to be working fine. It would process multiple files in a folder, insert records into the staging table, properly execute the SQL Statements, and move the files to Archive. Everything seemed fine until I carefully QA’d the process. Turns out it was duplicating the data from one file, and never importing the data from a second Source File! Yet, the second/dupe round of data included the Source Filename (via a derived column) of the second file (but the data from the first). So it looked like I had successfully processed BOTH files until I looked at the actual data and saw that none of the values from the second source file were ever written to the Staging table.
Once I discovered this, I figured that the problem was in the Foreach Loop and how I set up the different file path & name variables. So, I decided to try to make a new version of the package. I started by copying package A and created package B. In B, I deleted the Source Connection Manager and created a new Connection Manager along with all new file & path variables. I then tried to clean up/fix/replace various elements in my Data Flow and Foreach Loop. In the process, I discovered that the Advanced Mappings from A – which DID work – were virtually all set up as String (even the Currency and Date columns). That did not seem right, so I changed each source money column to data type Currency, and each date-related column to data type Date.
What followed has been dozens and dozens of Errors and I cannot get Package B to run. I have even changed all of the B data types back to String (mirroring the setup in Package A which DID work). But, still no joy.
This leads me to ask a few questions to those of you smarter than I:
1) Why can’t SSIS interpret Source CSV data using the proper data type? I.e. why do I need to set every Input column as a STRING when some columns are clearly & completely Numeric, Currency or Dates? (Yes, the Source CSV files are VERY clean – most don’t even have NULLS)
a. When I do change the Advanced mapping for a date-related Source column to Date, I get the ever-present error message: [Flat File Source [30]] Error: Data conversion failed. The data conversion for column "Settle Date" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
2) When I reset the data types back to String in package B, I still get errors – usually Truncation errors (and Yes – I have adjusted the length to 250 in one of these columns).
a. Error Message: "The value could not be converted because of a potential loss of data.".
b. When I reset the Mappings to ignore the column (as a test), it throws a similar error at the next column.
3) Any ideas why Package A would dupe a file’s data and not process the second file, yet throw no errors and move both to Archive?
4) Why does the Data Viewer appear to have parsing errors (it shows data in the wrong columns) but when you use the Copy data feature in the data viewer and paste it into Excel, all of the data lines up perfectly?
5) Are there any tips & tricks that a rookie SSIS user needs to understand and which might not be apparent through the documentation and searching web articles as well as this site?
I can provide further details if they will help, but these packages are really very simple and should not be causing me this much frustration.
THANKS for any insights.
DGP
Wow, it seems like you have a lot of SSIS issues... I think the reason the same file keeps being extracted is the way your 'variable mappings' are defined.
Have you had a look and followed this guide:
https://www.simple-talk.com/sql/ssis/ssis-basics-introducing-the-foreach-loop-container/
Hope this helps.
Shaheen
Thanks Tab & Shaheen,
To all SSIS rookies - please learn from my mistakes!
It appears that my issue was actually in how I specified the TEXT QUALIFIER in the Connection Manager. I had entered "" (two double-quote characters) and that was causing problems with how my columns were being parsed. The parsing issues caused unexpected values to appear in some of the columns, and that was causing the errors in the package.
When I tried changing the Text Qualifier to only ONE double quote - " - the whole thing worked!
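To illustrate with a made-up row: with the Text Qualifier set to a single ", a line like
"Smith, John",250.00
parses as two columns (Smith, John and 250.00). With the qualifier set to the two-character string "", the quotes are not recognized as qualifiers, so the embedded comma splits the line into three columns - exactly the kind of shifted values I was seeing.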
As I mentioned - and as Shaheen suspected - my initial issues with the duplicate processing were probably due to how I set up the Foreach Loop. I had already fixed that, but was still getting errors until I fixed the Text Qualifier.
I have only tested it a few times but it looks like that was the issue.
Thanks for the contributions.
DGP
Problem.
I regularly receive feed files from different suppliers. Although the column names are consistent, the problem comes when some suppliers send text files with more or fewer columns in their feed file.
Furthermore, the arrangement of the columns in these files is inconsistent.
Other than the Dynamic Data Flow Task provided by CozyRoc, is there another way I could import these files? I am not a C# guru, but I am drawn towards using a "Script Task" in the control flow or a "Script Component" data flow task.
Any suggestions, samples or direction will be greatly appreciated.
http://www.cozyroc.com/ssis/data-flow-task
Some forums
http://www.sqlservercentral.com/Forums/Topic525799-148-1.aspx#bm526400
http://www.bidn.com/forums/microsoft-business-intelligence/integration-services/26/dynamic-data-flow
Off the top of my head, I have a 50% solution for you.
The problem
SSIS really cares about metadata, so variations in it tend to result in exceptions. DTS was far more forgiving in this sense. That strong need for consistent metadata makes the Flat File Source troublesome to use here.
Query based solution
If the problem is the component, let's not use it. What I like about this approach is that conceptually it's the same as querying a table: the order of columns does not matter, nor does the presence of extra columns.
Variables
I created 3 variables, all of type string: CurrentFileName, InputFolder and Query.
InputFolder is hard wired to the source folder. In my example, it's C:\ssisdata\Kipreal
CurrentFileName is the name of a file. During design time, it was input5columns.csv but that will change at run time.
Query is an expression (set EvaluateAsExpression = True on the variable): "SELECT col1, col2, col3, col4, col5 FROM " + @[User::CurrentFileName]
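At design time, with CurrentFileName = input5columns.csv, Query therefore evaluates to
SELECT col1, col2, col3, col4, col5 FROM input5columns.csv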
Connection manager
Set up a connection to the input files using the JET OLE DB provider. After creating it as described in the linked article, I renamed it to FileOLEDB and set an expression on the connection manager's ConnectionString property of "Data Source=" + @[User::InputFolder] + ";Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=\"text;HDR=Yes;FMT=CSVDelimited;\";"
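With the sample value of InputFolder, that expression evaluates to
Data Source=C:\ssisdata\Kipreal;Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties="text;HDR=Yes;FMT=CSVDelimited;";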
Control Flow
My Control Flow looks like a Data flow task nested in a Foreach file enumerator
Foreach File Enumerator
My Foreach File Enumerator is configured to operate on files. I put an expression on the Directory property: @[User::InputFolder]. Notice that at this point, if the value of that folder needs to change, it'll correctly be updated in both the Connection Manager and the file enumerator. In "Retrieve file name", instead of the default "Fully Qualified", choose "Name and Extension".
In the Variable Mappings tab, assign the value to our @[User::CurrentFileName] variable
At this point, each iteration of the loop will change the value of the @[User::Query] variable to reflect the current file name.
Data Flow
This is actually the easiest piece. Use an OLE DB source and wire it as indicated.
Use the FileOLEDB connection manager and change the Data Access mode to "SQL Command from variable." Use the @[User::Query] variable in there, click OK and you're ready to work.
Sample data
I created two sample files, input5columns.csv and input7columns.csv. All of the columns in the 5-column file are in the 7-column file, but the 7-column file has them in a different order (col2 is at zero-based ordinal position 2 and 6, respectively). I negated all the values in the 7-column file to make it readily apparent which file is being operated on.
col1,col3,col2,col5,col4
1,3,2,5,4
1111,3333,2222,5555,4444
11,33,22,55,44
111,333,222,555,444
and
col1,col3,col7,col5,col4,col6,col2
-1111,-3333,-7777,-5555,-4444,-6666,-2222
-111,-333,-777,-555,-444,-666,-222
-1,-3,-7,-5,-4,-6,-2
-11,-33,-77,-55,-44,-666,-222
Running the package results in these two screenshots
What's missing
I don't know of a way to tell the query based approach that it's OK if a column doesn't exist. If there's a unique key, I suppose you could define your query to have only the columns that must be there and then perform lookups against the file to try and obtain the columns that ought to be there and not fail the lookup if the column doesn't exist. Pretty kludgey though.
Our solution: we use parent-child packages. In the parent package we take the individual client files and transform them to our standard format files, then call the child package to process the standard import using the file we created. This only works if the client is consistent in what they send, though; if they try to change their format from what they agreed to send us, we return the file.
Visual Web Developer, Entity Data Source model. I have it creating the new database fine. Example:
creates SAMPLE1.MDF and SAMPLE1.LDF
When I run my app, it creates another SAMPLE1_LOG.LDF file.
When I run CreateDatabase, is there a place I can specify the _LOG.ldf name for the log file? SQL Server 2008 R2.
It messes up when I run the DeleteDatabase functions... 2 log files...
How come it does not create the file SAMPLE1_Log.ldf to start with, if that is what it is looking for...
Thank you for your time,
Frank
// database or initial catalog produce same results...
// strip the .mdf off of newfile and see what happens?
// nope. this did not do anything... still does not create the ldf file correctly!!!
// sample1.mdf, sample1.ldf... but when run, it creates sample1_log.LDF...
newfile = newfile.Substring(0, newfile.Length - 4);
String mfile = "Initial Catalog=" + newfile + ";data source=";
String connectionString = FT_EntityDataSource.ConnectionManager.GetConnectionString().Replace("data source=", mfile);
// String mexclude = @"attachdbfilename=" + "|" + "DataDirectory" + "|" + @"\" + newfile + ";";
// nope. must have attach to create the file in app_data, otherwise it goes to Documents and Settings, etc., for sqlexpress.
// connectionString = connectionString.Replace(mexclude, "");
Labeldebug2.Text = connectionString;
using (FTMAIN_DataEntities1 context = new FTMAIN_DataEntities1(connectionString))
{
// try
// {
if (context.DatabaseExists())
{
Buttoncreatedb.Enabled = false;
box.Checked = true;
boxcreatedate.Text = DateTime.Now.ToString();
Session["zusermdf"] = Session["zusermdfsave"];
return;
// Make sure the database instance is closed.
// context.DeleteDatabase();
// i have entire diff section for deletedatabase.. not here.
}
// View the database creation script.
// Labeldebug.Text = Labeldebug.Text + " script ==> " + context.CreateDatabaseScript().ToString().Trim();
// Console.WriteLine(context.CreateDatabaseScript());
// Create the new database instance based on the storage (SSDL) section
// of the .edmx file.
context.CreateDatabaseScript();
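// Note: the return value of CreateDatabaseScript() is discarded here, so this
// call has no effect; only the CreateDatabase() call below touches the server.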
context.CreateDatabase();
}
I took out all the try/catch blocks so I can see anything that might happen...
==========================================================================
Rough code while working out the kinks..
The connection string it creates:
metadata=res://*/FT_EDS1.csdl|res://*/FT_EDS1.ssdl|res://*/FT_EDS1.msl;provider=System.Data.SqlClient;provider connection string="Initial Catalog=data_bac100;data source=.\SQLEXPRESS;attachdbfilename=|DataDirectory|\data_bac100.mdf;integrated security=True;user instance=True;multipleactiveresultsets=True;App=EntityFramework"
in this example, the file to create is "data_bac100.mdf".
It creates the data_bac100.mdf and data_bac100.ldf
When I actually use this file and its tables at run time, it auto-creates data_bac100_log.LDF.
1) I was trying simply not to create the ldf, so that when the system runs, it creates the single log file off the bat...
2) The Initial Catalog and/or Database keywords are ONLY added to the connection string to run CreateDatabase(); the regular connection strings created in web.config only have the attachdbfilename portion, and those work fine.
I have one connection string for unlimited databases, with the main database in web.config. I use an initialize section based on the user roles (visitor, member, admin, anonymous, or not authenticated) which sets the database correctly, with an expression builder and a function to parse the connection string with the correct values for the database to operate on. This all runs fine.
The Entity Framework automatically generates the script. I have tried with and without the .mdf extension; it makes no difference... I thought maybe there was a setting somewhere that holds naming conventions for ldf files...
Eventually all of this will be moot when I start trying to deploy somewhere that doesn't use the App_Data folder anyway...
Here is an example of the connection string created when running the application:
metadata=res://*/FT_EDS1.csdl|res://*/FT_EDS1.ssdl|res://*/FT_EDS1.msl;provider=System.Data.SqlClient;provider connection string="data source=.\SQLEXPRESS;attachdbfilename=|DataDirectory|\TDSLLC_Data.mdf;integrated security=True;user instance=True;multipleactiveresultsets=True;App=EntityFramework"
in this case, it uses the TDSLLC_Data.mdf file...
04/01/2012... followup...
Entity Framework (from the .NET 4.5 compatibility documentation):
Feature: Log files created by the ObjectContext.CreateDatabase method.
Change: When the CreateDatabase method is called either directly or by using Code First with the SqlClient provider and an AttachDBFilename value in the connection string, it creates a log file named filename_log.ldf instead of filename.ldf (where filename is the name of the file specified by the AttachDBFilename value).
Impact: This change improves debugging by providing a log file named according to SQL Server specifications. It should have no unexpected side effects.
http://msdn.microsoft.com/en-us/library/hh367887(v=vs.110).aspx
I am on Windows XP with .NET 4 (not .NET 4.5)... I will hunt some more, but it looks like an issue that cannot be changed.
4/1/2012, 4:30...
OK, more hunting and searching into some of the inconsistencies I have experienced with CreateDatabase and DatabaseExists... so .NET 4.5 is supposed to create the _log.ldf file, not just the .ldf file, so they must have addressed this for some reason...
I found others with the same issues, but on a different server...
MySQL has a connector for EF4; the current version is 6.3.5 and its main functionality works fine, but it still has issues with a few methods, e.g.:
•System.Data.Objects.ObjectContext.CreateDatabase()
•System.Data.Objects.ObjectContext.DatabaseExists()
which makes it difficult to fully use the model-first approach. It's possible by manually editing the MySQL script (available with the CreateDatabaseScript method). The MySQL team doesn't seem eager to solve those bugs; I'm not sure what the commitment level actually is on their part, but it is certainly lower than it once was.
That being said, the same methods fail with SQL CE too (they are not implemented, and I don't see the MS team as likely to tackle that soon).
Ran out of space below... it just becomes a problem when you create a database and it creates only the .ldf file, not the _log.ldf file; then you use the database and it creates a _log.ldf file... now you have two ldf files, and one becomes invalid. Then, when you're done with the database, you delete it and try to create a new one, and since an ldf already exists, it will not work...
It turns out this is just the way it is with EF 4, and they changed it in the 4.5 beta to create the _log.ldf file up front, matching what is created when the database is used.
Thanks for your time.
I've never used this "mdf attachment" feature myself and I don't know much about it, but according to the xcopy deployment documentation, you should not create a log file yourself because it will be automatically created when you attach the mdf. The docs also mention naming and say that the new log filename ends in _log.ldf. In other words, this behaviour appears to be by design and you can't change it.
Perhaps a more important question is, why do you care what the log file is called? Does it actually cause any problems for your application? If so, you should give details of that problem and see if someone has a solution.
The 1,500-page Access 97 Bible (don't laugh!) that I've been given by my boss to solve his problem doesn't solve my problem of how to solve his problem, because it has no VBA code.
Let me first make clear that I've made attempts to solve this without (much) coding, and that I've coded quite a bit in VBA already, so I'm basically familiar with most things including recordsets, queries, etc etc but have problems with MS Access limits on how to form a report with data coming from VBA variables. I'm also versatile in most programming languages, but this is not a language problem but rather a "how to/what's possible" problem.
My problem right now is that dragging the query fields into the Detail section and placing them into cells in columns by setting Left and Top with VBA code moves them all right, but each cell lands on a new page. Unfortunately, there are multiple data items in each cell, which won't conform to the Create Report wizard options available.
So my question is simply this: Can someone point me to working examples of code that create, place, and fill with VBA variable strings, text fields at any coordinate I please on a paper size of my choice?
Edit: The above is not an option, as I understand this will prohibit the client from getting an .mde database. What remains, then, is to merely ask for some sound advice on how to get several rows GROUPed BY weekday and machine (see below) into a recordset or similar for each cell. I guess the best way is to count the number of columns in the table (machines in the sql result) and create 5 rows of these with dummy data, then go through the result rows and place the data in the relevant controls. But if you have ideas for doing this work better and faster, write them as answers.
Sorry for this, I knew there was something I wasn't understanding. Basically, I thought Access supported creating reports dynamically via VBA, i.e. "generating pages with data" rather than "preparing a flow of controls connected to data sources". But Access requires that you create an ample number of dummy, unlinked controls manually, then either fill or hide them; that's how they become "dynamic".
This is for Access 2003 on a remote server accessing local and remote ODBC SQL database tables, if relevant. The goal is to make a week schedule of n columns (n=number of machines at a certain plant) x 5 rows (weekday Mon-Fri), and put 1 or more recordset rows (=scheduled activities for that day on that machine) in each of the "n by 5 table" cells.
If you detect venting frustration in this post I can only ask your forgiveness and hope for your understanding.
So, there are many techniques for this:
Ex. 1) Using dynamic SQL:
'Create a function to build the SQL query and set it as the report's record source
Function MakeMySqlReport(Parameters As String)
    Dim strSql As String
    Dim strMyVar As String
    strMyVar = "MyField"   ' a field or expression to surface as MyFieldVar1
    strSql = "Select " & strMyVar & " as MyFieldVar1, * from myTable where Fieldx = " & Parameters
    MyReport.RecordSource = strSql   ' e.g. assign from the report's Open event
End Function
Ex. 2) Create a function that returns your strings:
Function MyString1() As String
    MyString1 = "ABC"
End Function
And in your report, select the textbox that will receive the value and set its Control Source to =MyString1()
I hope this helps. Need more examples?
Solution:
Create many objects manually (grr!)
name them systematically
put them in a Control Array (get all Me.Controls, sift out the ones you're interested in, and put them in an indexed array)
go through the array and change their properties, as in the sketch below
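A minimal sketch of that approach in report code (the txtCell_r_c naming scheme, the machine count, and the ScheduleText helper are hypothetical; the dummy text boxes must already exist on the report):

Private Sub Detail_Format(Cancel As Integer, FormatCount As Integer)
    ' Sift the systematically named dummy controls out of Me.Controls,
    ' then position and fill each one: 5 weekday rows x 4 machine columns here.
    Dim r As Integer, c As Integer
    Dim ctl As TextBox
    For r = 1 To 5
        For c = 1 To 4
            Set ctl = Me.Controls("txtCell_" & r & "_" & c)  ' hypothetical naming scheme
            ctl.Left = (c - 1) * 2880   ' positions are in twips (1440 twips = 1 inch)
            ctl.Top = (r - 1) * 1440
            ctl.Value = ScheduleText(r, c)  ' hypothetical function returning that cell's activities
        Next c
    Next r
End Sub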