Can you generate XML or JSON from SQL Server metadata?

I have a PowerShell script that uses SQL Server Management Objects (SMO) to create a .SQL file containing all the metadata from a SQL Server database. However, SMO cannot natively generate XML or JSON output. Is there a means to turn the .SQL output into either of these formats?

In SMO you can script individual objects to strings with the Scripter, then use .NET libraries to add those scripts to a custom class that you serialize to XML/JSON, or use the XML and JSON libraries to construct the document directly.
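As a minimal sketch of that first approach, here is some C# assuming the SMO assemblies (Microsoft.SqlServer.Smo) and System.Text.Json are available; the server name, database name, and the choice to script only tables are placeholder assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Text.Json;
using Microsoft.SqlServer.Management.Smo;

class MetadataToJson
{
    static void Main()
    {
        var server = new Server("localhost");        // placeholder server name
        var db = server.Databases["MyDatabase"];     // placeholder database name
        var scripter = new Scripter(server);

        // Script each table to T-SQL and collect the statements per object.
        var metadata = new Dictionary<string, List<string>>();
        foreach (Table table in db.Tables)
        {
            StringCollection statements = scripter.Script(new SqlSmoObject[] { table });
            var lines = new List<string>();
            foreach (string statement in statements)
                lines.Add(statement);
            metadata[table.Name] = lines;
        }

        // Serialize the collected scripts; XmlSerializer would work the same way for XML.
        string json = JsonSerializer.Serialize(
            metadata, new JsonSerializerOptions { WriteIndented = true });
        Console.WriteLine(json);
    }
}
```

Since the existing script is PowerShell, note that the same SMO types can be loaded from PowerShell directly, with ConvertTo-Json covering the serialization step.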

Related

How can I read CSV data from a file and convert it into JSON using the CSV-Connector in WSO2 Integration Studio?

I am trying to read CSV data from a file and convert it into JSON using the CSV-Connector in WSO2 Integration Studio. The data is converted to JSON when I pass it as the payload, but how do I do it when I want to read the CSV data from a file using the connector? What extra connectors or mediators are required?
First, you need to read the file using the File Connector or the VFS transport (either a VFS listener proxy or a File Inbound Endpoint). There is a File Inbound sample on the getting-started page in Integration Studio.
After reading the file you can use the CSV-Connector, the Data Mapper mediator, or the Smooks mediator to convert the CSV to XML and then to JSON.
You can use the WSO2 File Connector (https://docs.wso2.com/display/ESBCONNECTORS/Working+with+the+File+Connector and https://docs.wso2.com/display/ESBCONNECTORS/File+Connector), or you can write a new custom mediator (https://docs.wso2.com/display/EI660/Class+Mediator) with your logic.

How to parse JSON into a SQL Table using SSIS

I've got a test SSIS package that reads this API (https://api.coindesk.com/v1/bpi/currentprice.json) and exports it to a table in SQL Server.
What is the best way to parse this data so that it is split into multiple columns correctly?
Disclaimer: I work for ZappySys (the company that makes API connectors/drivers for SSIS and ODBC).
Loading data from a JSON file or REST API into SQL Server can be done a few ways. For example, I literally took the URL you supplied, put it in the JSON Source, and had it working in two minutes.
Method-1: Use 3rd party JSON Source Component (e.g. ZappySys)
Here is how to do it using the SSIS JSON Source by ZappySys (3rd party).
Method-2: Use C# code in Script Component
If you would like to use a FREE approach, you can write C# code like the sketch below.
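The original answer's code is not preserved here, so the following is a standalone sketch of the parsing logic only, assuming System.Text.Json and the field names this API returned at the time (time.updatedISO, bpi, code, rate_float); inside a real Script Component you would assign the parsed values to output buffer columns rather than printing them.

```csharp
using System;
using System.Net.Http;
using System.Text.Json;

class CoindeskParser
{
    static void Main()
    {
        // Fetch the raw JSON payload from the API.
        using var http = new HttpClient();
        string json = http.GetStringAsync(
            "https://api.coindesk.com/v1/bpi/currentprice.json").Result;

        using JsonDocument doc = JsonDocument.Parse(json);
        JsonElement root = doc.RootElement;
        string updated = root.GetProperty("time").GetProperty("updatedISO").GetString();

        // The "bpi" object holds one child object per currency (USD, GBP, EUR),
        // so each child becomes one row with multiple columns.
        foreach (JsonProperty currency in root.GetProperty("bpi").EnumerateObject())
        {
            string code = currency.Value.GetProperty("code").GetString();
            decimal rate = currency.Value.GetProperty("rate_float").GetDecimal();
            Console.WriteLine($"{updated}  {code}  {rate}");
        }
    }
}
```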

Azure Data Factory v2 Data Transformation

I am new to Azure Data Factory. I have a requirement to move data from an on-premises Oracle database and an on-premises SQL Server to Blob storage. The data needs to be transformed into JSON format, with each row as one JSON file, and will then be moved to an Event Hub. How can I achieve this? Any suggestions?
You could use a Lookup activity plus a ForEach activity, with a Copy activity inside the ForEach. Please reference this post: How to copy CosmosDb docs to Blob storage (each doc in single json file) with Azure Data Factory.
The Copy Data tool that is part of Azure Data Factory is an option for copying on-premises data to Azure.
The Copy Data tool comes with a configuration wizard that walks you through the required steps, such as configuring the source, the sink, and the integration pipeline.
In the source you need to write a custom query to fetch the data from the required tables in JSON format.
In the case of SQL Server, you would use FOR JSON AUTO to convert the rows to JSON (OPENJSON goes the other way, turning JSON into rows); both are supported from SQL Server 2016 onward. For older versions you need to explore the available options. Worst case, you can write a simple console app in C#/Java to fetch the rows and convert them to JSON files, as sketched below, and then upload the files to Azure Blob Storage. If this is a one-time activity, this option should work and you may not require a Data Factory.
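A minimal sketch of that console-app fallback, assuming Microsoft.Data.SqlClient and System.Text.Json; the connection string, query, and output directory are placeholders.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using Microsoft.Data.SqlClient;

class RowsToJsonFiles
{
    static void Main()
    {
        // Placeholder connection string, query, and output directory.
        const string connStr = "Server=myServer;Database=myDb;Integrated Security=true;";
        Directory.CreateDirectory("out");

        using var conn = new SqlConnection(connStr);
        conn.Open();
        using var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn);
        using SqlDataReader reader = cmd.ExecuteReader();

        int fileNo = 0;
        while (reader.Read())
        {
            // Build one dictionary per row, then serialize it to its own JSON file.
            var row = new Dictionary<string, object>();
            for (int i = 0; i < reader.FieldCount; i++)
                row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);

            File.WriteAllText(
                Path.Combine("out", $"row-{fileNo++}.json"),
                JsonSerializer.Serialize(row));
        }
        // The files in "out" can then be uploaded to Blob Storage, for example
        // with azcopy or the Azure Storage SDK.
    }
}
```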
In the case of Oracle, you can use the JSON_OBJECT function.

MarkLogic Java API batch upload files (.csv)

I'm trying out the MarkLogic Java API and want to bulk upload some files with the extension .csv.
I'm not sure what to use, since the Java API only supports JSON, XML, and TXT files.
How do I batch upload files using the MarkLogic Java API? Do I convert everything to JSON?
Do I convert everything to JSON?
Yes, that is a common way to do it.
If you would like additional examples of how you can wrangle CSV with the Java Client API, check out OpenCSVBatcherExample and JacksonDatabindTest.testDatabindingThirdPartyPojoWithMixinAnnotations. The first demonstrates converting the CSV to XML and using a custom REST extension. The second example (well, unit test...) demonstrates converting the CSV to JSON and using the batch upload (Bulk Writes) capabilities Justin linked to.
If you have CSV files on your filesystem, I’d start with mlcp, as suggested above. It will handle all of the parsing and splitting into multiple transactions/batches for you. Take a look at the mlcp documentation for more details and some example configurations.
If you’d like more control over the parsing and splitting logic than mlcp gives you out-of-the-box or you’re getting CSV from some other source (i.e. not files on the filesystem), you can use the Java Client API. The Java Client API allows you to efficiently write batches using a WriteSet. Take a look at the “Bulk Writes” example.
According to your reply to Justin, you cannot use MLCP because it is a command-line tool and you need to integrate it into a web portal.
Well, MLCP is released as open source software under the Apache 2 licence. So if you are happy with this licence, then you have the source to integrate.
But what I see as your main problem statement is more specific:
How can I create multiple XML or JSON documents from a CSV file [allowing the use of the Java API to then upload them as documents in MarkLogic]?
With that specific problem statement:
1) have a look at SplitDelimitedTextReader.java from the mlcp source
2) try some Java libraries for this purpose, such as http://jsefa.sourceforge.net/quick-tutorial.html (a minimal hand-rolled sketch of the conversion step follows below)
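Since the MarkLogic upload itself goes through the Java Client API's WriteSet support, the following is only a sketch of the conversion step (one JSON document per CSV row), kept in the same C# register as the earlier examples; the input path and the naive comma split (no quoted fields) are assumptions for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

class CsvToJsonDocs
{
    static void Main()
    {
        string[] lines = File.ReadAllLines("input.csv");   // placeholder path
        string[] headers = lines[0].Split(',');            // naive split: no quoted fields

        Directory.CreateDirectory("docs");
        for (int i = 1; i < lines.Length; i++)
        {
            // Pair each field with its header name to form one flat JSON object.
            string[] fields = lines[i].Split(',');
            var doc = new Dictionary<string, string>();
            for (int c = 0; c < headers.Length && c < fields.Length; c++)
                doc[headers[c].Trim()] = fields[c].Trim();

            // One self-contained JSON document per row, ready for a batch upload.
            File.WriteAllText(
                Path.Combine("docs", $"row-{i}.json"),
                JsonSerializer.Serialize(doc));
        }
    }
}
```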

Where should I place XSD files/content in an SSIS project

I have an XML Source task that requires an XSD to generate my data flow work from my XML files in SSIS. How can I embed the XSD file within the SLN/DTSX project and have it referenced correctly? It would seem the best approach would be to have the XSD source inside a variable, but I don't see where I can do that given the XML Source data task.
Using the Package Deployment model it is possible to get miscellaneous files bundled with the deployment. However, I want to use the Project Deployment model, which does not allow files other than the .ispac file. To get around the problem I used the following post as reference: SSIS dynamic xsd source file.
Effectively, you store the XSD in an available SQL Server database as an XML schema, use a data flow to query for it, and write it out with a Raw File destination. Then save the raw file to disk for future use. Very effective.