Liquibase: modify CSV file and insert new records - MySQL

I load data into my database from a CSV file:
<loadUpdateData encoding="UTF-8"
                primaryKey="pk_id"
                file="config/liquibase/roles_admin.csv"
                separator=";"
                tableName="role_admin">
    <column name="libelle" type="STRING"/>
</loadUpdateData>
Is it possible to tell Liquibase to insert the new records if I add lines to my CSV file?

You can modify your changeSet and add runAlways="true"; this way it will always run (even if there is no change).

Using runOnChange="true" on your changeSet should do the trick.
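For example, a minimal sketch (the id and author values are placeholders; whether adding lines to the CSV changes the checksum depends on your Liquibase version, as newer versions include the file contents in it, so verify against yours):
<changeSet id="load-role-admin" author="example" runOnChange="true">
    <loadUpdateData encoding="UTF-8"
                    primaryKey="pk_id"
                    file="config/liquibase/roles_admin.csv"
                    separator=";"
                    tableName="role_admin">
        <column name="libelle" type="STRING"/>
    </loadUpdateData>
</changeSet>
With runOnChange="true", Liquibase re-executes the changeSet when its checksum changes instead of reporting an error, and loadUpdateData inserts the rows whose primary key does not yet exist and updates those that do.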

The only way that I found to do that is to create another CSV file with the same columns as the original and with the updated data, and to create a new changeSet with a loadUpdateData inside, pointed at the new CSV file.

Related

In rsreportserver.config file, how can I set CSV export to have a field delimiter of none

I want to extract a fixed-length file, but when I tried keeping the delimiter empty it still put commas after each field value.
Also, if possible, please provide the script to get a fixed-length file; some of the fields can have commas within their values.
I would really appreciate any input, as my client is fully determined to use SSRS and not SSIS.
You can achieve this by adding a new custom rendering extension.
Use the following as an example:
<Extension Name="csvnoseperator" Type="Microsoft.ReportingServices.Rendering.DataRenderer.CsvReport,Microsoft.ReportingServices.DataRendering">
    <OverrideNames>
        <Name Language="en-US">csvnoseperator</Name>
    </OverrideNames>
    <Configuration>
        <DeviceInfo>
            <FieldDelimiter></FieldDelimiter>
            <UseFormattedValues>False</UseFormattedValues>
            <NoHeader>True</NoHeader>
            <Encoding>ASCII</Encoding>
            <FileExtension>csv</FileExtension>
        </DeviceInfo>
    </Configuration>
</Extension>
You should then be able to use this as an export format from the front end.
Please ensure you back up your config file before you make this change! If you get it wrong, you will not be able to start Reporting Services.
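For placement, custom render extensions live in the <Render> collection of rsreportserver.config, next to the built-in renderers (a sketch; the existing entries are elided):
<Extensions>
    <Render>
        <!-- ...existing Extension entries... -->
        <Extension Name="csvnoseperator" Type="Microsoft.ReportingServices.Rendering.DataRenderer.CsvReport,Microsoft.ReportingServices.DataRendering">
            <!-- configuration as shown above -->
        </Extension>
    </Render>
</Extensions>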

XML to phpMyAdmin

I want to import an XML file into phpMyAdmin.
I have an XML file with 75000 lines, so inserting the values individually is out of the question. When I try to import the file as XML into the database, it gives the following message:
Import has been successfully finished, 0 queries executed.
The following structures have either been created or altered. Here you can:
View a structure's contents by clicking on its name.
Change any of its settings by clicking the corresponding "Options" link.
Edit structure by following the "Structure" link.
wp_db (Options)
(xmlfile.xml)
-----------------------------------------------------------------------
Notice in ./libraries/plugins/import/ImportXml.php#158
Undefined index: pma
Backtrace
./import.php#652: PMA\libraries\plugins\import\ImportXml->doImport(array)
phpMyAdmin is running behind MAMP. A short version of the XML file looks like this:
<Report>
    <Row1>
        <Reference>123</Reference>
        <Nature>Outside</Nature>
        <Disponibility>Full</Disponibility>
        <State>In progress</State>
        <Person>Jon</Person>
        <Seller></Seller>
    </Row1>
    <Row2>
        <Reference>123</Reference>
        <Nature>Outside</Nature>
        <Disponibility>Full</Disponibility>
        <State>In progress</State>
        <Person>Jon</Person>
        <Seller></Seller>
    </Row2>
</Report>
If it's not possible directly, what is the easiest way to do it?
Thanks!
phpMyAdmin supports importing XML only if it follows a specific format. You can see the exact format by exporting a table as XML from phpMyAdmin. You would need to programmatically modify your existing file to adapt it to the supported format.
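For reference, a data export looks roughly like this (shape as produced by a phpMyAdmin 4.x XML export; confirm by doing an export yourself, as suggested above, since the database, table, and column names here are only illustrative):
<?xml version="1.0" encoding="utf-8"?>
<pma_xml_export version="1.0" xmlns:pma="https://www.phpmyadmin.net/some_doc_url/">
    <database name="wp_db">
        <!-- each row of the table becomes one <table> element -->
        <table name="report">
            <column name="Reference">123</column>
            <column name="Nature">Outside</column>
        </table>
        <table name="report">
            <column name="Reference">123</column>
            <column name="Nature">Outside</column>
        </table>
    </database>
</pma_xml_export>
So each <RowN> element in your file would need to be rewritten as a <table name="..."> element with one <column name="..."> child per field.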

Cannot load simple CSV file into Tableau Public 9.3

I am trying to load the following simple CSV file into Tableau Public 9.3:
customers,item1,item2,item3,item4
1,0,0,0,0
2,0,0,0,0
3,0,0,0,0
However, it doesn't read the file as separate columns, despite the field separator being Comma. Instead it treats the whole line as one column. Any help would be greatly appreciated.
If you change your locale settings to English US you will be able to load the file. You should also be able to work around this by creating a schema.ini file.
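A minimal schema.ini would sit in the same folder as the CSV and follow the Microsoft text-driver convention (the [customers.csv] section name below is a placeholder for your actual file name):
[customers.csv]
Format=CSVDelimited
ColNameHeader=True
This should force the comma to be treated as the field separator regardless of the locale's list-separator setting.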
Go to Data > Manage fields > [Field] Options.
You can also control imported CSV behavior after import, either by splitting individual columns (which will remain split on update as well) or at the CSV level.
That doesn't work for me, so I reopen the .csv file in Excel and save it again in .csv format with ',' as the delimiter.
After that my file looks like a .csv with ';' as the delimiter and works with Tableau.

Getting strange characters when using cffile to loop over a CSV

I am on ColdFusion 11. I am using the following code to loop over a CSV file and output the first field of each line in the loop.
<cffile action="read" file="C:\inetpub\wwwroot\test\file.csv" variable="csvfile">
<cfloop index="index" list="#csvfile#" delimiters="#chr(10)##chr(13)#">
    <cfoutput>#listGetAt(index, 1, ",")#</cfoutput>
</cfloop>
It's outputting strange characters. Here is a screenshot of the output (it starts with PK, followed by .xml entries) and of my CSV structure.
Please help!
You are reading an XLSX (MS Excel) file that has had its extension changed to CSV.
Notice how the output starts with PK and is followed by .xml. The file is a PK ZIP archive of XML, which is the native format for XLSX.
As a test, you can rename the file to .zip and unzip it. You will see lots and lots of folders and .xml files.
How to correct
You need to save the file as CSV from Excel, not just rename it to CSV.
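Alternatively, if you really are handed an .xlsx, ColdFusion can read it directly with cfspreadsheet instead of cffile; a minimal sketch (the path and query name are placeholders):
<!--- Read the workbook into a query, using row 1 as column headers --->
<cfspreadsheet action="read" src="C:\inetpub\wwwroot\test\file.xlsx" query="rows" headerrow="1">
<!--- Inspect the rows and columns that came back --->
<cfdump var="#rows#">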

Uploading CSV in Neo4j

I am trying to upload the following CSV file (https://www.dropbox.com/s/95j774tg13qsdxr/out.csv?dl=0) into Neo4j with the following command:
LOAD CSV WITH HEADERS FROM
"file:/home/pavan637/Neo4jDemo/out.csv"
AS csvimport
match (uniprotid:UniprotID{Uniprotid: csvimport.Uniprot_ID})
merge (Prokaryotes_Proteins: Prokaryotes_Proteins{UniprotID: csvimport.DBUni, ProteinID: csvimport.ProteinID, IdentityPercentage: csvimport.IdentityPercentage, AlignedLength:csvimport.al, Mismatches:csvimport.mm, QueryStart:csvimport.qs, QueryEnd: csvimport.qe, SubjectStrat: csvimport.ss, SubjectEnd: csvimport.se, Evalue: csvimport.evalue, BitScore: csvimport.bs})
merge (uniprotid)-[:BlastResults]->(Prokaryotes_Proteins)
I used "match" command in the LOAD CSV command in order to match with the "Uniprot_ID's" of previously loaded CSV.
I first loaded ReactomeDB.csv (https://www.dropbox.com/s/9e5m1629p3pi3m5/Reactomesample.csv?dl=0) with the following Cypher
LOAD CSV WITH HEADERS FROM
"file:/home/pavan637/Neo4jDemo/Reactomesample.csv"
AS csvimport
merge (uniprotid:UniprotID{Uniprotid: csvimport.Uniprot_ID})
merge (reactionname: ReactionName{ReactionName: csvimport.ReactionName, ReactomeID: csvimport.ReactomeID})
merge (uniprotid)-[:ReactionInformation]->(reactionname)
into Neo4j, which was successful.
Later on I am uploading out.csv.
A Uniprot_ID column is present in both CSV files, and some of those IDs are the same. Though some of the Uniprot_IDs are common, Neo4j is not returning any rows.
Any solutions?
Thanks in advance,
Pavan Kumar Alluri
Just a few tips:
only use ONE label and ONE property for MERGE
set the others with ON CREATE SET ...
try to create nodes and rels separately, otherwise you might get into memory issues
you should be consistent with your spelling and upper/lowercase of properties and labels, otherwise you will spend hours debugging (labels, rel-types and property names are case-sensitive)
you probably don't need merge for relationships, create should do fine
for your statement:
CREATE CONSTRAINT ON (up:UniprotID) ASSERT up.Uniprotid IS UNIQUE;
CREATE CONSTRAINT ON (pp:Prokaryotes_Proteins) ASSERT pp.UniprotID IS UNIQUE;

USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM "file:/home/pavan637/Neo4jDemo/out.csv" AS csvimport
MERGE (pp:Prokaryotes_Proteins {UniprotID: csvimport.DBUni})
ON CREATE SET pp.ProteinID = csvimport.ProteinID,
              pp.IdentityPercentage = csvimport.IdentityPercentage, ...
;

LOAD CSV WITH HEADERS FROM "file:/home/pavan637/Neo4jDemo/out.csv" AS csvimport
MATCH (uniprotid:UniprotID {Uniprotid: csvimport.Uniprot_ID})
MATCH (pp:Prokaryotes_Proteins {UniprotID: csvimport.DBUni})
MERGE (uniprotid)-[:BlastResults]->(pp);