I'm using MySQL Workbench 6.3 and have two tables. My task is to upload an XML file from a web server into my table. One of the files looks like this:
<candidates>
  <publicationDate>2016-04-26 15:00:00</publicationDate>
  <candidate>
    <name>John</name>
    <party>Party1</party>
  </candidate>
  <candidate>
    <name>Tom</name>
    <party>Party2</party>
  </candidate>
</candidates>
.....
and continues like this for 13 records. At the very beginning I got a notification that the XML file contains no style information. The documentation also says that the default format of the returned data is JSON, so to get XML the request must include the Accept header with the value application/xml.
It's my very first time with MySQL and I will be very thankful for any advice.
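Since the question doesn't show an attempt yet, here is a rough sketch of one way to do it in Python, assuming the file is fetched over HTTP with the Accept header mentioned above and inserted row by row; the URL, credentials, and table definition below are placeholders, not details from the question:

import requests
import xml.etree.ElementTree as ET
import mysql.connector  # pip install mysql-connector-python

URL = "https://example.org/candidates"  # placeholder, not the real endpoint

# The service defaults to JSON, so ask for XML explicitly.
resp = requests.get(URL, headers={"Accept": "application/xml"})
resp.raise_for_status()

root = ET.fromstring(resp.content)  # the <candidates> element

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="elections")
cur = conn.cursor()

# Assumes a table like: CREATE TABLE candidates (name VARCHAR(100), party VARCHAR(100));
for cand in root.findall("candidate"):
    cur.execute("INSERT INTO candidates (name, party) VALUES (%s, %s)",
                (cand.findtext("name"), cand.findtext("party")))

conn.commit()
cur.close()
conn.close()

If the file is instead saved somewhere MySQL can read it, LOAD XML INFILE ... ROWS IDENTIFIED BY '<candidate>' can load it directly, as in the LOAD XML question further down.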
I am using a CSV file to load data into JMeter.
Also, part of the server name is stored in the same CSV file.
I need to build the server name like this:
${usernameInCSV}.twitter.com
The above syntax is not working for me. Any solutions to achieve this?
Your syntax looks correct.
Check the CSV Data Set Config (can you show yours?); a small illustration follows this list:
the variable names (one of them being usernameInCSV) are set, separated by commas, in the element configuration
the delimiter is set and matches the one used in the CSV file
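For illustration only (the file name and values below are made up), the pieces fit together like this:

users.csv (referenced by the CSV Data Set Config):
alice
bob

CSV Data Set Config:
  Filename:        users.csv
  Variable Names:  usernameInCSV
  Delimiter:       ,

HTTP Request -> Server Name or IP:
  ${usernameInCSV}.twitter.com   (resolves to alice.twitter.com on the first iteration)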
I'm trying to import an XML file into a MySQL table using the LOAD XML statement:
LOAD XML INFILE 'test.xml' INTO TABLE edge_delete ROWS IDENTIFIED BY '<edge>';
the XML File is structured like this:
<netstate xmlns:xsi=....>
  <timestep time="2">
    <edge id="10">
      <lane id="10_0">
        <vehicle id="veh1" pos="4.60" speed="0.00"/>
      </lane>
    </edge>
  </timestep>
</netstate>
The Problem:
All node levels carry the attribute "id", so the import does not distinguish between the node levels.
Each node level should map to its own column in my SQL table: edge | lane | vehicle id | ...
Thank you for your help.
A solution which worked for me:
Use a Python script that replaces the "id" attributes with element-specific attribute names (e.g. "edge"). The script does a plain string replacement; maybe not the most efficient way, but it solves the problem.
Now I can import all columns of the XML file into the MySQL table.
Another workaround is a text editor with a find/replace function, but that did not work properly with the tools I found for large XML files (> 1 GB).
The script can be found here:
https://studiofreya.com/2016/11/17/replace-string-in-xml-file-with-python/#comment-86628
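For illustration, here is a minimal Python sketch of that kind of renaming. It is not the linked script; the element names are taken from the snippet above, and for multi-gigabyte files a streaming parser (e.g. ElementTree's iterparse) would be preferable to parsing the whole tree at once:

import xml.etree.ElementTree as ET

tree = ET.parse("test.xml")

for elem in tree.iter():
    tag = elem.tag.split("}")[-1]  # drop any namespace prefix
    if tag in ("edge", "lane", "vehicle") and "id" in elem.attrib:
        # e.g. <edge id="10"> becomes <edge edge="10">
        elem.set(tag, elem.attrib.pop("id"))

tree.write("test_renamed.xml")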
I want to import an XML file into phpMyAdmin.
I have an XML file with 75,000 lines, so inserting the values individually is out of the question. When I try to import the file as XML into the database, I get the following message:
Import has been successfully finished, 0 queries executed.
The following structures have either been created or altered. Here you can:
View a structure's contents by clicking on its name.
Change any of its settings by clicking the corresponding "Options" link.
Edit structure by following the "Structure" link.
wp_db (Options)
(xmlfile.xml)
-----------------------------------------------------------------------
Notice in ./libraries/plugins/import/ImportXml.php#158
Undefined index: pma
Backtrace
./import.php#652: PMA\libraries\plugins\import\ImportXml->doImport(array)
phpMyAdmin is running under MAMP. A shortened version of the XML file looks like this:
<Report>
  <Row1>
    <Reference>123</Reference>
    <Nature>Outside</Nature>
    <Disponibility>Full</Disponibility>
    <State>In progress</State>
    <Person>Jon</Person>
    <Seller></Seller>
  </Row1>
  <Row2>
    <Reference>123</Reference>
    <Nature>Outside</Nature>
    <Disponibility>Full</Disponibility>
    <State>In progress</State>
    <Person>Jon</Person>
    <Seller></Seller>
  </Row2>
</Report>
If it's not possible, what is at least the easiest way to do it?
Thanks!
phpMyAdmin supports importing XML only if it follows a special format. You can see the exact format by exporting a table as XML from phpMyAdmin. You would need to programmatically modify your existing file to match the supported format.
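One practical workaround, sketched below on the assumption that every row contains the six elements shown in the sample and that the target table is named report, is to convert the XML into plain INSERT statements with a small Python script and import the resulting .sql file instead:

import xml.etree.ElementTree as ET

COLUMNS = ["Reference", "Nature", "Disponibility", "State", "Person", "Seller"]

root = ET.parse("xmlfile.xml").getroot()

with open("import.sql", "w", encoding="utf-8") as out:
    for row in root:  # every <Row1>, <Row2>, ... element
        values = []
        for col in COLUMNS:
            text = row.findtext(col) or ""
            values.append("'" + text.replace("'", "''") + "'")
        out.write("INSERT INTO report ({}) VALUES ({});\n".format(
            ", ".join(COLUMNS), ", ".join(values)))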
I am downloading CSV files which are comma-separated. The problem I'm having is that the commas are screwing up my import into a database table (SQL Server). For example, I have a column header called hotel_name, but some of the names look like the following:
HOTEL_NAME
hilton
cambridge,the
The problem is that a hotel name containing a comma gets split across two columns. I'm wondering if converting from CSV to a pipe-delimited format will work.
The problem I'm having is that I'm not sure how to get started. I've tried following the PowerShell documentation but get basic errors. I think this is because I'm new to PowerShell and not understanding something. Can someone please post a script that changes a comma-separated file to a pipe-delimited file?
Sorry if this is confusing; I'm finding the formatting on Stack Overflow to be a bit crazy.
Taken from Dealing with commas in a CSV file
Use " to wrap data that contains a comma.
For example
Server000,"Microsoft(R) Windows(R) Server 2003, Enterprise Edition"
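If PowerShell isn't a hard requirement, here is a rough sketch of the same idea in Python (the file names are placeholders): the csv module honours quoted fields, so "cambridge,the" stays in one column, and the rows are rewritten with a pipe delimiter.

import csv

with open("hotels.csv", newline="", encoding="utf-8") as src, \
     open("hotels_pipe.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)               # default dialect: comma + double quotes
    writer = csv.writer(dst, delimiter="|")
    for row in reader:
        writer.writerow(row)

In PowerShell itself, the built-in Import-Csv and Export-Csv cmdlets (Export-Csv accepts a -Delimiter parameter) should do the same conversion.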
I need to export varbinary data to files. But when I do it using column transformations in SSIS, the exported files are corrupt: there are a few junk characters at the start of each file. After removing them, the files open fine.
A similar post about BCP says that these characters specify the data length.
I would like to know how to address this issue in SSIS.
Thanks
The Export Column transformation is used for converting the varbinary data to files. I have tried something similar using AdventureWorks, which has image-type (varbinary) data.
The following query is used as the source query. I have modified it because the table does not store the full path needed to write the image files.
SELECT [ProductPhotoID]
,[ThumbNailPhoto]
,'D:\SSISTesting\ThumnailPhotos\'+[ThumbnailPhotoFileName]
,[LargePhoto]
,'D:\SSISTesting\LargePhotos\'+[LargePhotoFileName]
,[ModifiedDate]
FROM [Production].[ProductPhoto]
I used the Export Column transformation (also available in SSIS 2005 and 2008) and configured it so that each varbinary column is the Extract Column and the corresponding computed path column is the File Path Column.
I mapped the rest of the columns to the destination.
After running the package, all the image files are written into the respective folders (D:\SSISTesting\ThumnailPhotos\ and D:\SSISTesting\LargePhotos).
Hope this helps!
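As an aside (this is my own assumption, not part of the answer above): if the Export Column output still contains unwanted prefix bytes, the files can also be written outside SSIS. A minimal Python sketch with pyodbc, reusing the AdventureWorks columns from the query and a placeholder connection string:

import os
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=localhost;DATABASE=AdventureWorks;"
                      "Trusted_Connection=yes;")
cur = conn.cursor()
cur.execute("SELECT ThumbnailPhotoFileName, ThumbNailPhoto "
            "FROM Production.ProductPhoto")

out_dir = r"D:\SSISTesting\ThumnailPhotos"
for file_name, photo in cur.fetchall():
    # varbinary/image columns arrive as raw bytes, with no length prefix
    with open(os.path.join(out_dir, file_name), "wb") as f:
        f.write(photo)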