Error when reading GraphML file in R - igraph

I'm trying to read a GraphML file in R using the igraph package.
The code I'm using is the following:
g<-read.graph("graph_bustuberail_london_500m",format=c("graphml")) #import gml
I get the following error message
Error in .Call("R_igraph_read_graph_graphml", file, as.numeric(index), :
At rinterface.c:5866 : Cannot open GraphML file, File operation error
I'm not sure why this isn't loading; can anyone help me?

igraph is not very informative about certain types of error it reports: I have found that the above error is often caused simply by a misspelling in the filename.
Consequently, I suggest you start by checking that you have written the file extension correctly, i.e. with a .gml ending and not .glm.
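A minimal sanity check you can run before calling read.graph, assuming the file sits in your working directory and is really a GraphML file (the exact file name with the .graphml extension below is an assumption, not something taken from the question):

library(igraph)

# Assumed name; adjust to whatever the file is actually called on disk
path <- "graph_bustuberail_london_500m.graphml"

file.exists(path)                     # FALSE means the path or spelling is wrong
list.files(pattern = "bustuberail")   # shows how the file is really named

g <- read.graph(path, format = "graphml")

If file.exists() returns FALSE, fix the name or supply a full path (or change the working directory with setwd()) before trying again.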

Related

Errors occurred:Technical error: Unexpected response returned by server. Import stopped. - Prestashop 1.7.1.1

I'm importing my old PrestaShop data into a new site with an updated PrestaShop (ver. 1.7.1.1). When I import the category CSV file, I get this error:
Errors occurred:Technical error: Unexpected response returned by server. Import stopped.
I've got other CSVs for products, customers and manufacturers. So which CSV is imported into PrestaShop first? I'm a bit confused here.
Can anyone help me? Any help is appreciated. I'm attaching a screenshot of the error here.
I found that this comes from badly handled exceptions during the import AJAX request. In my case, it was caused by a value in the feature field of the CSV import file that was longer than 255 characters. I solved it by increasing the size attribute in the FeatureValue.php file on line 53.
The best way to get more information about your problem is to open the Chrome developer tools, where you get more details when an exception occurs during the import request.

Error when loading shape files into Bluemix dashDB

I am running into the following error when loading my shape files through the dashDB console:
My shape files are the following:
Does anyone have experience working with dashDB and has run into a similar problem?
UPDATE:
I downloaded a separate dataset with the following files, and I am still running into the same error:
Please find the following sample files https://www.dropbox.com/s/bkrac971g9uc02x/deng.zip?dl=0
I brought the Shapefile into QGIS easily, so I knew the format was OK. I unzipped the Shapefile, changed the file names to lower-case and re-zipped it. That got me further in the dashDB upload UI, where I got a message saying the SRS was unknown. I then used QGIS to convert the SRS (spatial reference system) into a known one, EPSG:4269 (NAD83), and I was then able to upload it into dashDB. Here's the version of your file that works:
https://dl.dropboxusercontent.com/u/8196680/dc.zip
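If you would rather script the reprojection than click through QGIS, a minimal R sketch with the sf package could look like the following; the dc.shp / dc_nad83 file names are assumptions, not part of the original answer:

library(sf)

shp <- st_read("dc.shp")               # read the unzipped, lower-cased shapefile
shp_nad83 <- st_transform(shp, 4269)   # reproject to EPSG:4269 (NAD83)
st_write(shp_nad83, "dc_nad83.shp")    # writes the .shp/.shx/.dbf/.prj set

# re-zip the reprojected files before uploading them to dashDB
zip("dc_nad83.zip", list.files(pattern = "^dc_nad83\\."))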

DTS_E_PRIMEOUTPUTFAILED with error code 0xC0202091 when loading flat file

I get an error message when I try to run my SSIS package. The error is:
[Flat File Source [1]] Error: The column delimiter for column "Column 8" was not found.
[Flat File Source [1]] Error: An error occurred while skipping data rows.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Flat File Source" (1) returned error code 0xC0202091. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Most of the CSV files load with no problem, but a handful of CSV files don't, and prior to that the package had been working fine for years.
I encountered this error too; it turned out that it was skipping data rows because my CSV file had missing columns. Try checking whether the columns in your file are correct.
A possible scenario is that multiple CSV files don't have the same structure (column names).
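If you want to verify the file structure outside SSIS, a quick sketch in R is one option; the file name and the comma delimiter below are assumptions, so use whatever delimiter your connection manager expects:

# Count the delimited fields on every row; rows whose count differs from the
# first row are the ones with missing or extra columns
n <- count.fields("problem_file.csv", sep = ",", quote = "\"")
table(n)            # a single value here means the structure is consistent
which(n != n[1])    # row numbers that do not match the first row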
Similar to user2545231's answer (can't comment based on reputation), I solved this by clicking Reset Columns in the connection manager for the file I was trying to import.
I also came across the same error, and I resolved it by checking the flat file sources. Make sure there are no unnecessary spaces, verify the delimiter used (e.g. a comma), and check that the data rows are in sync with the first row. I hope this helps you out.
I found that with our FTP configuration, older files can hang in the SAN or on the FTP server, and this can cause this error as well. Very frustrating.
Another possible source of this error is if you've built your system with a new version of an incoming file and then go back to import older ones and hit the error ... check that all the fields are there! I discovered that for a few days "back then" three fields were missing, causing this error.
I got the same error when processing a file.
In my case, the problem was that the expected delimiter was TAB, whereas the file I had received was comma (,) delimited.
When I corrected the input file format, the issue was resolved.
I had a task that involved reading files and skipping rows if a certain condition was true. The column name used there was "Pxs", while the actual name in the files was "PXS". It is case sensitive, so it couldn't find the column. I corrected it to caps and it works fine now.
In my case, I just deleted those data flow tasks, cleaned the designer, and recreated the same ones, and the mentioned error was resolved. It's Microsoft :(!
- Check the detail message that appears next to that one.
- Check the columns: there might be a difference between what you expect in the file and what is actually there.

RExcel crashes when attempting to download a file in CSV format

I am working on an Excel file that embeds RExcel and calls an R function through RApply in thousands of cells. The problem I encounter is that when I try to download some files in CSV format, I get a message stating that a severe error has occurred, and all my Excel files are shut down.
I would appreciate some assistance on what I could do to prevent this problem!

Apache POI - Reading .xlsx and .xls files using usermodel

I have code written with Apache POI to read data from .xls worksheets using HSSF. I'd like the program to read .xlsx worksheets as well, using org.apache.poi.ss.usermodel. Here is the code:
(_fileName is passed into the function)
java.io.FileInputStream fs = new java.io.FileInputStream(_fileName);
Workbook book = WorkbookFactory.create(fs);  // detects and handles both .xls (HSSF) and .xlsx (XSSF)
It throws the following exception for a .xlsx file: InvalidFormatException - Can't read the content types part !
I'm doing this in Visual Studio, so the output window says "A first chance exception of type 'org.apache.poi.openxml4j.exceptions.InvalidFormatException' occurred in poi-ooxml-3.7-20101029.dll'.
And for a .xls file, the output window says "A first chance exception of type 'java.io.IOException' occurred in IKVM.OpenJDK.Core.dll"
It would be great if someone could help me solve this issue. I've been working on this since yesterday.
Thank you so much!
Soundarya
The error message indicates that your .xlsx file isn't a valid one. I'd double-check that you're passing in the correct file and that it's really an Excel file (and not something else; in your .xlsx case I suspect you just have a normal zip file).