C# - Cannot access a file which is already being used - Iron OCR

I am using "Iron OCR", something like "Tesseract" to detect and scan certain Text from Screenshots.
Now I have the following error. Every time Iron OCR is used to scan an image for text it tries to access the Iron OCR log file which is somehow still used by the process before. So every time I get the error message that it can't access the log file because it is already in use. Nevertheless the Scan still works and I get a valid result even tho it gives me an exception because of that error.
My program works like this:
it takes a screenshot of certain areas of my screen,
it analyzes that image with Iron OCR and looks for text,
and this process repeats itself indefinitely.
I have the following code:
//------------------------- # Capture Screenshot of specific Area # -------------------------\\
Rectangle bounds3;
Rect rect3 = new Rect();
bounds3 = new Rectangle(rect3.Left + 198, rect3.Top + 36, rect3.Right + 75 - rect3.Left - 10, rect3.Bottom + 30 - rect3.Top - 10);
CursorPosition = new Point(Cursor.Position.X - rect.Left, Cursor.Position.Y - rect.Top);
Bitmap result3 = new Bitmap(40, 14);
using (Graphics g = Graphics.FromImage(result3))
{
    g.CopyFromScreen(new Point(bounds3.Left, bounds3.Top), Point.Empty, bounds3.Size);
}
//------------------------- # Analyze Image for Text # -------------------------\\
var Ocr = new IronTesseract();
using (var Input = new OcrInput(result3))
{
    Input.Contrast();
    Input.EnhanceResolution(300);
    Input.Invert();
    Input.Sharpen();
    Input.ToGrayScale();
    try
    {
        //------------------- # This causes the Error - Using Try Catch to Ignore it # -------------------\\
        var Result = Ocr.Read(Input);
        text = Result.Text;
    }
    catch
    {
    }
}
Removing all of the above and using only their "1 Line Code" gives the same error message:
var Result = new IronTesseract().Read(@"images\image.png").Text;
I hope someone can help me figure out what exactly causes this issue.
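Update: as a stopgap (not a root-cause fix) I am now catching only the file-access exception and retrying, instead of swallowing everything with an empty catch. This is only a sketch, and it assumes the lock surfaces as a System.IO.IOException:

string text = null;
for (int attempt = 0; attempt < 3; attempt++)
{
    try
    {
        using (var input = new OcrInput(result3))
        {
            text = Ocr.Read(input).Text; // may throw while the log file is still locked
        }
        break; // success
    }
    catch (System.IO.IOException)
    {
        // Give the previous scan time to release the shared IronOcr log file.
        System.Threading.Thread.Sleep(100);
    }
}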

Related

Why do I get an odd (0,0) point in Octave trisurf

I am trying to draw a surface from a file on disk (shown below), but I get an odd additional point at coordinates (0,0).
The file looks correctly formed to me.
I draw the chart from my C# application with a call to Octave .Net. Here is the Octave part of the script:
figure (1,'name','Map');
colormap('hot');
t = dlmread('C:\Map3D.csv');
# idx = find(t(:,4) == 4.0);t2 = t(idx,:);
tx =t(:,1);ty=t(:,2);tz=t(:,3);
tri = delaunay(tx,ty);
handle = trisurf(tri,tx,ty,tz);xlabel('Floor');ylabel('HurdleF');zlabel('Sharpe');
waitfor(handle);
This script is called from my C# app with the following, very simple, code snippet:
using (var octave = new OctaveContext())
{
    octave.Execute(script, int.MaxValue);
}
Can anyone explain whether my Octave script is wrong, or whether it is the way I have structured the file?
Floor,HurdleF,Sharpe,Model
1.40000000,15.00000000,-0.44,xxx1.40_Hrd_15.00
1.40000000,14.00000000,-0.49,xxx1.40_Hrd_14.00
1.40000000,13.00000000,-0.19,xxx1.40_Hrd_13.00
1.40000000,12.00000000,-0.41,xxx1.40_Hrd_12.00
1.40000000,11.00000000,0.42,xxx1.40_Hrd_11.00
1.40000000,10.00000000,0.17,xxx1.40_Hrd_10.00
1.40000000,9.00000000,0.28,xxx1.40_Hrd_9.00
1.40000000,8.00000000,0.49,xxx1.40_Hrd_8.00
1.40000000,7.00000000,0.45,xxx1.40_Hrd_7.00
1.40000000,6.00000000,0.79,xxx1.40_Hrd_6.00
1.40000000,5.00000000,0.56,xxx1.40_Hrd_5.00
1.40000000,4.00000000,1.76,xxx1.40_Hrd_4.00
1.30000000,15.00000000,-0.46,xxx1.30_Hrd_15.00
1.30000000,14.00000000,-0.55,xxx1.30_Hrd_14.00
1.30000000,13.00000000,-0.24,xxx1.30_Hrd_13.00
1.30000000,12.00000000,0.35,xxx1.30_Hrd_12.00
1.30000000,11.00000000,0.08,xxx1.30_Hrd_11.00
1.30000000,10.00000000,0.63,xxx1.30_Hrd_10.00
1.30000000,9.00000000,0.83,xxx1.30_Hrd_9.00
1.30000000,8.00000000,0.21,xxx1.30_Hrd_8.00
1.30000000,7.00000000,0.55,xxx1.30_Hrd_7.00
1.30000000,6.00000000,0.63,xxx1.30_Hrd_6.00
1.30000000,5.00000000,0.93,xxx1.30_Hrd_5.00
1.30000000,4.00000000,2.50,xxx1.30_Hrd_4.00
1.20000000,15.00000000,-0.40,xxx1.20_Hrd_15.00
1.20000000,14.00000000,-0.69,xxx1.20_Hrd_14.00
1.20000000,13.00000000,0.23,xxx1.20_Hrd_13.00
1.20000000,12.00000000,0.56,xxx1.20_Hrd_12.00
1.20000000,11.00000000,0.22,xxx1.20_Hrd_11.00
1.20000000,10.00000000,0.56,xxx1.20_Hrd_10.00
1.20000000,9.00000000,0.79,xxx1.20_Hrd_9.00
1.20000000,8.00000000,0.20,xxx1.20_Hrd_8.00
1.20000000,7.00000000,1.09,xxx1.20_Hrd_7.00
1.20000000,6.00000000,0.99,xxx1.20_Hrd_6.00
1.20000000,5.00000000,1.66,xxx1.20_Hrd_5.00
1.20000000,4.00000000,2.23,xxx1.20_Hrd_4.00
1.10000000,15.00000000,-0.31,xxx1.10_Hrd_15.00
1.10000000,14.00000000,-0.18,xxx1.10_Hrd_14.00
1.10000000,13.00000000,0.24,xxx1.10_Hrd_13.00
1.10000000,12.00000000,0.70,xxx1.10_Hrd_12.00
1.10000000,11.00000000,0.31,xxx1.10_Hrd_11.00
1.10000000,10.00000000,0.76,xxx1.10_Hrd_10.00
1.10000000,9.00000000,1.24,xxx1.10_Hrd_9.00
1.10000000,8.00000000,0.94,xxx1.10_Hrd_8.00
1.10000000,7.00000000,1.09,xxx1.10_Hrd_7.00
1.10000000,6.00000000,1.53,xxx1.10_Hrd_6.00
1.10000000,5.00000000,2.41,xxx1.10_Hrd_5.00
1.10000000,4.00000000,2.16,xxx1.10_Hrd_4.00
1.00000000,15.00000000,-0.41,xxx1.00_Hrd_15.00
1.00000000,14.00000000,-0.24,xxx1.00_Hrd_14.00
1.00000000,13.00000000,0.33,xxx1.00_Hrd_13.00
1.00000000,12.00000000,0.18,xxx1.00_Hrd_12.00
1.00000000,11.00000000,0.61,xxx1.00_Hrd_11.00
1.00000000,10.00000000,0.96,xxx1.00_Hrd_10.00
1.00000000,9.00000000,1.75,xxx1.00_Hrd_9.00
1.00000000,8.00000000,0.74,xxx1.00_Hrd_8.00
1.00000000,7.00000000,1.63,xxx1.00_Hrd_7.00
1.00000000,6.00000000,2.12,xxx1.00_Hrd_6.00
1.00000000,5.00000000,2.73,xxx1.00_Hrd_5.00
1.00000000,4.00000000,2.03,xxx1.00_Hrd_4.00
0.90000000,15.00000000,-0.42,xxx0.90_Hrd_15.00
0.90000000,14.00000000,-0.37,xxx0.90_Hrd_14.00
0.90000000,13.00000000,0.58,xxx0.90_Hrd_13.00
0.90000000,12.00000000,0.03,xxx0.90_Hrd_12.00
0.90000000,11.00000000,0.68,xxx0.90_Hrd_11.00
0.90000000,10.00000000,0.79,xxx0.90_Hrd_10.00
0.90000000,9.00000000,1.54,xxx0.90_Hrd_9.00
0.90000000,8.00000000,0.82,xxx0.90_Hrd_8.00
0.90000000,7.00000000,1.81,xxx0.90_Hrd_7.00
0.90000000,6.00000000,2.33,xxx0.90_Hrd_6.00
0.90000000,5.00000000,2.99,xxx0.90_Hrd_5.00
0.90000000,4.00000000,1.71,xxx0.90_Hrd_4.00
0.80000000,15.00000000,-0.46,xxx0.80_Hrd_15.00
0.80000000,14.00000000,-0.26,xxx0.80_Hrd_14.00
0.80000000,13.00000000,0.55,xxx0.80_Hrd_13.00
0.80000000,12.00000000,0.07,xxx0.80_Hrd_12.00
0.80000000,11.00000000,0.65,xxx0.80_Hrd_11.00
0.80000000,10.00000000,1.08,xxx0.80_Hrd_10.00
0.80000000,9.00000000,1.27,xxx0.80_Hrd_9.00
0.80000000,8.00000000,1.12,xxx0.80_Hrd_8.00
0.80000000,7.00000000,1.98,xxx0.80_Hrd_7.00
0.80000000,6.00000000,2.62,xxx0.80_Hrd_6.00
0.80000000,5.00000000,3.35,xxx0.80_Hrd_5.00
0.80000000,4.00000000,1.27,xxx0.80_Hrd_4.00
0.70000000,15.00000000,-0.56,xxx0.70_Hrd_15.00
0.70000000,14.00000000,-0.33,xxx0.70_Hrd_14.00
0.70000000,13.00000000,0.24,xxx0.70_Hrd_13.00
0.70000000,12.00000000,-0.22,xxx0.70_Hrd_12.00
0.70000000,11.00000000,0.74,xxx0.70_Hrd_11.00
0.70000000,10.00000000,1.19,xxx0.70_Hrd_10.00
0.70000000,9.00000000,1.24,xxx0.70_Hrd_9.00
0.70000000,8.00000000,1.14,xxx0.70_Hrd_8.00
0.70000000,7.00000000,2.26,xxx0.70_Hrd_7.00
0.70000000,6.00000000,2.70,xxx0.70_Hrd_6.00
0.70000000,5.00000000,3.52,xxx0.70_Hrd_5.00
0.70000000,4.00000000,1.05,xxx0.70_Hrd_4.00
0.60000000,15.00000000,-0.50,xxx0.60_Hrd_15.00
0.60000000,14.00000000,-0.60,xxx0.60_Hrd_14.00
0.60000000,13.00000000,0.11,xxx0.60_Hrd_13.00
0.60000000,12.00000000,-0.16,xxx0.60_Hrd_12.00
0.60000000,11.00000000,0.73,xxx0.60_Hrd_11.00
0.60000000,10.00000000,1.08,xxx0.60_Hrd_10.00
0.60000000,9.00000000,1.31,xxx0.60_Hrd_9.00
0.60000000,8.00000000,1.38,xxx0.60_Hrd_8.00
0.60000000,7.00000000,2.24,xxx0.60_Hrd_7.00
0.60000000,6.00000000,2.89,xxx0.60_Hrd_6.00
0.60000000,5.00000000,3.50,xxx0.60_Hrd_5.00
0.60000000,4.00000000,1.11,xxx0.60_Hrd_4.00
0.50000000,15.00000000,-0.40,xxx0.50_Hrd_15.00
0.50000000,14.00000000,-0.37,xxx0.50_Hrd_14.00
0.50000000,13.00000000,0.13,xxx0.50_Hrd_13.00
0.50000000,12.00000000,-0.11,xxx0.50_Hrd_12.00
0.50000000,11.00000000,0.61,xxx0.50_Hrd_11.00
0.50000000,10.00000000,0.92,xxx0.50_Hrd_10.00
0.50000000,9.00000000,1.41,xxx0.50_Hrd_9.00
0.50000000,8.00000000,1.39,xxx0.50_Hrd_8.00
0.50000000,7.00000000,2.19,xxx0.50_Hrd_7.00
0.50000000,6.00000000,2.80,xxx0.50_Hrd_6.00
0.50000000,5.00000000,3.41,xxx0.50_Hrd_5.00
0.50000000,4.00000000,1.05,xxx0.50_Hrd_4.00
0.40000000,15.00000000,-0.25,xxx0.40_Hrd_15.00
0.40000000,14.00000000,-0.44,xxx0.40_Hrd_14.00
0.40000000,13.00000000,0.02,xxx0.40_Hrd_13.00
0.40000000,12.00000000,0.00,xxx0.40_Hrd_12.00
0.40000000,11.00000000,0.69,xxx0.40_Hrd_11.00
0.40000000,10.00000000,0.67,xxx0.40_Hrd_10.00
0.40000000,9.00000000,1.02,xxx0.40_Hrd_9.00
0.40000000,8.00000000,1.29,xxx0.40_Hrd_8.00
0.40000000,7.00000000,2.17,xxx0.40_Hrd_7.00
0.40000000,6.00000000,2.88,xxx0.40_Hrd_6.00
0.40000000,5.00000000,3.19,xxx0.40_Hrd_5.00
0.40000000,4.00000000,0.98,xxx0.40_Hrd_4.00
0.30000000,15.00000000,-0.02,xxx0.30_Hrd_15.00
0.30000000,14.00000000,-0.36,xxx0.30_Hrd_14.00
0.30000000,13.00000000,-0.26,xxx0.30_Hrd_13.00
0.30000000,12.00000000,-0.11,xxx0.30_Hrd_12.00
0.30000000,11.00000000,0.50,xxx0.30_Hrd_11.00
0.30000000,10.00000000,0.50,xxx0.30_Hrd_10.00
0.30000000,9.00000000,1.01,xxx0.30_Hrd_9.00
0.30000000,8.00000000,1.28,xxx0.30_Hrd_8.00
0.30000000,7.00000000,2.11,xxx0.30_Hrd_7.00
0.30000000,6.00000000,2.89,xxx0.30_Hrd_6.00
0.30000000,5.00000000,3.16,xxx0.30_Hrd_5.00
0.30000000,4.00000000,0.95,xxx0.30_Hrd_4.00
What's happening
dlmread() is reading the file in as numeric data and returning a numeric matrix. It doesn't recognize the text in your header line, so it silently converts that row to all zeros. (IMHO this is a design flaw in dlmread.) Remove the header line.
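For example, a minimal fix (assuming the header is a single row) is to tell dlmread to start reading at the second row via its row/column offset arguments. The text in the fourth column will still come back as zeros, but the script only uses columns 1-3:

t = dlmread('C:\Map3D.csv', ',', 1, 0);  % skip row 0 (the header line), start at column 0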
How to debug this
So, you've got some zeros in your plot that you didn't expect to be there? Check for zeros in your input data:
ixZerosX = find(tx == 0)
ixZerosY = find(ty == 0)
ixZerosZ = find(tz == 0)
The semicolons are omitted intentionally there to get Octave to automatically display the results.
Better yet, since doubles are an approximate type, and the values might be close to but not actually zero, do a "near zero" search:
threshold = 0.1;
ixZerosX = find(abs(tx) < threshold)
ixZerosY = find(abs(ty) < threshold)
ixZerosZ = find(abs(tz) < threshold)
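Once the bogus rows are located, they can be dropped before triangulating. A minimal sketch, assuming the zeros come only from the converted header row:

ixBad = find(tx == 0 & ty == 0);  % rows produced by the header line
tx(ixBad) = [];
ty(ixBad) = [];
tz(ixBad) = [];
tri = delaunay(tx, ty);           % re-triangulate without the spurious point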

JSON Parse error: Unrecognized token '<'

I am getting this error (as per Safari's Web Inspector) but I cannot see why. Most reports of this error suggest that it is reading an HTML tag somewhere ... but I cannot see it.
var oReq = new XMLHttpRequest(); // New request object
oReq.onload = function() {
    document.getElementById("myConsole").innerHTML = this.responseText;
    myData = JSON.parse(this.responseText);
    ...
The innerHTML line dumps the responseText onto my webpage (into a DIV called 'myConsole'). This shows what I believe to be standard JSON ... and contains no '<' characters.
The JSON.parse line tries to parse the responseText and gives the '<' token error.
The PHP data source looks like this:
$rowCount = 0;
do {
    $rowCount += 1;
    $dbCurrentRow = $resultSet->fetch_assoc();
    $seats[$rowCount]['room'] = $dbCurrentRow['Room'];
    $seats[$rowCount]['seat'] = $dbCurrentRow['Seat'] * 1;
    $seats[$rowCount]['x'] = $dbCurrentRow['x'] * 1;
    $seats[$rowCount]['y'] = $dbCurrentRow['y'] * 1;
    $seats[$rowCount]['name'] = "Joe Bloggs";
    $seats[$rowCount]['adno'] = "01234";
    $seats[$rowCount]['ev6'] = true;
    $seats[$rowCount]['eal'] = true;
    $seats[$rowCount]['dpLast'] = "LS";
    $seats[$rowCount]['dpCurrent'] = "WA";
    $seats[$rowCount]['dpTarget'] = "TG";
    $seats[$rowCount]['ma'] = 2 * 1;
} while ($rowCount < $resultSet->num_rows);
echo json_encode($seats);
and the JSON output is this:
{"1":{"room":"35","seat":1,"x":0,"y":0,"name":"Joe
Bloggs","adno":"01234","ev6":true,"eal":true,"dpLast":"LS","dpCurrent":"WA","dpTarget":"TG","ma":2},"2":{"room":"35","seat":2,"x":30,"y":60,"name":"Joe
Bloggs","adno":"01234","ev6":true,"eal":true,"dpLast":"LS","dpCurrent":"WA","dpTarget":"TG","ma":2},"3":{"room":"35","seat":3,"x":60,"y":0,"name":"Joe
Bloggs","adno":"01234","ev6":true,"eal":true,"dpLast":"LS","dpCurrent":"WA","dpTarget":"TG","ma":2},"4":{"room":"35","seat":4,"x":90,"y":90,"name":"Joe
Bloggs","adno":"01234","ev6":true,"eal":true,"dpLast":"LS","dpCurrent":"WA","dpTarget":"TG","ma":2}}
I do not believe it to be a server timing issue, since the 'myConsole' dump precedes the error and works fine. The JSON does not look faulty, even with a 2D array. The strange thing is that if I take the JSON output, save it as 'testDataSample.php', and link my main page directly to it, the same output works flawlessly.
//oReq.open("get", "testDataSample.php", false); //Text JSON output works fine
oReq.open("get", "getData.php", false); // Live from Server ... '<' error
oReq.send();
Any suggestions as to what is wrong, or how I would track this down would be most welcome.
Thank you.
Thank you raghav710 :-)
The console log showed it: I had some comments at the top of the dataSource.php file which were being included in the echo.
When written to my web page they were ignored and invisible, which means I could not see them and could not see the difference between the two outputs; parsing the comments along with the JSON caused the choke.
I have removed all of the comments at the top of my data source PHP file and it works instantly.
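For anyone hitting the same thing: a quick way to expose characters that an innerHTML dump swallows is to log the raw response with its markup escaped (a small sketch, placed inside the onload handler):

// Logs the start of the raw response with markup shown escaped, so stray
// output such as an HTML comment before the JSON becomes visible.
console.log(JSON.stringify(this.responseText.substring(0, 100)));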
Thank you again.

qmplot error Error in unlist(.all_aesthetics[1:42]) : object '.all_aesthetics' not found

I'm trying to plot points on a map using the ggmap and ggplot2 libraries. I'm successful using get_map to prepare the map, then ggmap to plot it ... although I'm only able to plot ~80 coordinate points before I get an error that I'm exceeding the Google Maps API limit of 2048 characters. Does this limit seem correct/expected?
Moving on, I am trying the qmplot and qmap commands to (hopefully) overcome this constraint.
I'm successful with the qmap command; I'm using:
qmap("austin", zoom = 11, source = "google", maptype = "roadmap", scale = 2)
to create the map.
NOT successful with the qmplot command. I'm using:
qmplot(coord$lon, coord$lat, data = coord)
coord is a data frame with lat/lon pairs.
I get the error: Error in unlist(.all_aesthetics[1:42]) :
object '.all_aesthetics' not found
I haven't been able to find (via Google) anything about this error.
To check myself, I tried running the example code from pages 47 and 48 of the ggmap reference manual (https://cran.r-project.org/web/packages/ggmap/ggmap.pdf), example at the top of page 47:
violent_crimes <- subset(crime,
    offense != "auto theft" &
    offense != "theft" &
    offense != "burglary"
)
qmplot(lon, lat, data = violent_crimes, colour = offense,
    size = I(3.5), alpha = I(.6), legend = "topleft")
Preparing the violent crimes subset (using a built-in dataset) works fine. The qmplot command results in the same error message that I'm getting with my code.
It's a bug that has since been addressed; see the ggmap GitHub repository. Installing the development version fixes it:
devtools::install_github("dkahle/ggmap")
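A minimal sketch of the whole sequence (assuming devtools is already installed); restart R before reloading so the patched package is picked up:

devtools::install_github("dkahle/ggmap")  # development version containing the fix
# restart R, then:
library(ggmap)
qmplot(lon, lat, data = violent_crimes, colour = offense)  # should now plot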

Unable to delete file through function in Scilab

I have written the following code to plot a graph with the data present in the 'datafile'. After the graph has been plotted, I want to delete the file.
function plot_torque(datafile)
    // This will call a datafile and plot the graph of net_torque vs. time
    verbose = 1;
    // Columns to plot
    x_col = 1;
    y_col = 2;
    // Open the datafile
    file1 = file('open', datafile, 'old');
    data1 = read(file1, -1, 4);
    time = data1(:, x_col);
    torque = data1(:, y_col);
    plot(time, torque, '.-b');
    xtitle("Torque Generated vs. Time", "Time(s)", "Torque Generated(Nm/m)");
    file('close', file());
    //%________________%
endfunction
In the place that I have marked as //%________________% I have tried
deletefile(datafile);
and
mdelete(datafile);
Neither of them has worked.
I have set the working directory to the folder where the above '.sci' file and the 'datafile' are present. I am using Scilab 5.4.1.
You probably left the file open. Try this:
fil="d:\Attila\PROJECTS\Scilab\Stackoverflow\file_to_delete.txt"; //change it!
fprintfMat(fil,rand(3,3),"%.2g"); //fill with some data
fd=mopen(fil,"r"); //open
//do something with the file
mclose(fd); //close
//if you neglect (comment out) this previous line, the file remains open,
//and scilab can not delete it!
//If you made this "mistake", first you should close it by executing:
// mclose("all");
//otherwise the file remains open until you close (and restart) Scilab!
mdelete(fil); //this works for me
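Applied to the original function, the equivalent change is to close the specific unit returned by file('open', ...) before deleting; a sketch of what would go at the //%________________% marker:

file('close', file1);   // close the unit you opened with file('open', ...)
mdelete(datafile);      // deletion should now succeed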

HTML5 File API - slicing or not?

There are some nice examples about file uploading at HTML5 Rocks, but there is something that isn't clear enough for me.
As far as I can see, the example code about file slicing gets a specific part of the file and then reads it. As the note says, this is helpful when we are dealing with large files.
The example about monitoring uploads also notes this is useful when we're uploading large files.
Am I safe without slicing the file? I mean regarding server-side problems, memory, etc. Chrome doesn't support File.slice() currently, and I don't want to use a bloated jQuery plugin if possible.
Both Chrome and FF support File.slice(), but it has been prefixed as File.webkitSlice() / File.mozSlice() since its semantics changed some time ago. There's another example of using it here to read part of a .zip file. The new semantics are:
Blob.webkitSlice(
    in long long start,
    in long long end,
    in DOMString contentType
);
Are you safe without slicing it? Sure, but remember you're reading the file into memory. The HTML5Rocks tutorial offers chunking the upload as a potential performance improvement. With some decent server logic, you could also do things like recovering from a failed upload more easily. The user wouldn't have to re-try an entire 500MB file if it failed at 99% :)
This is one way to slice the file and pass the pieces as blobs:
function readBlob() {
    var files = document.getElementById('files').files;
    var file = files[0];
    var ONEMEGABYTE = 1048576;
    var start = 0;
    var stop = ONEMEGABYTE;
    var remainder = file.size % ONEMEGABYTE;
    var blkcount = Math.floor(file.size / ONEMEGABYTE);
    if (remainder != 0) blkcount = blkcount + 1;
    // Use the unprefixed slice if available, otherwise the vendor-prefixed one
    var sliceFn = file.slice || file.webkitSlice || file.mozSlice;
    for (var i = 0; i < blkcount; i++) {
        var reader = new FileReader();
        // The last block may be smaller than one megabyte
        if (i == (blkcount - 1) && remainder != 0) {
            stop = start + remainder;
        }
        // Slicing the file
        var blob = sliceFn.call(file, start, stop);
        reader.readAsBinaryString(blob);
        start = stop;
        stop = stop + ONEMEGABYTE;
    } // End of loop
} // End of readBlob
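To actually send each chunk to the server, you could POST the blob as soon as it is sliced. A minimal sketch, where the /upload endpoint and the chunk query parameter are hypothetical and the server is assumed to reassemble chunks by index:

function uploadChunk(blob, index) {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/upload?chunk=" + index, true); // hypothetical endpoint
    xhr.send(blob);                                   // the browser sends the blob bytes
}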