BusinessWorks WriteFile not working

I'm new to BusinessWorks. Here is the basic flow of my application:
ReceiveMail
    Basic configuration. I'm positive this is working.
Flow (link)
    Checks for an attachment.
Log
    Outputs the attachment name, like: Email has attachment: application/xml; name="test.xml"
Flow (link)
    Checks whether it is the right attachment (currently "test.xml").
WriteFile
    Writes the file from $ReceiveMail/tns1:mimeEnvelopeElement/mimePart1/textContent
Log
    Outputs the filename from $WriteFile/fileInfo/fullName
So, given all that, here is my output:
12:23:22.551 INFO: Started BW Application [EmailTest.application:1.0]
12:23:53.058 INFO: Email has attachment: application/xml; name="test.xml"
12:23:53.062 INFO: File written: C:\temp\out\test.xml
This tells me that the path check for the "test.xml" attachment worked. It also tells me that WriteFile is returning what I would expect in $WriteFile/fileInfo/fullName.
What am I doing wrong?

So: I was using WriteActivityInputTextClass in the WriteFile activity. I changed "Write As" from text to binary, remapped the content, and now it works.
I discovered this by arbitrarily attaching a couple of HTML files to my email and removing the filename filter; those came through fine.
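For reference, a sketch of the remapping that worked. The binaryContent element is an assumption mirroring the textContent path from the question (based on BW's MIME part schema), not something taken from the original screenshots:

Write As      : binary
fileName      : C:\temp\out\test.xml (from the matched attachment name)
binaryContent : $ReceiveMail/tns1:mimeEnvelopeElement/mimePart1/binaryContent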

Solved - SSIS - Send mail on error task sends multiple mails

I built an SSIS package to import data from an Excel file. As the Excel file is required to follow a certain scheme, the import sometimes fails when the person creating the file does not stick to the scheme. In that case, I want to send a mail informing this person about the failure. I have tried different methods, but the Send Mail Task always sends 5 to 6 mails instead of one.
The basic structure of the package is a Foreach Loop Container with a single "Import File" task inside it.
The mail task is a simple "Send Mail Task" using an SMTP Connection Manager, without attachments, only a message.
My previous approaches were:
- Using an event handler: I tried Event Handler: OnError with the Executable set at the whole-package level, on the Foreach Loop Container, and on the single "Import File" task, as this is the one that fails.
- In the control flow: I connected the mail task on failure to the loop container and also to the single import file task.
- I also played with the "Delay Validation" property, setting it to True for the loop container and for the import file task.
What is the mistake I'm making, and how can I fix the package so it sends only one mail instead of 5-6? Thanks in advance!
Update: The problem kind of solved itself. I changed from a Send Mail Task to a Script Task that sends a mail with a log file, and this worked perfectly. I also changed the mail_from in the Send Mail Task from a mail container to a single address, and this also solved the problem. After that, I changed back to the mail container and it still works as it should. Seems to be a bug. Thanks all.
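A note on the original symptom, for anyone who lands here: OnError events bubble up from the failing task through every enclosing container to the package, so a single failure can fire an OnError handler once per level, which matches the 5-6 mails observed. A common way to restrict the handler to one run is to set the system variable Propagate to False inside the event handler (Variables window, with system variables shown), roughly:

Name: Propagate    Scope: OnError    Data type: Boolean    Value: False

This stops the error event from re-firing the handler at the parent scopes.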

Problem downloading package from GitHub Actions results in mangled URL

In our GitHub Actions output for testing our custom R package (hosted on GitHub), we're seeing an error during execution:
Error in utils::download.file(url, path, method = method, quiet = quiet, :
cannot open URL 'https://api.github.com/repos/***/CirceR/contents/DESCRIPTION?ref=HEAD'
Calls: saveRDS ... github_DESCRIPTION -> download -> base_download -> base_download_headers
Execution halted
Error: Process completed with exit code 1.
The main repo is here: https://github.com/OHDSI/CohortGenerator
The github actions report is here: https://github.com/OHDSI/CohortGenerator/runs/3294257207?check_suite_focus=true
The referenced package CirceR is found here: https://github.com/ohdsi/circer
Our main question: is it normal for the requested URL for the DESCRIPTION file to be masked with the *** as in: cannot open URL 'https://api.github.com/repos/***/CirceR/contents/DESCRIPTION?ref=HEAD'?
If we change the *** to the actual organization (OHDSI) for this URL, the request works, so is it possible the URL is being mangled?
We've tested loading each individual package locally and this error doesn't occur, so we think it's localized to github actions.
I believe the output is trying to shorten the string so you can see how it starts and how it ends, but not the middle, for readability. My issue was that I wanted the entire contents of the string (other errors, in repositories with shorter names, give the full URL). So I believe this truncation is by design.
To solve the underlying issue, I had to specify a GIT_PAT to use when invoking the API, and that cleared the error I was getting.
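For context, a minimal sketch of how such a token is usually wired into the workflow, assuming the remotes/pak convention of reading a GITHUB_PAT environment variable (the secret name GH_PAT is hypothetical):

jobs:
  R-CMD-check:
    runs-on: ubuntu-latest
    env:
      # Token used when the R session queries api.github.com for dependencies
      GITHUB_PAT: ${{ secrets.GH_PAT }}
    steps:
      - uses: actions/checkout@v2

With the token in place, the DESCRIPTION lookup against api.github.com is authenticated and the download no longer fails.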

Chrome doesn't show un-minified code despite a source map being present

I’m using Grunt and UglifyJS to generate source maps for my AngularJS app. It produces a file customDomain.js and customDomain.js.map.
JS file
Last line of customDomain.js looks like this:
//# sourceMappingURL=customDomain.js.map
Map file
I find two references to customDomain.js inside customDomain.js.map, one at the beginning:
"sources":["../../../.tmp/concat/scripts/customDomain.js"]
I think this looks weird so I trim it to:
"sources":["customDomain.js"]
The second reference is at the end:
"file":"customDomain.js"
...which I leave as it is.
Testing
When I run my app in Chrome I expect to see my development code when I click on customDomain.js, but I do not.
I can see in the console output from my web server that customDomain.js.map is indeed requested by the browser:
200 /js/customDomain.js.map (gzip)
What is missing?
"sources":["customDomain.js"] should be relative to the customDomain.map.js file.
Make sure they are in the same directory on your server if this is the case for you.
"file":"customDomain.js" should be changed to the name of the map file, in your case this would be "file":"customDomain.map.js".
Here's a map file example taken from treehouse (sourceRoot may be unnecessary in your case):
{
  "version": 3,
  "file": "script.js.map",
  "sources": [
    "app.js",
    "content.js",
    "widget.js"
  ],
  "sourceRoot": "/",
  "names": ["slideUp", "slideDown", "save"],
  "mappings": "AAA0B,kBAAhBA,QAAOC,SACjBD,OAAOC,OAAO..."
}

Invalid XML request for Calculator service

I'm completely new to Axis2/C. I've just downloaded and unpacked Axis2/C 1.6 for Windows (binary release), followed the installation instructions, and successfully started axis2_http_server.
Accessing the Calculator service's WSDL works fine, but any call to the service's add method returns "Invalid XML in request", and the same text is shown in the console window where axis2_http_server is running.
I've also tried soapUI. The request shown is:
<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:typ="http://ws.apache.org/axis2/services/Calculator/types">
  <soapenv:Header/>
  <soapenv:Body>
    <typ:add>
      <param_1>1.0</param_1>
      <param_2>1.0</param_2>
    </typ:add>
  </soapenv:Body>
</soapenv:Envelope>
The response is:
<soapenv:Fault>
  <faultcode>soapenv:Sender</faultcode>
  <faultstring>Invalid XML format in request</faultstring>
</soapenv:Fault>
The problem originates in calc.c (function axis2_calc_add()), where
seq_node = axiom_node_get_first_child(complex_node, env);
returns NULL.
The Calculator service example has multiple issues that prevent it from working.
First, the implementation of the add operation is invalid: it expects a request like this (showing only the contents of the SOAP body):
<typ:add>
  <complex_node>
    <seq_node>
      <param_1>1</param_1>
      <param_2>2</param_2>
    </seq_node>
  </complex_node>
</typ:add>
It looks like someone committed that code by mistake.
Second, the code in the Calculator service does not tolerate whitespace between request elements: it takes the first child node and assumes it is an element, but fails because it actually picks up the text node between the elements. A sketch of a whitespace-tolerant lookup is below.
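A minimal sketch of such a lookup, assuming the standard axiom node API; this helper is hypothetical and not part of the shipped calc.c:

#include <axiom.h>

/* Return the first child of `node` that is an element, skipping the
 * whitespace text nodes that indentation inserts between elements.
 * Hypothetical helper, not part of the original calc.c. */
static axiom_node_t *
get_first_element_child(axiom_node_t *node, const axutil_env_t *env)
{
    axiom_node_t *child = axiom_node_get_first_child(node, env);
    while (child && axiom_node_get_node_type(child, env) != AXIOM_ELEMENT)
    {
        child = axiom_node_get_next_sibling(child, env);
    }
    return child; /* NULL when no element child exists */
}

Using a helper like this in place of the bare axiom_node_get_first_child() calls would let the service accept indented requests.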
To run that example without modifying the service:
- use one of the sub, div, or mul operations;
- remove all whitespace inside the request element, like this:
<typ:sub><param_1>3</param_1><param_2>2</param_2></typ:sub>
Then you will be able to call the service.
If you want to see a fully working Calculator service, you can compile Axis2/C from the axis2-unofficial project (or install it from its binary archive). Alternatively, you can apply those changes to the original source code and recompile.

Perl HTML file upload issue. File has zero size

I have a working Perl CGI script that uploads a file from a PC to a Linux server.
It works exactly as intended when I write the call to the CGI in my own HTML form and then execute it, but when I put the same call into an existing application, the file is created on the server but does not get the data; it is size zero.
I have compared environment variables (those I can extract from %ENV) and nothing there looks like a cause. I even tried changing several of the ENV values in my own HTML script to the values the existing application was using, and this did not reveal the problem.
Nothing in the log gives me a clue, the upload operation thinks it was successful.
The user is the same for both tests. If permissions were an issue, then the file would not even be created on the server.
Results are the same in IE as in Chrome (works from my own HTML script, not from within the application).
What specific set up should I be looking at, to compare?
This is the upload code:
if (open(UPLOADFILE, ">$upload_dir/$fname")) {
    binmode UPLOADFILE;
    while (<$from_fh>) {
        print UPLOADFILE;
    }
    close UPLOADFILE;
    $out_msg = "Done with Upload: upload_dir=$upload_dir fname=$fname";
}
else {
    $out_msg = "ERROR opening for upload: upload_dir=$upload_dir filename=$filename";
}
I did verify that:
- it does NOT enter the while loop when running from inside the application;
- it does enter the while loop when called from my own HTML script;
- the value of $from_fh is the same for both runs;
- all values used in the block above are exactly the same for both runs.
You could check the error result of your open:
my $err;
open(my $uploadfile, ">", "$upload_dir/$fname") or $err = $!;
if (!$uploadfile) {
    my $out_msg = "ERROR opening for upload: upload_dir=$upload_dir filename=$filename: $err";
}
else {
    ### Stuff
    ...;
}
My guess, based on the fact that you are embedding the script in another application, is that all the input has already been read by some functionality that is part of the other application. For example, if I used this program as part of a CGI script and had already called the param() function from CGI.pm, then the entire file upload would have been read; if my own code then tried to read the upload again, it would receive zero bytes, because the data had already been consumed.
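A minimal sketch of the single-read pattern with CGI.pm, assuming that module is in play (the form field name 'userfile' and the destination directory are hypothetical):

use strict;
use warnings;
use CGI;

# Parse the request body exactly once and keep the object around;
# letting other code construct its own CGI object (or read STDIN)
# first is what leaves nothing behind for the upload loop.
my $q = CGI->new;

# upload() returns a filehandle for the uploaded content.
my $from_fh = $q->upload('userfile')
    or die "No upload received (body may already have been consumed)";

my $upload_dir = '/tmp/uploads';               # assumed destination
my $fname      = scalar $q->param('userfile'); # sanitize this in real code!

open my $uploadfile, '>', "$upload_dir/$fname" or die "open failed: $!";
binmode $uploadfile;
my $buf;
while (read($from_fh, $buf, 8192)) {
    print {$uploadfile} $buf;
}
close $uploadfile;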