Using MySQL from Perl in an Azure web app

I am using Azure to host a site which uses Perl and MySQL. I have placed all my Strawberry Perl files in the bin folder. I connect to Perl by adding a handler that points to the Perl executable, which lives in my web app at d:\home\site\wwwroot\bin\perl\bin\perl.exe.
I can run simple Perl programs OK. However, when I try to run a Perl script that connects to a MySQL database, I get the following error:
install_driver(mysql) failed: Can't load 'D:/home/site/wwwroot/bin/perl/vendor/lib/auto/DBD/mysql/mysql.xs.dll' for module DBD::mysql: load_file:The specified module could not be found at D:/home/site/wwwroot/bin/perl/lib/DynaLoader.pm line 193.
However, mysql.xs.dll does exist at that location. I have also copied libmysql_.dll, which I read somewhere is a dependency of mysql.xs.dll, into the same folder as mysql.xs.dll.
The data source I am trying to connect to looks like this:
my $data_source = "DBI:mysql:$database:$hostname";
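The surrounding connect call is just the stock DBI pattern, roughly like this (the values below are placeholders for my real settings):

use strict;
use warnings;
use DBI;

# Placeholder connection details
my ($database, $hostname) = ('mydb', 'examplehost');
my ($username, $password) = ('dbuser', 'secret');

my $data_source = "DBI:mysql:$database:$hostname";
my $dbh = DBI->connect($data_source, $username, $password)
    or die "Connection failed: $DBI::errstr";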
I am using the same code and binary files with a Perl handler in my local IIS, so I think all the files needed are present in my bin folder.
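One thing I could still try is loading the DLL directly with DynaLoader to surface the underlying loader error; a rough sketch (the path is copied from the error above):

use strict;
use warnings;
use DynaLoader;

# Try to load the XS DLL directly and print the loader's own error text.
# If this also fails, a dependency of mysql.xs.dll (such as the MySQL client
# DLL) is probably what Windows cannot find, rather than mysql.xs.dll itself.
my $dll    = 'D:/home/site/wwwroot/bin/perl/vendor/lib/auto/DBD/mysql/mysql.xs.dll';
my $handle = DynaLoader::dl_load_file($dll, 0);
print $handle ? "loaded OK\n" : 'load failed: ' . DynaLoader::dl_error() . "\n";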
What else can I try?

Related

File does not exist when running MySQL database migrations on Windows

I cloned an existing repository (created by a team in my office that deals with subscriptions in an app we are working on) which has some database migration files under the path ..\internal\db\migrations (this is the migration files' path).
First of all I ran docker compose up for the existing docker.yaml, then I ran go build and then go run . .
I debugged, and the app reaches the point where it is about to run the migration files; then it displays this error:
Failed to initialize App. Error: first D:\subscription-store: file does not exist
although I checked the paths while debugging and they are correct, and the migration files all exist.
I am using Visual Studio Code as an editor, Go version 1.15, Docker and MySQL, and I am running on Windows 10.
After debugging and searching, it turned out that the repository uses some paths to load the migration files from the local drive. The paths in the code base were written for Mac, and I had cloned the repository onto a Windows machine, so they didn't work.
The error specifically happened in the call to
migrate.NewWithDatabaseInstance(
    fmt.Sprintf("file://%s", fullPath),
    "mysql",
    driver,
)
The generated path for the first parameter was
file//d:\\subscription-store\\....\\db\\migrations
And this is wrong on Windows, as the drive letter d: is not supported in that path.
It was resolved as follows:
"file:///"+"subscription-store\\....\\db\\migrations"
When the above URL was passed to the function instead of the old one, it worked successfully.

xxx.mex: failed to load: No such file or directory

For some time, I've been using some .mex files I created. Now I have a new computer. I copied all the files over and reinstalled Cygwin and Octave. When I try to execute any of the .mex routines I get a message like:
error: testm: /cygdrive/c/A/Cwin/...../quad.mex: failed to load: No such file or directory
The file is definitely there and I'm having no trouble loading .m files from the same directory. It does not say there is anything wrong with the file. I'm guessing this is some sort of configuration problem. I am running Octave 4.2.1. When I start it, I get the following message:
Octave was configured for "x86_64-unknown-cygwin".
Could that have something to do with it? I think I'm developing x64 paranoia, since all my Excel .dll macros no longer work either. Thanks.

How to access the repo directory in a Perl program in OpenShift?

I hit a 404 error when I use the following code in index.pl (I have only created one Perl app in my account):
my $hello_link = "$ENV{'OPENSHIFT_REPO_DIR'}hello.pl";
Click <a href="$hello_link"><font color="FF00CC">here</font></a> to run hello world cgi.</br>
When I go to my web app autortl-rdlgen.rhcloud.com and click on "here" to run the hello program, the error message shows:
Not Found
The requested URL /var/lib/openshift/550f4df1fcf933b971000030/app-root/runtime/repo/hello.pl was not found on this server.
I did upload hello.pl to my repo directory and can find it after SSHing to my account:
[autortl-rdlgen.rhcloud.com repo]> ls -tr hello.pl
hello.pl
You need to use a relative link like "./hello.pl" instead of the link built from the ENV variable, which exposes the complete filesystem path to your Perl file.
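For example, a minimal CGI sketch that uses the relative href instead of the filesystem path (the surrounding markup is only illustrative):

#!/usr/bin/perl
use strict;
use warnings;

# Relative link; the web server resolves it against the current URL,
# not against the filesystem path of the repo directory.
my $hello_link = "./hello.pl";

print "Content-type: text/html\n\n";
print qq{Click <a href="$hello_link"><font color="FF00CC">here</font></a> to run hello world cgi.<br/>};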
You may want to check out some Perl tutorials to help you get started with your Perl web programming: http://www.tutorialspoint.com/perl/perl_cgi.htm

When running a working script in RamDebugger, it refuses to "load" some DLL?

My script (MyScript.tcl) includes this line:
load MyTclBridge.dll
And when I run it this way:
tclsh MyScript.tcl
It runs ok, but when I use RamDebugger to run MyScript.tcl, it stops with this error:
couldn't load library "MyTclBridge.dll":
this library or a dependent library could not be found in library path
while executing
"load MyTclBridge.dll"
("after" script)
MyTclBridge.dll is located in C:\Windows\System32. How can I run my script with the debugger?
With problems like this I usually start with Dependency Walker, as it will show you which other DLLs MyTclBridge.dll relies on. You can then use the global env array that Tcl maintains to see the actual PATH your script has when running under RamDebugger - so check that MyTclBridge.dll and all of its dependencies are on that PATH.

Installing JSON.pm on a web host without shell access

My host (iPage) does not have JSON.pm installed. I don't want to use the modules they do have installed (XML) in order to transfer data from a CGI script back to a web page. Is there any way I can use JSON without them installing it for Perl?
The reason I ask is that when I downloaded the JSON zip I noticed I had to run a makefile command for JSON.pm, but I don't have access to a Unix shell or an SSH terminal.
If your Perl is new enough, 5.14 and up, it will come with JSON::PP, a pure Perl implementation of the JSON parser. Confusingly it does not come with JSON.pm. So try use JSON::PP and see if it works.
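For example, a minimal sketch (JSON::PP's encode_json and decode_json behave like the JSON module's functions of the same name):

use strict;
use warnings;
use JSON::PP qw(encode_json decode_json);

# Round-trip a Perl hash through JSON text.
my $json_text = encode_json({ name => 'widget', count => 3 });
my $data      = decode_json($json_text);
print "$data->{name}: $data->{count}\n";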
Otherwise, follow Ilmari's instructions. If you switch to a host with shell access, you can use local::lib to manage CPAN modules.
You should be able to install a local copy of the pure Perl version of the JSON module without shell access. Just download the .tar.gz archive to your own computer, unpack it and copy everything under the lib subdirectory to a suitable location on your webhost.
You'll also need to tell Perl where to find the module, for which you need to know the filesystem path to which you copied the module. For example, if you copied the contents of the lib directory to /home/username/perl-lib on your webhost, then you would include in your code the lines:
use lib '/home/username/perl-lib';
use JSON;
Depending on how your webhost is configured, you might also be able to use $ENV{HOME} to obtain the path to your home directory, so that you can write:
use lib "$ENV{HOME}/perl-lib";
use JSON;
or you could try using the FindBin module to find the path to the directory containing your script, and locate the lib directory from there (see the example in the FindBin documentation).
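For instance, if you copied the modules into a perl-lib directory next to your script (the directory name here is just an assumption), a FindBin-based version would look roughly like this:

use FindBin;
use lib "$FindBin::Bin/perl-lib";   # perl-lib sits alongside this script (assumed layout)
use JSON;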