OK, so I am trying to get GraphViz working on a MediaWiki installation. I am using Windows 2012 R2 with WAMPserver.
I have installed GraphViz on the C: drive; WAMP is on the D: drive.
I have added the GraphViz extension as advised on the extension's page and changed the settings file to point to my GraphViz installation. This is what I have in my settings file:
public $createCategoryPages;

/**
 * Constructor for setting configuration variable defaults.
 */
public function __construct() {
    // Set execution path
    if ( stristr( PHP_OS, 'WIN' ) && !stristr( PHP_OS, 'Darwin' ) ) {
        $this->execPath = 'C:\Program Files (x86)\GraphViz\bin';
    } else {
        $this->execPath = '/usr/bin/';
    }
    $this->mscgenPath = '';
    $this->defaultImageType = 'png';
    $this->createCategoryPages = 'no';
    }
}
But when I insert the GraphViz code, for example:
=== Example 1 from http://www.mediawiki.org/wiki/Extension:GraphViz ===
<graphviz border='frame' format='png' desc='none'>
digraph example1 { Hello -> "World!" }
</graphviz>
It doesn't show anything. Anyone know what I have done wrong?
That's not how extensions are configured. You shouldn't have any constructor or class definitions in your LocalSettings.php.
Do you get any error messages? Make sure you've got things set up for debugging.
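For example, a minimal debugging setup in LocalSettings.php might look like this (a sketch using standard MediaWiki debug settings; don't leave these enabled in production):

// Show full exception details and PHP errors while debugging.
$wgShowExceptionDetails = true;
error_reporting( -1 );
ini_set( 'display_errors', 1 );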
You probably want something like:
wfLoadExtension( 'GraphViz' );
$wgGraphVizSettings->execPath = 'C:\Program Files (x86)\GraphViz\bin';
I have some working Terraform definitions within a larger project:
resource "google_storage_bucket" "owlee_functions_bucket" {
name = "owlee_functions_bucket"
location = "europe-west2"
project = "owlee-software"
}
resource "google_storage_bucket_object" "archive" {
name = "index.zip"
bucket = google_storage_bucket.owlee_functions_bucket.name
source = "../apps/backend/dist/index.zip"
}
resource "google_cloudfunctions_function" "backend_function" {
name = "backend_function"
runtime = "nodejs16"
project = "owlee-software"
region = "europe-west2"
available_memory_mb = 128
source_archive_bucket = google_storage_bucket.owlee_functions_bucket.name
source_archive_object = google_storage_bucket_object.archive.name
trigger_http = true
entry_point = "OWLEE"
}
Then I'm trying to deploy via CI. For now, I'm just running terraform apply after zipping up the new version of the function to handle deployment.
It's not great, and I'd ideally like to change that to a non-Terraform process, but that doesn't seem to be documented or possible anywhere, which makes me think I have the wrong approach with this.
The second issue, which is more urgent to solve:
I want to continue managing my infrastructure locally for now, and I do not want to have to zip up a new version of the function every time I run terraform apply locally.
Is there a way, after its creation, to avoid overwriting/uploading the function via Terraform?
I'm guessing this would be somewhat necessary for the CI deployment to work anyway.
I've looked at a handful of other SO threads, but they were looking at specifics around Cloud Build and the Artifact Registry.
I recommend deploying the cloud function with Terraform, but having the CI of the cloud function maintained by a Cloud Build (also created by Terraform). I think this is the most logical split, since Terraform manages the infrastructure, not the implementation of the cloud function.
Instead of using a fixed archive name as you are, use a random string or, depending on your needs, the commit hash, for example. This can be prefixed with other things to make it even more unique.
resource "random_string" "function" {
length = 8
special = false
keepers = {
commit_hash = var.commit_hash,
environment = var.environment,
}
}
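The var.commit_hash and var.environment references assume variable declarations along these lines (a sketch; the CI pipeline would pass the hash in, e.g. terraform apply -var="commit_hash=$GIT_SHA"):

variable "commit_hash" {
  type        = string
  description = "Commit hash of the function build, passed in from CI"
}

variable "environment" {
  type        = string
  description = "Deployment environment, e.g. prod or local"
}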
resource "google_storage_bucket_object" "archive" {
name = "index.zip"
bucket = google_storage_bucket.owlee_functions_bucket.name
source = "../apps/backend/dist/${random_string.function.result}.zip"
}
resource "google_cloudfunctions_function" "backend_function" {
name = "backend_function"
runtime = "nodejs16"
project = "owlee-software"
region = "europe-west2"
available_memory_mb = 128
source_archive_bucket = google_storage_bucket.owlee_functions_bucket.name
source_archive_object = google_storage_bucket_object.archive.name
trigger_http = true
entry_point = "OWLEE"
}
This way, if you provide an environment such as prod and the same commit hash every time, it will create the same zip file.
If you provide a new environment, say local, it will generate a new zip. You can then create multiple instances of the function, or make further changes to google_cloudfunctions_function so that it can be used with workspaces.
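If you go the workspace route, one option (a sketch, not part of the original setup) is to derive the environment keeper from the active workspace rather than a variable:

# Hypothetical: tie the keeper to the current workspace, so
# "terraform workspace select local" produces its own archive.
locals {
  environment = terraform.workspace
}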
I have 2 .NET Core 2.0 console applications. The first application calls the second one via System.Diagnostics.Process.Start(). Somehow the second app is inheriting the development configuration information located in the appsettings.development.json of the first app.
I execute the first app by running either dotnet run in the root of the project or dotnet firstapp.dll in the folder where the DLL exists. This is started from PowerShell.
Both apps are in separate directories, and I'm not sure how this is happening.
UPDATE WITH CODE
The apps reside in
C:\Projects\ParentConsoleApp
C:\Projects\ChildConsoleApp
This is how I call the app from parent application:
System.Diagnostics.Process.Start("dotnet", "C:\\projects\\ChildConsoleApp\\bin\\Debug\\netcoreapp2.0\\publish\\ChildConsoleApp.dll" + $" -dt {DateTime.Now.Date.ToString("yyyy-MM-dd")}");
This is how I load the configuration from JSON (this is the same in both apps):
class Program
{
    private static ILogger<Program> _logger;
    public static IConfigurationRoot _configuration;
    public static IServiceProvider Container { get; private set; }

    static void Main(string[] args)
    {
        RegisterServices();
        _logger = Container.GetRequiredService<ILogger<Program>>();
        _logger.LogInformation("Starting GICMON Count Scheduler Service");
        Configure();

        // At this point DBContext has value from parent! :(
        var repo = Container.GetService<ICountRepository>();
        var results = repo.Count(_configuration.GetConnectionString("DBContext"), args[0]);
    }

    private static void Configure()
    {
        string envvar = "DOTNET_ENVIRONMENT";
        string env = Environment.GetEnvironmentVariable(envvar);
        if (String.IsNullOrWhiteSpace(env))
            throw new ArgumentNullException("DOTNET_ENVIRONMENT", "Environment variable not found.");
        _logger.LogInformation($"DOTNET_ENVIRONMENT environment variable value is: {env}.");

        var builder = new ConfigurationBuilder().SetBasePath(Directory.GetCurrentDirectory()).AddJsonFile("appsettings.json");
        if (!String.IsNullOrWhiteSpace(env)) // environment == "Development"
        {
            builder.AddJsonFile($"appsettings.{env}.json", optional: true);
        }
        _configuration = builder.Build();
    }

    private static void RegisterServices()
    {
        var services = new ServiceCollection();
        services.AddSingleton<ILoggerFactory, LoggerFactory>();
        services.AddSingleton(typeof(ILogger<>), typeof(Logger<>));
        services.AddLogging((builder) => builder.SetMinimumLevel(LogLevel.Trace));

        var serviceProvider = services.BuildServiceProvider();
        var loggerFactory = serviceProvider.GetRequiredService<ILoggerFactory>();
        loggerFactory.AddNLog(new NLogProviderOptions { CaptureMessageTemplates = true, CaptureMessageProperties = true });
        loggerFactory.ConfigureNLog("nlog.config");
        Container = serviceProvider;
    }
}
The problem is caused by the fact that you set the base path for the configuration builder to the current working directory:
var builder = new ConfigurationBuilder().SetBasePath(Directory.GetCurrentDirectory()).AddJsonFile("appsettings.json");
When you create a child process, it inherits the current directory from the parent process (unless you set the current directory explicitly).
So the child process basically uses the JSON configs from the directory of the parent process.
There are several possible fixes:
Do not set the base path to the current directory.
When the application is launched, you don't know for sure that the current directory will match the directory where the application binaries are placed.
If you have an exe file at c:\test\SomeApp.exe and launch it from the command line while the current directory is c:\, then the current directory of your application will be c:\. In this case, if you set the base path of the configuration builder to the current directory, it will not be able to load the configuration files.
By default, the configuration builder loads config files from AppContext.BaseDirectory, which is the directory where the application binaries are placed. This should be the desired behavior in most cases.
So just remove the SetBasePath() call:
var builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
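If you prefer to be explicit rather than rely on the default, pinning the base path to AppContext.BaseDirectory is equivalent (a sketch):

var builder = new ConfigurationBuilder()
    .SetBasePath(AppContext.BaseDirectory)
    .AddJsonFile("appsettings.json");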
If for some reason you want to set the base path of the configuration builder to the current directory, then you should set the correct current directory for the launched child process:
var childDllPath = "C:\\projects\\ChildConsoleApp\\bin\\Debug\\netcoreapp2.0\\publish\\ChildConsoleApp.dll";
var startInfo = new ProcessStartInfo("dotnet", childDllPath + $" -dt {DateTime.Now.Date.ToString("yyyy-MM-dd")}")
{
WorkingDirectory = Path.GetDirectoryName(childDllPath),
};
Process.Start(startInfo);
As @CodeFuller explained, the reason is that both apps read the same appsettings.{env}.json file. For simplicity, you may just rename the config file (and the corresponding name in .AddJsonFile) for the second app to prevent any possible overrides.
When you register a JSON file as a configuration source with .AddJsonFile, the configuration API allows you to use whatever file name you need; you are not forced to use the same $"appsettings.{env}.json" pattern for both applications.
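A minimal sketch of that approach for the child app (the file names here are purely illustrative):

// Give the child app its own config file names so it can never pick up
// the parent's appsettings.*.json from an inherited working directory.
var builder = new ConfigurationBuilder()
    .AddJsonFile("childapp.settings.json")
    .AddJsonFile($"childapp.settings.{env}.json", optional: true);
_configuration = builder.Build();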
I'm using the TexturePacker implemented by libGDX to load my sprites.
For some reason, however, the files are not found and it gives me this exception:
Exception in thread "main" java.lang.RuntimeException: Error packing images.
at com.badlogic.gdx.tools.texturepacker.TexturePacker.process(TexturePacker.java:620)
at com.zebleck.OneRoom.desktop.DesktopLauncher.processSprites(DesktopLauncher.java:35)
at com.zebleck.OneRoom.desktop.DesktopLauncher.main(DesktopLauncher.java:17)
Caused by: java.lang.IllegalArgumentException: Input file does not exist: C:\Users\Kontor\Desktop\Codeporn\LibGDX-workspace\OneRoom\desktop\sprites\input
at com.badlogic.gdx.tools.FileProcessor.process(FileProcessor.java:117)
at com.badlogic.gdx.tools.texturepacker.TexturePackerFileProcessor.process(TexturePackerFileProcessor.java:70)
at com.badlogic.gdx.tools.texturepacker.TexturePacker.process(TexturePacker.java:618)
... 2 more
This code is causing the error:
public static void main (String[] arg) {
    LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
    config.width = 800;
    config.height = 800;

    deleteFiles();
    processSprites();

    new LwjglApplication(new OneRoom(), config);
}

public static void deleteFiles() {
    File outputDir = new File("../android/assets/sprites/output");
    File[] listFiles = outputDir.listFiles();
    if (listFiles != null && listFiles.length > 0) {
        for (File file : listFiles) {
            file.delete();
        }
    }
}

public static void processSprites() {
    TexturePacker.Settings settings = new TexturePacker.Settings();
    //System.out.println(Gdx.files.internal("sprites/input/player.png").toString());
    TexturePacker.process(settings, "sprites/input", "sprites/output", "pack"); // THIS LINE CAUSES THE ERROR
}
I also got the EXACT same code in another project and it works just fine. I haven't found any differences in the project properties yet.
Make sure the sprites actually exist in that directory.
It sounds patronising, but I was having the same issue, and in my case I was being misled by the assets directory in my desktop project being a "Linked Folder" that was actually just a reference to the assets folder of my core project. So in Eclipse the folder is there and it looks like there should be no problem, but looking through Windows file explorer it was clear the files didn't actually exist at that location.
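A quick way to confirm where the packer is actually looking is to print the resolved input path before processing (a plain java.io sketch):

// "sprites/input" is resolved against the working directory of the
// desktop launcher, so print what the JVM actually sees.
File inputDir = new File("sprites/input");
System.out.println(inputDir.getAbsolutePath() + " exists: " + inputDir.exists());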
My fix was to change the input and output paths to step back into the core directory instead of the desktop one.
So instead of:
TexturePacker.process(settings, "sprites/input", "sprites/output", "pack");
The following would work:
TexturePacker.process(settings, "../core/sprites/input", "../core/sprites/output", "pack");
Now I don't know your exact setup, but considering your code works in a different project, I would wager that the other project has the assets actually stored in the desktop directory, whereas this one stores the images in the core directory.
What I'm trying to do in SSIS is have a WMI Event Watcher Task which watches a folder for a file to be created, then does something with it. The primary part is the "watching the folder for file creation".
I have a network folder (full path): \\srvblah10\main\child\target\
All the sites I've gone to have this as an example:
SELECT * FROM __InstanceCreationEvent WITHIN 10
WHERE TargetInstance ISA "CIM_DirectoryContainsFile"
AND TargetInstance.GroupComponent = "Win32_Directory.Name=\"d:\\\\NewFiles\""
Since the folder is a network folder, I can't provide the physical disk letter. So is there a way to use a similar WQL query but for network folder paths as opposed to physical folder paths?
You have to map the drive with a DOS command:
net use s: \\srvblah10\main\child\target Pa$$word /user:dotnetN00b
Then you can use the WMI Event Watcher Task to watch it.
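Presumably the watcher query then targets the mapped drive letter instead of the UNC path. For example, if you instead mapped s: to \\srvblah10\main\child, a query following the pattern from the question might look like this (untested sketch):

SELECT * FROM __InstanceCreationEvent WITHIN 10
WHERE TargetInstance ISA "CIM_DirectoryContainsFile"
AND TargetInstance.GroupComponent = "Win32_Directory.Name=\"s:\\\\target\""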
I was trying to do this for a while, and finally gave up on trying to use the SSIS WMI Event Watcher task and just wrote the equivalent in a Script task. The challenge was getting the WMI Event Watcher to make the remote connection with specific user credentials that I wanted to obtain from a configuration section (not hard-code into the package).
The second issue that was going to make not using a script difficult was simply translating the network share into the local path name on the server, which the Event Watcher requires. You'll see from the script below that everything is accomplished with a minimum of effort.
Just an additional heads-up: make sure to include the DTS variables the script uses in the ReadOnlyVariables (as normal). The code below requires three DTS variables; for example, if you are trying to watch for files being dropped in \\copernicus\dropoff\SAP\Import, then you would set the variables as shown below:
User::FileServerName - the hostname of the server where the share lives (copernicus)
User::ShareName - the name of the network share (dropoff)
User::ImportPath - the directory path under the share to watch for new files in (\SAP\Import)
public void Main()
{
    string localPath = "";
    try
    {
        ConnectionOptions connection = new ConnectionOptions();
        connection.Username = "<valid username here>";
        connection.Password = "<password here>";
        connection.Authority = "ntlmdomain:<your domain name here>";

        ManagementScope scope = new ManagementScope(@"\\" + Dts.Variables["User::FileServerName"].Value.ToString() + @"\root\CIMV2", connection);
        scope.Connect();

        // Retrieve the local path of the network share from the file server
        string queryStr = string.Format("SELECT Path FROM Win32_Share WHERE Name='{0}'", Dts.Variables["User::ShareName"].Value.ToString());
        ManagementObjectSearcher mosLocalPath = new ManagementObjectSearcher(scope, new ObjectQuery(queryStr));
        foreach (ManagementObject elements in mosLocalPath.Get())
        {
            localPath = elements["Path"].ToString();
        }

        // The query requires each path separator to be a double backslash
        queryStr = string.Format(
            "SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA 'CIM_DirectoryContainsFile' AND TargetInstance.GroupComponent=\"Win32_Directory.Name='{0}{1}'\"",
            localPath.Replace(@"\", @"\\"),
            Dts.Variables["User::ImportPath"].Value.ToString().Replace(@"\", @"\\"));

        ManagementEventWatcher watcher = new ManagementEventWatcher(scope, new WqlEventQuery(queryStr));
        ManagementBaseObject eventObj = watcher.WaitForNextEvent();

        // Cancel the event subscription
        watcher.Stop();

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (ManagementException err)
    {
        Dts.Events.FireError((int)err.ErrorCode, "WMI File Watcher", "An error occurred while trying to receive an event: " + err.Message, String.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
    catch (System.UnauthorizedAccessException unauthorizedErr)
    {
        Dts.Events.FireError((int)ManagementStatus.AccessDenied, "WMI File Watcher", "Connection error (user name or password might be incorrect): " + unauthorizedErr.Message, String.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
Good afternoon,
I would like to create an application that can create folders and shortcuts to folders in the file system. The user will click a button and it will put a folder on their desktop that has shortcuts to files like \\server\folder1\folder2. Can you create a desktop shortcut with code in Adobe AIR? How would you do that? And how do you create a folder? I keep thinking this should be easy, but I keep missing it.
Thank you for your help, and sorry for the trouble,
Justin
If your deployment profile is Extended Desktop, you may be able to use NativeProcess and some simple scripts that you could package with your app. This approach would entail handling the functionality on a per-OS basis, which would take some work and extensive testing. However, I wanted to at least share a scenario that I verified does work. Below is a test case that I threw together:
Test Case: Windows 7
Even though the Adobe documentation says that it prevents execution of .bat files, apparently it doesn't prevent one from executing the Windows Scripting Host, wscript.exe. This means you can execute any JScript or VBScript files, and this is what you would use to write a command to create a shortcut in Windows (since Windows has no built-in command-line command to create shortcuts otherwise).
Here's a simple script that creates a shortcut, which I found on giannistsakiris.com (converted to JScript):
// File: mkshortcut.js
var WshShell = new ActiveXObject("WScript.Shell");
var oShellLink = WshShell.CreateShortcut(WScript.Arguments.Named("shortcut") + ".lnk");
oShellLink.TargetPath = WScript.Arguments.Named("target");
oShellLink.WindowStyle = 1;
oShellLink.Save();
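For reference, run directly from a command prompt, the script would be invoked something like this (paths are illustrative):

wscript.exe mkshortcut.js /target:"C:\some\folder" /shortcut:"C:\Users\me\Desktop\My Shortcut"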
If you package this in your application in a folder named utils, you could write a function to create a shortcut like so:
public function createShortcut(target:File, shortcut:File):void {
    if (NativeProcess.isSupported) { // Note: this is only true under the extendedDesktop profile
        var shortcutInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();

        // Location of the Windows Scripting Host executable
        shortcutInfo.executable = new File("C:/Windows/System32/wscript.exe");
        // Argument 1: script to execute
        shortcutInfo.arguments.push(File.applicationDirectory.resolvePath("utils/mkshortcut.js").nativePath);
        // Argument 2: target
        shortcutInfo.arguments.push("/target:" + target.nativePath);
        // Argument 3: shortcut
        shortcutInfo.arguments.push("/shortcut:" + shortcut.nativePath);

        var mkShortcutProcess:NativeProcess = new NativeProcess();
        mkShortcutProcess.start(shortcutInfo);
    }
}
If one wanted to create a shortcut to the Application Storage Directory on the Desktop, the following would suffice:
var targetLocation:File = File.applicationStorageDirectory;
var shortcutLocation:File = File.desktopDirectory.resolvePath("Shortcut to My AIR App Storage");
createShortcut(targetLocation, shortcutLocation);
Obviously there's a lot of work to be done to handle different OS environments, but this is at least a step.
As far as I know, the File class does not allow the creation of symbolic links. But you can create directories with createDirectory(): http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/filesystem/File.html#createDirectory%28%29
Check if this can be useful: http://www.mikechambers.com/blog/2008/01/17/commandproxy-net-air-integration-proof-of-concept/
AIR doesn't let you create shortcuts natively. Here's a workaround that works on Windows (it may work on Mac, but I don't have a machine to test).
Using AIR, create a file that contains the following plain text:
[InternetShortcut]
URL=C:\path-to-folder-or-file
Replace path-to-folder-or-file with your folder/file name
Save the file as test.url
Windows recognizes this file as a shortcut.
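A sketch of writing that file from AIR (the file name and target path are illustrative):

// Write a .url file to the desktop; Windows treats it as a shortcut.
var shortcut:File = File.desktopDirectory.resolvePath("test.url");
var stream:FileStream = new FileStream();
stream.open(shortcut, FileMode.WRITE);
stream.writeUTFBytes("[InternetShortcut]\r\nURL=C:\\path-to-folder-or-file\r\n");
stream.close();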
It is possible to coerce Adobe AIR into creating symbolic links, among other useful things, on a Mac. Here's how I did it:
You will need AIRAliases.js - Revision: 2.5
In the application.xml add:
<!-- Enables NativeProcess -->
<supportedProfiles>extendedDesktop desktop</supportedProfiles>
In the AIR app JavaScript:

// A familiar console logger
var console = {
    'log' : function(msg){ air.Introspector.Console.log(msg); }
};

if (air.NativeProcess.isSupported) {
    var cmdFile = air.File.documentsDirectory.resolvePath("/bin/ln");
    if (cmdFile.exists) {
        var nativeProcessStartupInfo = new air.NativeProcessStartupInfo();
        var processArgs = new air.Vector["<String>"]();
        nativeProcessStartupInfo.executable = cmdFile;
        processArgs.push("-s");
        processArgs.push("< source file path >");
        processArgs.push("< link file path >");
        nativeProcessStartupInfo.arguments = processArgs;
        nativeProcess = new air.NativeProcess();
        nativeProcess.addEventListener(air.NativeProcessExitEvent.EXIT, onProcessExit);
        nativeProcess.addEventListener(air.ProgressEvent.STANDARD_OUTPUT_DATA, onProcessOutput);
        nativeProcess.addEventListener(air.ProgressEvent.STANDARD_ERROR_DATA, onProcessError);
        nativeProcess.start(nativeProcessStartupInfo);
    } else {
        console.log("Can't find cmdFile");
    }
} else {
    console.log("Not Supported");
}

function onProcessExit(event) {
    var result = event.exitCode;
    console.log("Exit Code: " + result);
}

function onProcessOutput() {
    console.log("Output: " + nativeProcess.standardOutput.readUTFBytes(nativeProcess.standardOutput.bytesAvailable));
}

function onProcessError() {
    console.log("Error: " + nativeProcess.standardError.readUTFBytes(nativeProcess.standardError.bytesAvailable));
}
By altering the command and parameters passed to NativeProcess, you should be able to get real shortcuts on Windows too.
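For instance (an untested sketch reusing the mkshortcut.js approach from the earlier answer), the Windows variant of the block above might swap in:

// Hypothetical Windows variant: run mkshortcut.js via the Windows Scripting Host.
var cmdFile = new air.File("C:/Windows/System32/wscript.exe");
processArgs.push(air.File.applicationDirectory.resolvePath("utils/mkshortcut.js").nativePath);
processArgs.push("/target:C:\\some\\folder");
processArgs.push("/shortcut:C:\\Users\\me\\Desktop\\My Shortcut");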