HxCPP Add Include Paths and Linker Options - configuration

I'm trying to use an external C++ library in Haxe. I have an extern class with an @:include meta, but I can't figure out how to add the directory containing some headers to the HxCPP include path, or how to add linker options to extend the library path and link with the libraries. I assume it will involve using @:buildXml, and I have this:
#:buildXml("
<target id=\"default\">
<cppflag value=\"-I...\" />
<flag value=\"-L...\" />
<lib base="..." />
</target>
")
, but those flags don't show up in the logged g++ command. I can't find any documentation about adding include paths, or any real examples of build XML.
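For what it's worth, the usual shape of such an injection in hxcpp's Build.xml puts compiler flags in a <files> section and linker flags and libraries in the target. The sketch below is unverified and its paths, library name, and section ids are placeholders, not a known-good configuration:
@:buildXml("
<files id=\"haxe\">
<compilerflag value=\"-I/path/to/include\" />
</files>
<target id=\"haxe\">
<flag value=\"-L/path/to/lib\" />
<lib name=\"-lsomelib\" />
</target>
")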

Related

Scilab: where is optim function source file situated

I had some problems with understanding how the function loadwave(...) exactly works, so I found a file with its description here: /usr/share/scilab/modules/sound/macros/loadwave.sci using the find -name ... command.
Now I don't understand how the optim(...) function works, so I also want to find its source file, but I can't (I tried combinations of find and grep again). There are some demo files with examples of optim usage in the /usr/share/scilab/modules/optimization/demos/optim directory, but I still cannot find any source code of optim(...) itself, which as I understand should look somehow as follows:
function [...]=optim(...)
...
endfunction
Could you give me a tip, please?
Update:
For now I have only managed to find this in the file
/usr/share/scilab/modules/optimization/sci_gateway/optimization_gateway.xml:
<!DOCTYPE GATEWAY SYSTEM "../../functions/xml/gateway.dtd">
<GATEWAY name="optimization">
<!-- =================== -->
<!--
Scilab
Interface description. In this file, we define the list of the function which
will be available into Scilab and the link to the "native" function.
gatewayId is the position in the hashtable 'Interfaces' defined in the
file SCI/modules/core/src/c/callinterf.h
primitiveId is the position in the hashtable '<module>Table Tab[]' defined
in the file modules/<module>/sci_gateway/c/gw_<module>.c
primitiveName is the name of the Scilab function
===================
Don't touch if you do not know what you are doing
-->
<!-- =================== -->
<PRIMITIVE gatewayId="11" primitiveId="1" primitiveName="optim" />
<PRIMITIVE gatewayId="11" primitiveId="2" primitiveName="semidef" />
<PRIMITIVE gatewayId="11" primitiveId="3" primitiveName="fsolve" />
<PRIMITIVE gatewayId="11" primitiveId="4" primitiveName="lsqrsolve" />
<PRIMITIVE gatewayId="11" primitiveId="5" primitiveName="qld" />
<PRIMITIVE gatewayId="11" primitiveId="6" primitiveName="qp_solve" />
<PRIMITIVE gatewayId="11" primitiveId="7" primitiveName="readmps" />
</GATEWAY>
So in the Scilab git repo, the link to which was kindly given to me by user1149326 below, I have found the file scilab/modules/optimization/sci_gateway/c/sci_optim.c (http://gitweb.scilab.org/?p=scilab.git;a=blob;f=scilab/modules/optimization/sci_gateway/c/sci_optim.c;h=608f7dabe822fc6cfecb456e847f3b7373014322;hb=HEAD)
You can check out all Scilab sources at their git repository. More specifically, all optim sources are in the optimization module; see the src and macros folders. You can read about how the module is organized on their wiki.
I think the sources are too complex to give you insight into how optim works. Instead of looking at the sources, I would recommend a document by Scilab about the kinds of optimization, which may also give the information you're looking for.

SQLCMD include all scripts in folder

In my Post Deployment Script, I would like to include all script files in a folder using a wildcard like this:
:r .\$(ReleaseName)\*.sql
Is there a way to do this? I can't find any.
I got it working:
<MyFilesPath Include="$(ProjectDir)MyPath\*.sql"/>
<MyFiles Include="@(MyFilesPath->':r %22..\Scripts\%(filename)%(extension)%22%0D%0A', '')"/>
Then I include @MyFiles in my post-deployment script file.
I took a different approach that was easier for me to understand.
I simply added code to the Pre-build event in the database project properties page that copies the script files into a single file. I call a bat file and pass in the project path as a parameter because it's much nicer to edit the file than trying to edit in that little textbox in the properties page.
$(ProjectDir)PreBuildEvent.bat "$(ProjectDir)"
I set the contents of the bat to this:
copy %ProjectDir%DbUpdateScripts\*-Pre.sql %ProjectDir%DbUpdateScripts\AllPreScripts.sql
copy %ProjectDir%DbUpdateScripts\*-Post.sql %ProjectDir%DbUpdateScripts\AllPostScripts.sql
Then just include those files in your actual pre and post deploy scripts.
:r .\DbUpdateScripts\AllPreScripts.sql
:r .\DbUpdateScripts\AllPostScripts.sql
And finally, add AllPreScripts.sql and AllPostScripts.sql to your .gitignore file if you have one to prevent them from getting added to source control.
Building upon @SAS's answer, here is what I did to get this working using MSBuild. Basically, the idea is that we add a pre-build target that auto-generates a post-deployment script referencing all the scripts in the source folder.
In the .sqlproj add the following at the end of the file:
<Target Name="BeforeBuild">
<PropertyGroup>
<MyAutogeneratedScriptPath>$(ProjectDir)Scripts\Post-deployment\MyScript.autogenerated.sql</MyAutogeneratedScriptPath>
</PropertyGroup>
<ItemGroup>
<MyScriptsLocation Include="$(ProjectDir)Scripts\Post-deployment\RunAll_1\*.sql" />
<MyScriptsLocation Include="$(ProjectDir)Scripts\Post-deployment\RunAll_2\*.sql" />
</ItemGroup>
<WriteLinesToFile File="$(MyAutogeneratedScriptPath)" Lines="-- This is an auto-generated file, any changes made will be overwritten" Overwrite="true" />
<WriteLinesToFile File="$(MyAutogeneratedScriptPath)" Lines="@(MyScriptsLocation->':r %22%(FullPath)%22', '%0D%0A')" Overwrite="false" />
</Target>
And then in your main post-deployment script file, include the MyScript.autogenerated.sql file.
You might also want to add *.autogenerated.sql to your .gitignore file.

NoClassDefFoundError when checkstyle is running

I have written a new Checkstyle check as a filescanner. I modeled my JUnit tests after the code I found in the Checkstyle code. The JUnit tests run just fine and everything looks good.
But then, I add the check to my project.
<module name="TreeWalker">
<property name="tabWidth" value="4" />
<module name="com.onuspride.codetools.checkstyles.DuplicateClassNames"/>
</module>
and here is my Ant task:
<taskdef resource="checkstyletask.properties">
<classpath refid="classpath" />
</taskdef>
<property name="checkstyle.suppressions.file" value="checkstyle/suppressions.xml" />
<property name="translation.severity" value="error" />
<target name="checkStyle" description="TestTask to evaluate the checkstyle system.">
<checkstyle config="checkstyle/checkstyle_checks.xml">
<fileset dir="${msg.src}" includes="**/*.java" />
<formatter type="plain" />
<formatter type="xml" toFile="${msg.build.jar}/checkstyle_errors.xml" />
<classpath refid="classpath" />
</checkstyle>
</target>
The DuplicateClassNames check calls several other classes in the same jar. For some reason, when Ant runs it, Ant finds the check class but can't find the supporting classes, even though they are all in the same jar file. Here's what I get in Ant:
[checkstyle] [class]:0: Got an exception - java.lang.NoClassDefFoundError: com/onuspride/codetools/common/classpath/criteria/ClassNameCriteriaCollector
I'm stumped. I've checked all the dependencies of my jar, and they are all in the classpath; I don't understand how it can find one class file but not another in the same jar. I've done all my dirty little tricks and I just don't get it.
Any ideas?
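For context, the kind of path definition being referred to here would look roughly like the following; the jar names and locations are hypothetical and only illustrate that the Checkstyle jar, the custom-check jar, and every jar the check depends on need to be on the same classpath used by the <taskdef> and <checkstyle> elements:
<path id="classpath">
<pathelement location="lib/checkstyle-all.jar" />
<pathelement location="dist/onuspride-checks.jar" />
<pathelement location="lib/onuspride-common.jar" />
</path>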
You can do it as follows:
Create a plugin project and add your custom checks there.
Make appropriate changes to plugin.xml and checkstyle_packages.xml.
Export the project as Deployable Plug-ins and fragments (Export > Plug-in Development).
Copy the jar file to the Eclipse plugins folder, so there is no need to install your custom check.
You can go through this tutorial for reference.
To reduce effort, download a sample check; the file is here under the name net.sf.eclipsecs.sample.
Just replace your source in the src folder. Before replacing, refer to these 3 files in the src/net/sf/eclipsecs/sample/checks/ directory, as you will need them in your com/onuspride/codetools/checkstyles/ directory:
checkstyle-metadata.properties
checkstyle-metadata.xml
messages.properties
After replacing the code, make appropriate changes in the checkstyle_packages.xml file in the src/ directory.
Extending Check is described nicely there.

Logback configuration: factoring out reusable parts

Is there a way to factor out and parameterize repeating parts of Logback XML configuration? I have many different rolling file appenders configured basically the same except for the file names. I use that in conjunction with a bunch of loggers with their 'additivity' turned off so I can redirect different parts of the stack to different files. This adds up to a cumbersome and long configuration file composed of many almost identical segments.
I've used Logback's <include> feature before, but it doesn't address this reuse issue since I can't parameterize the included configuration. I'd expect such a feature to look something akin to:
<include resource="file-appender.xml">
<property name="filePath" value="/where/logs/go" />
<property name="baseLogger" value="com.mycompany.thatpartofthestack" />
</include>
But as far as I understand that's wishful thinking. Is there another way of factoring out Logback's configuration via templates, macros, functions or whatnot?
Try using variable substitution in local and/or context scope.
Perhaps the easiest way is to define variables in some resource file, say logback.properties, bundled with each application. Moreover, each application would carry a logback.xml file importing logback.properties.
<configuration debug="true">
<property resource="logback.properties" />
<!-- set root level as given by the value of the root.level variable -->
<!-- if root.level is undefined default to DEBUG -->
<root level="${root.level:-DEBUG}"/>
</configuration>
If you wish to set the root level to WARN in webapp-A, simply add the following line to the logback.properties file bundled with webapp-A.
root.level=WARN
You can bundle logback.xml as a resource in an artifact common to your various applications.
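To get closer to the parameterized include sketched in the question, note that properties defined before an <include> are visible inside the included file, so a shared appender definition can be reused with per-application values. A rough sketch under that assumption (file names, paths, and the appender details are placeholders, not a tested configuration):
<!-- file-appender.xml, bundled as a shared resource (included files use <included> as their root element) -->
<included>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${filePath}/app.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${filePath}/app.%d{yyyy-MM-dd}.log</fileNamePattern>
</rollingPolicy>
<encoder>
<pattern>%d %level %logger - %msg%n</pattern>
</encoder>
</appender>
</included>
<!-- main logback.xml -->
<configuration>
<property name="filePath" value="/where/logs/go" />
<include resource="file-appender.xml" />
<root level="${root.level:-DEBUG}">
<appender-ref ref="FILE" />
</root>
</configuration>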

copying a jar file in Apache Ant

I need to copy a jar file from one directory to another when my project is built. Placing the statement:
<copy file="${test.dir}/MyFirstTest.jar" todir="${share.path}"/>
works fine when the project is built alone, but if I clean and build the project I get a warning informing me that the ${test.dir}/ directory hasn't been created yet. If I'm understanding properly I should be able to create a 'target' and specify a dependency for this operation but I'm unsure of what dependency to specify. What series of statements do I need to use to ensure this copy will occur whether I clean and build or just build the project?
Please let me know if any further clarification is needed.
FYI I am using Netbeans 6.8 to build my project.
Assuming you have build, dist and javadoc folders, do this in the clean target:
<!-- Remove all output generated from this build script -->
<target name="clean" description="Clean project">
<delete dir="${build}" />
<delete dir="${dist}" />
<delete dir="${javadoc}" />
</target>
<!-- Initialize all elements needed for the Build -->
<target name="init">
<!-- Create the time stamp -->
<tstamp />
<!-- Create the build directory structure used by compile
and copy the deployment descriptors into it-->
<mkdir dir="${build}/classes" />
<mkdir dir="${dist}" />
<mkdir dir="${javadoc}" />
</target>
<!-- Write a target such as this -->
<target name="docopy" depends="init" description="do the copy">
<copy file="${test.dir}/MyFirstTest.jar" todir="${dist}"/>
</target>
When you run ant docopy, it will run init first and then the docopy target.
You can create other targets that have depends="docopy" in them, or move the copy task into init itself.
If I'm understanding properly I should be able to create a 'target' and specify a dependency for this operation but I'm unsure of what dependency to specify.
Well, either add a dependency on the target that actually creates ${share.path}, if that makes sense, or introduce a new target to create the directory if it doesn't exist. This is typically done in some kind of init target. Then, add the dependency like this:
<target name="copy-jar" depends="target-a, target-b">
<copy file="${test.dir}/MyFirstTest.jar" todir="${share.path}"/>
</target>
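To make that concrete for the scenario in the question, the dependencies could be the target that actually builds MyFirstTest.jar plus a small init-style target that creates ${share.path}; the target names below are hypothetical:
<target name="prepare-share">
<mkdir dir="${share.path}" />
</target>
<target name="copy-jar" depends="build-test-jar, prepare-share">
<copy file="${test.dir}/MyFirstTest.jar" todir="${share.path}"/>
</target>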
You could also simply try to create the directory before copying the library:
<mkdir dir="${share.path}" />
<copy file="${test.dir}/MyFirstTest.jar" todir="${share.path}"/>