I had some trouble understanding how the function loadwave(...) works, so I found the file with its definition here: /usr/share/scilab/modules/sound/macros/loadwave.sci using a find -name ... command.
Now I don't understand how the optim(...) function works, so I also want to find its source file, but I can't (I tried combinations of find and grep again). There are some demo files with examples of optim usage in the /usr/share/scilab/modules/optimization/demos/optim directory, but I still cannot find the source code of optim(...) itself, which, as I understand it, should look something like this:
function [...]=optim(...)
...
endfunction
Could you give me a tip, please?
Update:
For now I have only managed to find this in the file
/usr/share/scilab/modules/optimization/sci_gateway/optimization_gateway.xml:
<!DOCTYPE GATEWAY SYSTEM "../../functions/xml/gateway.dtd">
<GATEWAY name="optimization">
<!-- =================== -->
<!--
Scilab
Interface description. In this file, we define the list of the function which
will be available into Scilab and the link to the "native" function.
gatewayId is the position in the hashtable 'Interfaces' defined in the
file SCI/modules/core/src/c/callinterf.h
primitiveId is the position in the hashtable '<module>Table Tab[]' defined
in the file modules/<module>/sci_gateway/c/gw_<module>.c
primitiveName is the name of the Scilab function
===================
Don't touch if you do not know what you are doing
-->
<!-- =================== -->
<PRIMITIVE gatewayId="11" primitiveId="1" primitiveName="optim" />
<PRIMITIVE gatewayId="11" primitiveId="2" primitiveName="semidef" />
<PRIMITIVE gatewayId="11" primitiveId="3" primitiveName="fsolve" />
<PRIMITIVE gatewayId="11" primitiveId="4" primitiveName="lsqrsolve" />
<PRIMITIVE gatewayId="11" primitiveId="5" primitiveName="qld" />
<PRIMITIVE gatewayId="11" primitiveId="6" primitiveName="qp_solve" />
<PRIMITIVE gatewayId="11" primitiveId="7" primitiveName="readmps" />
</GATEWAY>
So in the Scilab git repo, a link to which was kindly given to me by user1149326 below, I have found the file scilab/modules/optimization/sci_gateway/c/sci_optim.c (http://gitweb.scilab.org/?p=scilab.git;a=blob;f=scilab/modules/optimization/sci_gateway/c/sci_optim.c;h=608f7dabe822fc6cfecb456e847f3b7373014322;hb=HEAD)
You can check out all Scilab sources at their git repository. More specifically, all optim sources are in the optimization module; see the src and macros folders. You can read about how the module is organized on their wiki.
I think the sources are too complex to give you insight into how optim works. Instead of looking at the sources, I would recommend a document by Scilab about the kinds of optimization, which may also give you the information you're looking for.
Related
I'm trying to use an external C++ library in Haxe. I have an extern class with an @:include meta, but I can't figure out how to add the directory containing the headers to the HxCPP include path, or add linker options to add to the library path and link with the libraries. I assume it will involve using @:buildXml, and I have this:
@:buildXml("
<target id=\"default\">
<cppflag value=\"-I...\" />
<flag value=\"-L...\" />
<lib base=\"...\" />
</target>
")
but those flags don't show up in the logged g++ command. I can't find any documentation about adding include paths, or any real examples of the build XML.
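One pattern that is commonly suggested for HxCPP (a sketch only, with placeholder paths and library name; check it against the Build.xml that HxCPP actually generates for your project) is to put compiler flags into the generated "haxe" files node and linker flags into the "haxe" target node, rather than into a target with your own id:
@:buildXml("
<files id=\"haxe\">
  <compilerflag value=\"-I/path/to/include\" />
</files>
<target id=\"haxe\">
  <flag value=\"-L/path/to/lib\" />
  <lib name=\"-lmylibrary\" />
</target>
")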
I have a Maven repo with custom Checkstyle checks which I want all the other Maven repos to depend on. I want to suppress checks for some generated code in one of my repos. There are two ways I can set up the suppression file:
1. Have the suppression file in the custom check repo, then specify a SuppressionFilter in the custom Checkstyle XML:
<module name="SuppressionFilter">
<property name="file" value="${samedir}/checkstyle_suppressions.xml"
default="src/main/resources/checkstyle_suppressions.xml"/>
</module>
Then, in the Maven plugin section of the pom.xml of the repo that I want to run the custom Checkstyle on:
<suppressionsLocation>checkstyle_suppressions.xml</suppressionsLocation>
<suppressionsFileExpression>checkstyle.suppressions.file</suppressionsFileExpression>
2. Do not put the SuppressionFilter section in the custom Checkstyle XML. Have the same pom.xml setup for the repo to be checked. The suppression file can be placed locally in the repo to be checked.
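In either case, the two pom.xml elements above go inside the maven-checkstyle-plugin configuration. A minimal sketch, where custom_checks.xml is a placeholder for the rules file published by the custom check repo:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <!-- placeholder: the custom rules file from the check repo -->
    <configLocation>custom_checks.xml</configLocation>
    <suppressionsLocation>checkstyle_suppressions.xml</suppressionsLocation>
    <suppressionsFileExpression>checkstyle.suppressions.file</suppressionsFileExpression>
  </configuration>
</plugin>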
Both approaches work from the command line with "mvn clean validate", but neither works with the CheckStyle-IDEA plugin for IntelliJ. The IntelliJ plugin complains that it couldn't find the suppression file for the first method above.
I don't want to force every repo to have a suppression file if it doesn't need one. I wonder if there is a way to make suppression work for CheckStyle-IDEA without having multiple copies of the same suppression file (one in the custom check repo, one in the local repo).
Thanks!
The logic the plugin uses is:
does the file path resolve?
does the file path exist relative to the rules file?
does the file path exist relative to the module content roots, the module file or the project base directory?
If not, it gives up. So there are two possibilities:
there's a bug in the logic; please raise an issue on GitHub.
it doesn't fit your use case; raise a feature request on GitHub, with an example to reproduce the problem and how you think resolution should be changed to fit your needs.
The code's in the resolveAssociatedFile method of https://github.com/jshiell/checkstyle-idea/blob/master/src/main/java/org/infernus/idea/checkstyle/model/ConfigurationLocation.java if you're interested.
I recently took much of my reusable code and compiled it into SWCs for organization and ease-of-use purposes. Since doing so, none of my documentation has appeared in the code hints that Flash Builder provides. I have searched through the project settings and have been unable to find a setting for such a feature, and I am at a loss as to why it no longer works.
I compiled the SWCs using Flash Builder's Build Automatically functionality. I have not tried compiling with ANT yet, but will probably try that the next time I build. asdoc was able to compile full documentation for all of my libraries with relative ease, and the code hinting works if I use the raw AS files themselves, so I do not believe it has anything to do with the way I was writing the documentation. Example:
/**
* <p>Batch adds variables from a generic object using name-value pairs</p>
* @param variables A generic <code>Object</code> that contains name-value
* pairs that will be used as the arguments of the REST request
*/
public function addVariables( variables:Object ):void {}
Any idea why the code hinting no longer works?
Flash Builder uses the ASDoc data embedded inside the SWC to provide code hints; unfortunately, FB doesn't include the docs when it builds a SWC.
However, it can be done 'manually' with ANT:
<target name="insertASDocs">
<zip destfile="PATH_TO_YOUR_SWC" update="true">
<zipfileset dir="ASDOCS_FOLDER/tempdita" prefix="docs">
<include name="*.*"/>
<exclude name="ASDoc_Config.xml"/>
<exclude name="overviews.xml"/>
</zipfileset>
</zip>
</target>
PATH_TO_YOUR_SWC is the relative path and SWC name (e.g. myFolder/mySwc.swc).
ASDOCS_FOLDER is the folder where your generated docs are stored.
The ANT script just adds the ASDoc files to the SWC; after this, the code hints should appear.
Update:
I forgot to mention that you need to set keep-xml to true when generating the docs (if inserting them into a SWC):
<asdoc keep-xml="true" ...
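For reference, here is a minimal sketch of a full asdoc task, assuming the Flex SDK ant tasks (flexTasks.jar) are loaded; src and ASDOCS_FOLDER are placeholders, and with keep-xml set to true the intermediate DITA XML is kept in the tempdita subfolder that the zip target above picks up:
<taskdef resource="flexTasks.tasks" classpath="${FLEX_HOME}/ant/lib/flexTasks.jar" />
<target name="generateASDocs">
  <asdoc output="ASDOCS_FOLDER" keep-xml="true" lenient="true" failonerror="true">
    <!-- point these at your library source -->
    <compiler.source-path path-element="src" />
    <doc-sources path-element="src" />
  </asdoc>
</target>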
I want to write a program that outputs a list of the libraries that should be linked against, given source code (or object) files (for C or C++ programs).
On *nix there are useful tools for this, such as sdl-config and llvm-config, but I want my program to work on Windows too.
Usage:
get-library-names -l /path/to/lib a.cpp b.cpp c.cpp d.obj
Then get-library-names would gather the list of function names that are invoked from a.cpp, b.cpp, c.cpp, and d.obj, search all the library files in the /path/to/lib directory, and list the libraries that are needed to link properly.
Is there such a tool already written? Is it not trivial to write such a tool?
How do you find what libraries you should link to?
Thanks.
Yeah, you can create a pkg-config file which will allow you to run 'pkg-config --cflags' to get the compiler flags or 'pkg-config --libs' to get the linker libraries.
http://pkg-config.freedesktop.org/wiki/
If you're on Linux, just try looking in /usr/lib/pkgconfig to find some example .pc files that you can use as models. You can still use pkg-config on Windows as well, but it's not something that comes with Windows.
I want to compile a project (with CruiseControl) not only if its source changes, but also if some dependencies change.
example:
I have 3 folders:
c:\myProject\src (my source folder)
c:\dependency1\src (source code of dependency 1)
c:\dependency2\output (dll of dependency 2)
I want CruiseControl to compile my project if anything in any of these folders changes.
How can I configure this in my ccnet.config?
bye and thanks
juergen
Should be something like this:
<project>
<!-- ... -->
<sourcecontrol type="multi">
<requireChangesFromAll>False</requireChangesFromAll>
<sourceControls>
<svn>
<trunkUrl>svn://svn.mycompany.com/myProject/trunk</trunkUrl>
<workingDirectory>c:\myProject\src</workingDirectory>
<!-- ... -->
</svn>
<svn>
<trunkUrl>svn://svn.mycompany.com/dependency1/trunk</trunkUrl>
<workingDirectory>c:\dependency1\src</workingDirectory>
<!-- ... -->
</svn>
<filtered>
<exclusionFilters />
<inclusionFilters>
<pathFilter>
<caseSensitive>False</caseSensitive>
<pattern>c:\dependency2\output\dependency2.dll</pattern>
</pathFilter>
</inclusionFilters>
<sourceControlProvider type="filesystem">
<autoGetSource>False</autoGetSource>
<ignoreMissingRoot>True</ignoreMissingRoot>
<repositoryRoot>c:\dependency2\output</repositoryRoot>
</sourceControlProvider>
</filtered>
</sourceControls>
</sourcecontrol>
<!-- ... -->
</project>
If you have the dependencies set up as Subversion externals, then follow the instructions in this StackOverflow thread.
If they are each in their own Subversion repository, you might try something like this post by Mark Cohen.
If the changes are only at the filesystem level, then you might try the <filesystem> modification set detector.
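A minimal sketch of that last option, reusing the filesystem elements from the example above and pointing at the dependency output folder from the question:
<sourcecontrol type="filesystem">
  <repositoryRoot>c:\dependency2\output</repositoryRoot>
  <autoGetSource>False</autoGetSource>
  <ignoreMissingRoot>True</ignoreMissingRoot>
</sourcecontrol>
To watch all three folders purely at the file level, you could combine several such blocks under the multi source control shown in the first answer.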