How could I go about loading functions from NTDLL without linking against it or any other DLLs?

I've been experimenting with loading functions from the Windows system DLLs using only the loader functions exported by NTDLL. This works as expected. For the sake of curiosity and getting an even better understanding of the process structure in NT-based systems, I've started trying to load functions from NTDLL by doing the following steps:
1) Load the PEB of the process from gs:[60h]
2) Iterate over the modules loaded into the process according to the loader to find NTDLL's base address
3) Parse the PE headers of NTDLL
4) Try to parse the export table to find LdrLoadDll, LdrGetDllHandle, and LdrGetProcedureAddress
This fails at step 4. After stepping through it in a debugger (both VS2019 and WinDbg Preview), it seems as though the offsets I've tried yield an invalid structure that leads to an access violation when my code compares the current function name to one of the ones I'm searching for. My code is being compiled and run on a 64-bit copy of Windows 10 Pro build 21364. Note that I'm using my own header that contains definitions for the structures used for this (these definitions are from winnt.h and here) because the Windows headers don't really play nice with the rest of my code. The function trying to do this is here. For the record, this is part of an attempt to implement my own libc (again, for the sake of curiosity). The code that calls the functions is here. Any help with this is tremendously appreciated.

Never mind, it turns out I had outdated, overly verbose definitions of the structures I was using. I found better (more up-to-date) definitions at https://vergiliusproject.com.
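For anyone who lands here later, the working walk looks roughly like the sketch below: a minimal, x64-only, MSVC-flavoured illustration with the struct definitions abbreviated to just the fields used. The hardcoded offsets match current public Windows 10 x64 layouts but are not contractual, so take them from an up-to-date reference (such as the Vergilius Project) rather than from stale headers. It also leans on the CRT's strcmp/_wcsicmp for brevity, which a freestanding libc replacement would have to provide itself.

    /* Minimal sketch (x64, MSVC): find ntdll.dll via the PEB and resolve an
       export by name. Struct definitions are abbreviated to just the fields
       used; take the full, current layouts from an up-to-date reference. */
    #include <windows.h>   /* PE image structures; swap in your own defs */
    #include <intrin.h>    /* __readgsqword */
    #include <string.h>    /* strcmp, _wcsicmp */

    typedef struct _UNICODE_STRING {           /* as in winternl.h */
        USHORT Length, MaximumLength;
        PWSTR  Buffer;
    } UNICODE_STRING;

    typedef struct _LDR_DATA_TABLE_ENTRY {
        LIST_ENTRY InLoadOrderLinks;           /* 0x00 */
        LIST_ENTRY InMemoryOrderLinks;         /* 0x10 */
        LIST_ENTRY InInitializationOrderLinks; /* 0x20 */
        PVOID      DllBase;                    /* 0x30 */
        PVOID      EntryPoint;                 /* 0x38 */
        ULONG      SizeOfImage;                /* 0x40 */
        UNICODE_STRING FullDllName;            /* 0x48 */
        UNICODE_STRING BaseDllName;            /* 0x58 */
    } LDR_DATA_TABLE_ENTRY;

    typedef struct _PEB_LDR_DATA {
        ULONG      Length;
        BOOLEAN    Initialized;
        PVOID      SsHandle;
        LIST_ENTRY InLoadOrderModuleList;      /* 0x10 */
        LIST_ENTRY InMemoryOrderModuleList;    /* 0x20 */
    } PEB_LDR_DATA;

    /* Walk the export directory of a loaded image. Every field holds an RVA
       relative to the image base, not a pointer. Forwarded exports are not
       handled (the Ldr* functions are not forwarders). */
    static void *find_export(void *base, const char *name)
    {
        IMAGE_DOS_HEADER   *dos = (IMAGE_DOS_HEADER *)base;
        IMAGE_NT_HEADERS64 *nt  = (IMAGE_NT_HEADERS64 *)((char *)base + dos->e_lfanew);
        IMAGE_DATA_DIRECTORY dir =
            nt->OptionalHeader.DataDirectory[IMAGE_DIRECTORY_ENTRY_EXPORT];
        IMAGE_EXPORT_DIRECTORY *exp =
            (IMAGE_EXPORT_DIRECTORY *)((char *)base + dir.VirtualAddress);
        ULONG  *names = (ULONG  *)((char *)base + exp->AddressOfNames);
        USHORT *ords  = (USHORT *)((char *)base + exp->AddressOfNameOrdinals);
        ULONG  *funcs = (ULONG  *)((char *)base + exp->AddressOfFunctions);
        for (ULONG i = 0; i < exp->NumberOfNames; i++) {
            const char *s = (const char *)base + names[i];
            if (strcmp(s, name) == 0)
                return (char *)base + funcs[ords[i]];
        }
        return NULL;
    }

    void *get_ntdll_export(const char *name)
    {
        void *peb = (void *)__readgsqword(0x60);                    /* gs:[60h] */
        PEB_LDR_DATA *ldr = *(PEB_LDR_DATA **)((char *)peb + 0x18); /* Peb->Ldr */
        LIST_ENTRY *head = &ldr->InMemoryOrderModuleList;
        for (LIST_ENTRY *e = head->Flink; e != head; e = e->Flink) {
            LDR_DATA_TABLE_ENTRY *mod =
                CONTAINING_RECORD(e, LDR_DATA_TABLE_ENTRY, InMemoryOrderLinks);
            if (_wcsicmp(mod->BaseDllName.Buffer, L"ntdll.dll") == 0)
                return find_export(mod->DllBase, name);
        }
        return NULL;
    }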

Related

How to find dependent functions in Octave

I would like to identify all functions needed to run a specific function in Octave. I need this to deploy an application written in Octave.
While Matlab offers some tools to analyse a function's dependencies, I could not find anything similar for Octave.
Trying inmem, as recommended for Matlab, does not produce the expected result:
> inmem
warning: the 'inmem' function is not yet implemented in Octave
Is there any other solution to this problem available?
First, let me point out that, from your description, the Matlab tool you're after is not inmem, but deprpt.
Secondly, while Octave does not have a built-in tool for this, there are a number of ways to do so yourself. I have not tried these personally, so YMMV.
1) Run your function under the profiler, then inspect which functions were used during the run, as suggested in the Octave mailing-list archives: https://lists.gnu.org/archive/html/help-octave/2015-10/msg00135.html
2) There are some external tools on GitHub that attempt just this, e.g.:
https://git.osuv.de/m/about
https://github.com/KaeroDot/mDepGen
3) If I had to attack this myself, I would approach the problem as follows:
Parse and tokenise the m-file in question (possibly also using binary checks like isvarname to filter out useless tokens before moving to the next step).
For each token x, wrap a help(x) call in a try/catch block.
Inspect the error; it will be one of:
"Invalid input" (i.e. token was not a function)
"Not found" (i.e. not a valid identifier etc)
"Not documented" (function exists but has no help string)
No error, in which case you stumbled upon a valid function call within the file
To check whether these are built-in functions or part of a loaded package, you could further parse the first line of the help output, which typically tells you where the function came from.
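A rough sketch of that approach in Octave (untested; the regexp tokeniser is crude, and the exact error message matched in the catch block is an assumption that may differ between Octave versions):

    function deps = find_deps (mfile)
      % Crude dependency scan: tokenise an m-file, probe each token with help().
      src = fileread (mfile);
      src = regexprep (src, '[%#][^\n]*', '');               % strip comments
      tokens = unique (regexp (src, '[A-Za-z_]\w*', 'match'));
      deps = {};
      for i = 1:numel (tokens)
        t = tokens{i};
        if ! isvarname (t)                  % cheap binary check, filters junk
          continue;
        endif
        try
          help (t);        % prints to the pager; throws if t is not known
          deps{end+1} = t; % no error: a documented function
        catch err
          % exact wording is an assumption and may vary between versions
          if ! isempty (strfind (err.message, "not documented"))
            deps{end+1} = t;                % exists, merely lacks a help text
          endif
        end_try_catch
      endfor
    endfunction

Calling find_deps ('myscript.m') would then return candidate function names, which still need the builtin-versus-package classification described above.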
If your goal is to check whether a Matlab script will work on Octave, one complication is that the packages Octave will require are typically not referenced anywhere in the Matlab code. Then again, if that is your goal, you should probably be using deprpt from Matlab directly instead.
Good luck.
PS. I might add that the above is for creating a general tool. In terms of identifying dependencies in your own code, good software engineering practices go a long way towards providing maintainable code and easily resolving dependency problems for your users. E.g.:
-- clearly identify required packages (which, unlike Matlab, Octave does anyway by requiring such packages to be visibly loaded in code)
-- similarly, for custom dependencies, consider wrapping and providing these as packages / namespaces, rather than scattered files
-- if packaging dependencies isn't possible, you can create tests / checks in your file that throw errors if necessary files are missing, or at least mention such dependencies in comments in the file itself.
According to the Octave Compatibility FAQ here:
Q. inmem
A. who -functions
So you can use who -functions. (Note: I have not tried it yet.)

NativeScript, Code Sharing and different environments

Note: this is not a dupe of this or this other question. Read on: this question is specific to the Code-Sharing template.
I am doing some pretty basic experiments with NativeScript, Angular and the code sharing templates (see: @nativescript/schematics).
Now I am doing some exploration / PoC work on how different "build configurations" are supported by the framework. To be clear, I am looking for a simple (and hopefully official) way to have the application use a different version of a specific file (let's call it configuration.ts) based on the current platform (web/ios/android) and environment (development/production/staging?).
Doing the first part is obviously trivial; after all, that is the prime purpose of the code-sharing schematics. Different versions of the same file are identified by different extensions. This page explains things pretty simply.
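For reference, that convention boils down to sibling files distinguished by an extra extension (configuration.ts is my own example name, not something the template generates):

    configuration.ts      // picked up by the web (Angular CLI) build
    configuration.tns.ts  // picked up by the mobile (tns) build instead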
What I don't get as easily is if the framework/template supports any similar convention-based rule that can be used to switch between debug/release (or even better development/staging/production) versions of a file. Think for example of a config.ts file that contains different parameters based on the environment.
I have done some research in the topic, but I was unable to find a conclusive answer:
the old and now retired documentation for the AppBuilder platform mentions a (.debug. and .release.) naming convention for files. I don't think this works anymore.
other sources mention passing parameters during the call to tns build / tns run and then fetching them via a webpack env variable. See here. This may work, but seems oddly convoluted
a third option that gets mentioned is to use hooks to customize the build (or to use a plugin that should do the same)
lastly, for some odd reason, @nativescript/schematics seems to generate a default project that contains two files called environment.ts and environment.prod.ts. I suspect those only work for the web version of the project (read: ng serve); I wasn't able to get the mobile compiler to recognize files ending in debug.ts, prod.ts or release.ts
While it may be possible that what I am trying to do just isn't supported (yet?), the general confusion and dissenting opinions on the matter make me think I may be missing something... somewhere.
In case this IS somehow supported, I also wonder how it may integrate with the NativeScript Sidekick app that is often suggested as a tool to ease the build/run process of NativeScript applications (there is no way to specify additional parameters for the tns commands that Sidekick automates; the only options available are switching between debug/release mode), but this is probably better left for another question.
Environment files are not yet supported; passing environment variables from the build command could be the viable solution for now.
But of course, you may write your own schematics if you would like immediate support for environment files.
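To make the build-command route concrete, here is a rough sketch, assuming the standard nativescript-dev-webpack setup; the env.prod flag and the global.__ENV__ name are my own choices, not an official convention:

    // webpack.config.js (sketch), invoked as: tns build android --bundle --env.prod
    // The NativeScript webpack template exports a function that receives `env`.
    const webpack = require('webpack');

    module.exports = (env = {}) => {
        const production = !!env.prod;  // set by --env.prod on the command line
        return {
            // ...the rest of the generated NativeScript config stays as-is...
            plugins: [
                // Bake the environment name into the bundle; application code
                // can then branch on global.__ENV__ at runtime.
                new webpack.DefinePlugin({
                    'global.__ENV__': JSON.stringify(production ? 'production' : 'development'),
                }),
            ],
        };
    };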
I did not look into sharing environment files between web and mobile yet. I do like Manoj's suggestion regarding modifying the schematics, but I'll have to cross that bridge when I get there, I guess. I might have an answer to your second question regarding Sidekick. The latest version does support a "Webpack" build option, which seems to pass the --bundle parameter to tns. The caveat is that this option seems to be more sensitive to TypeScript errors, even relatively benign ones, so you have to be careful and make sure to fix them all prior to building. In my case I had to lock the version of @types/jasmine in package.json to "2.8.6" in order to avoid an incompatibility between it and the version of TypeScript that Sidekick's cloud solution is using. Another hint is to check "Clean Build" after npm dependency changes are made. Good luck!

Delphi - Unit x was compiled with a different version of x, when fixing a VCL bug

I am using Delphi XE6 and using Datasnap and JSON in my project. There is a bug I want to correct in the VCL unit System.JSON.pas (in the TJSONString.ToString function) where it should be escaping backslash characters as well as quotes. In order to fix this I carried out the following :
Copied System.JSON.pas from the standard VCL source folder to my project source folder
Added System.JSON.pas to my project (using the newly copied file)
Fixed the bug and attempted to compile
I get the error 'Unit Data.DBXCommon was compiled with a different version of System.JSON.TJSONObject'
I can see that the Data.DBXCommon unit references System.JSON, so I guess the compiler is now seeing 2 versions - my fixed version and the standard VCL version.
What is the correct way to implement VCL changes to avoid this problem?
There are two common reasons for this issue:
You made changes to the interface section of the unit. You cannot do this without also re-compiling all units that use the unit you are modifying.
You re-compile the unit with different compiler options from those used to build it originally. Deal with that by ensuring the compiler options used to compile the unit you modify are the same as used by Embarcadero. Typically Embarcadero compiles with default options. Impose these directly in the source file being modified, right at the very top of the file.
Having said this, a recent question here on a similar topic could not be resolved using option 2 above. In that question, under XE6 only, the unmodified Classes unit could not be re-compiled and linked at all. Which makes me wonder if this particular technique has had its day. Perhaps it's not even possible. Before you give up, see if you can compile and link the unmodified unit.
More broadly, a detour is generally an easier way to solve problems like the one you face. Using a detour rather than re-compiling makes the management of the fix cleaner and simpler.
Update 1
I cannot get the unmodified System.JSON unit to re-compile and link. Which I think means that the issue raised in that other question is broader than just the Classes unit. I think you will find this a tricky hurdle to overcome and recommend the use of a detour.
Update 2
The problem that appears to have been introduced in XE6, seems to have been resolved by the release of XE7. The unmodified System.JSON unit will compile and link in XE7.
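To show what the detour route looks like, here is a minimal sketch. It assumes a 32-bit build (a five-byte JMP rel32 cannot be assumed to reach across the 64-bit address space), and the replacement body is illustrative only, not the complete escaping fix; a tested library such as DDetours (InterceptCreate) is the safer choice in practice:

    unit JsonToStringFix;

    { Minimal 32-bit detour sketch: patch the first five bytes of the broken
      method with a JMP to a corrected copy. }

    interface

    implementation

    uses
      Winapi.Windows, System.SysUtils, System.JSON;

    procedure PatchJmp(Target, Replacement: Pointer);
    var
      OldProtect: DWORD;
    begin
      if not VirtualProtect(Target, 5, PAGE_EXECUTE_READWRITE, OldProtect) then
        RaiseLastOSError;
      PByte(Target)^ := $E9; // JMP rel32
      PInteger(NativeInt(Target) + 1)^ :=
        Integer(NativeInt(Replacement) - NativeInt(Target) - 5);
      VirtualProtect(Target, 5, OldProtect, OldProtect);
      FlushInstructionCache(GetCurrentProcess, Target, 5);
    end;

    // Stand-in for the method: the hidden Self becomes an explicit parameter.
    function FixedToString(Self: TJSONString): string;
    begin
      // Illustrative only: the real fix must also quote and escape as the
      // original TJSONString.ToString does.
      Result := StringReplace(Self.Value, '\', '\\', [rfReplaceAll]);
    end;

    initialization
      PatchJmp(@TJSONString.ToString, @FixedToString);

    end.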
What if the original System.JSON.dcu in Delphi XE6 wasn't compiled with Delphi XE6 itself, but with one of the previous versions of Delphi?
You claim that you managed to implement your fix in Delphi XE2 using the same approach, by changing the source and then recompiling System.JSON. So I suggest you first make a comparison between the original System.JSON files that ship with Delphi XE2 and Delphi XE6.
If they are the same then the changed System.JSON.dcu that you managed to recompile with Delphi XE2 might also work with Delphi XE6.
I resolved a similar issue by:
Deleting the .dcu files which were on different versions (i.e. the conflicting files).
Rebuilding the project to create new .dcu files.

Xtext project creation concerns

Before I begin I must admit that I am new to Xtext and the designing of DSLs. Some of my questions on this matter may be somewhat "less than intelligent".
I have created an Xtext project using the IDE, and I am simultaneously using one of the sample projects provided with Xtext as a guide to what I need to do in my language. I am seeing a lot of warnings that are making me nervous.
Apparently, when the development environment creates a new project, it somehow configures that project to use the Java 5 libraries. I am using Java 6, and as a result I get warnings saying that my project is configured for Java 5 and there is no Java 5 on my system (which there isn't!).
I have tried altering the build path so that it uses the Java 6 libraries, but this generates a number of other warnings, including warnings that the Java 6 referenced in my MANIFEST.MF file is invalid!
Then there are the "plugin.xml" warnings. Apparently, the build.properties file references a file called "plugin.xml" which is not created when the IDE creates the project. I have no idea whether or not this file is important enough to create, and I have no idea what should go into it.
Frankly, I hate warnings. Warnings tend to lead to future problems in what I produce. I like clean compiles and clean deployments. I would like to eliminate these warnings before they start screwing me up down the road (like pulling in Java 6 classes that would break against a Java 5 library).
Has anyone been able to eliminate these warnings reliably? Please advise.
For the JDK warning, you simply switch the target environment in the MANIFEST.MF to one matching your preferred JDK ('JavaSE-1.6' in your case).
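Concretely, that is the Bundle-RequiredExecutionEnvironment header in META-INF/MANIFEST.MF (only this line changes; the rest of the manifest stays as generated):

    Bundle-RequiredExecutionEnvironment: JavaSE-1.6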
The warning regarding the missing plugin.xml will be gone as soon as you have run the grammar generator the first time, as it will produce such a file.

Problems converting a C header to D

I'm trying to translate the MySQL C connector 6.02 headers to D, but I get some weird crashes.
My guess is that I've made some mistakes in translating the structs or functions (I'm not very good at C).
I used implib /system on the libmysql.dll to create a lib file.
I couldn't get htod.exe to work: using -hs (include system files), it complained that it couldn't find the system files.
Coffimplib.exe didn't have an option to prepend _ to exported internal names.
I couldn't find a free version of coff2omf.
mysql.d is the wrapper. I've included the C definition before each wrapped definition to make it easier to spot bugs. The archive includes the MySQL DLL and the converted lib file too.
When compiling mytest_fails.d, it crashes. mytest_works.d differs only in having an assert, and this makes it work. Compiling mytest_works with -release makes it crash too.
I've been using DMD 2.051.
Download mytest.zip from share1t.com
Update: I've also asked some questions regarding this on the D.learn newsgroup, but I don't think anyone has gone through the code:
C const
Compiler extensions
Cannot get htod.exe to work
The weird crashes
stdcall is a Windows function calling convention (very different from the C calling convention). HTOD failed to mark several functions with extern(Windows). This is normal, since HTOD isn't equipped to handle macros (STDCALL is defined as a macro; I can see that from the leftover comments in mysql.d).
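In practice the fix is a one-word change per declaration. mysql_query is a real libmysql export used here for illustration; the MYSQL handle type comes from the translated header:

    // What HTOD emitted: the default C calling convention
    // extern (C) int mysql_query(MYSQL* mysql, const(char)* q);

    // What libmysql actually exports: stdcall, spelled extern (Windows) in D
    extern (Windows) int mysql_query(MYSQL* mysql, const(char)* q);

A mismatched calling convention corrupts the stack on every call, which fits the kind of seemingly random crashes you were seeing.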
Here's an updated mysql.d file:
http://dl.dropbox.com/u/9218759/mysql.d
Now, you need the proper import library in OMF format. I'd generally advise that you do not use implib for this. I've had several problems with it and others have reported having problems using it. Using coffimplib is the way to go. But first, you will need a COFF import library.
If you need it, the Mysql release with the COFF import library can be downloaded from here: http://dev.mysql.com/downloads/mirror.php?id=377977#mirrors (Libraries in DLL form marked for Visual Studio usually come with a COFF import library).
But I'm providing you the translated COFF import library in OMF format here: http://dl.dropbox.com/u/9218759/libmysql.lib
I've tried both of your test cases and they both seem to work fine now. In case of problems, check the translated header file (mysql.d) again; it's possible that I missed specifying some of the calling conventions properly.