A while back someone asked an interesting question: how to automate analysis of memory dumps and upload that data to a server through a web service. Here is one possible approach.
The following command runs "!analyze -v" and collects the output in a log file:
kd.exe -z "Dump_File_Location\MEMORY.DMP" -logo "user_writable_path\MyKD.log" -c "!analyze -v; q"
The above command generates on-screen output as well. If you do not want that, redirect the on-screen output to a log file instead:
kd.exe -z "Dump_File_Location\MEMORY.DMP" -c "!analyze -v; q" > "user_writable_path\MyKD.log"
You can pass a number of commands, chained together with semicolons, to the Windows debugger with the "-c" command line option.
In the example we only run "!analyze -v". I didn't mean to make it complex, but you may want to chain more commands, like "!analyze -v; !vm 1; q". This is especially useful for bulk processing of dumps or for generating a quick summary for escalation. Remember, you have to set the symbol path appropriately, either through the "-y <SymbolsPath>" command line option or the global environment variable _NT_SYMBOL_PATH=[Drive:][Path]
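For bulk processing, the same command line can be driven from a small script. Below is a minimal Python sketch; the folder layout, the symbol server path, and the helper names are my own assumptions for illustration:

```python
import os
import subprocess

# Hypothetical symbol path; replace with your own or rely on _NT_SYMBOL_PATH.
DEFAULT_SYMBOLS = r"srv*c:\symbols*https://msdl.microsoft.com/download/symbols"

def build_kd_command(dump_path, log_path, symbol_path=DEFAULT_SYMBOLS):
    # Builds the kd.exe argument list for one dump file.
    return [
        "kd.exe",
        "-z", dump_path,               # dump file to open
        "-y", symbol_path,             # symbol path
        "-logo", log_path,             # write session output to this log
        "-c", "!analyze -v; !vm 1; q", # chained commands, then quit
    ]

def analyze_all(dump_dir, log_dir):
    # Run kd.exe once per .dmp file found in dump_dir.
    for name in os.listdir(dump_dir):
        if name.lower().endswith(".dmp"):
            log_name = os.path.splitext(name)[0] + ".log"
            cmd = build_kd_command(os.path.join(dump_dir, name),
                                   os.path.join(log_dir, log_name))
            subprocess.run(cmd, check=True)
```

Each dump then ends up with its own log file, which is convenient when generating summaries for escalation.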
I am not sure how to upload this data through an HTTP web service, but you can use the FTP command line to upload the file to a server.
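If an HTTP endpoint does exist, one hedged option is a plain POST from Python's standard library. The URL and the raw-body format below are assumptions; adjust them to whatever the web service actually expects:

```python
from urllib import request

# Hypothetical endpoint -- not from the original post.
UPLOAD_URL = "http://example.com/upload"

def build_upload_request(log_path):
    # Read the kd log and wrap it in an HTTP POST request.
    with open(log_path, "rb") as f:
        body = f.read()
    return request.Request(
        UPLOAD_URL,
        data=body,
        headers={"Content-Type": "text/plain"},
        method="POST",
    )

# To actually send it:
#   with request.urlopen(build_upload_request(r"user_writable_path\MyKD.log")) as resp:
#       print(resp.status)
```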
——–ftp commands start———
echo user ftp_user_name ftp_password> ftp_commands_file
echo put "user_writable_path\MyKD.log">> ftp_commands_file
echo quit>> ftp_commands_file
——–ftp commands end———
ftp -n -s:ftp_commands_file FTP_SERVER
The script will print output to the screen from each command it processes; to make it quiet you can redirect the output to a log file, which can be deleted later. The problem with this approach is that you have to save your FTP password in the commands file. A "del ftp_commands_file" at the end of the script will delete it, but that is still not secure.
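One way to avoid writing the password to disk at all is Python's ftplib, which logs in from values held only in memory. This is a sketch; the server name and environment variable names are assumptions:

```python
import os
from ftplib import FTP

def stor_command(log_path):
    # Builds the FTP STOR command for the remote file name.
    return "STOR " + os.path.basename(log_path)

def upload_log(log_path, server="FTP_SERVER"):
    # Credentials come from environment variables instead of a commands
    # file, so there is no password left behind to "del" afterwards.
    user = os.environ["FTP_USER"]
    password = os.environ["FTP_PASS"]
    with FTP(server) as ftp:
        ftp.login(user, password)
        with open(log_path, "rb") as f:
            ftp.storbinary(stor_command(log_path), f)
```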