Command Line
netsh winhttp import proxy source=ie
If you have Chrome, it has an interesting feature for doing so as well.
Proxy server: some.corp-proxy:8080
Bypass list:
some.corp-server...
To create a share called mydir that points to c:\mydocs\mydir and gives everyone full permissions, you would use the following command:
net share mydir=C:\mydocs\mydir /GRANT:Everyone,FULL
NOTE: If you are using Windows XP or Windows Server 2003, I recommend giving Everyone full share permissions and then using the ACL (Windows permissions) to restrict access. Otherwise, you have two levels of permissions, and they can be difficult to maintain if you try to keep them in sync. Newer versions of the OS handle this better, though.
I found that the IIS and SharePoint log files on my servers were getting out of hand and consuming lots of disk space. So, I decided that it would be nice to have these files compressed and moved to a network share for archiving. The files are all text files so I have selected the PPMd compression algorithm that 7zip happens to have available. It is supposed to give 30% BETTER compression than other popular compression algorithms. This can easily be changed if you decide to modify this script to work on files other than text files. I run it from Windows Scheduler on a weekly basis.
The script assumes you are using cscript to call it. Currently it is configured to only look at files with the .log extension, but that can be easily changed. The script is recursive and will look at the directory you pass it and all sub-directories. You can specify how many days of logs to keep. This is based on the last modified date.
' Usage:
' All parameters are required
' To Move all files older than 7 days from "C:\Source\folder\name\here" to "\\destinationHost\Destination\folder\name\here"
' cscript ArchiveFiles.vbs "C:\Source\folder\name\here" "\\destinationHost\Destination\folder\name\here" 7 true
' To Copy all files older than 10 days from "C:\Source\folder\name\here" to "\\destinationHost\Destination\folder\name\here"
' cscript ArchiveFiles.vbs "C:\Source\folder\name\here" "\\destinationHost\Destination\folder\name\here" 10 false
' Original Script copied from: http://gallery.technet.microsoft.com/scriptcenter/de01e926-088f-409b-abf4-e27dbb185597#content
' Brent Vermilion: 2012-05-14: Modified to use 7zip, not use a mapped drive (only unc), is now recursive, and handles files not in sub-directory,
' and added option to keep original file.
'==================
' Parameters
'==================
' maps the parameters to variables for easier understanding
SourceFolderName = WScript.Arguments(0) 'i.e. "C:\inetpub\logs\LogFiles" Local log file directory
DestShare = WScript.Arguments(1) ' i.e. "\\myhost\d$\Archived Logs\IIS\usbtmp8375msdev"
LogFileAge = CInt(WScript.Arguments(2)) ' i.e. 7 Log files must be older than this number of days to be archived (using "last modified" date)
DeleteOriginalAfterArchive = CBool(WScript.Arguments(3)) ' i.e. true or false
'==================
' Constants
'==================
Const PathFor7zip = "C:\Program Files\7-Zip\7z.exe"
'==================
' Variables
'==================
Set objShell = WScript.CreateObject ("WScript.Shell")
Set objFSO = CreateObject("Scripting.FileSystemObject")
counter = 0
failedCounter = 0
ProcessFolder(SourceFolderName)
' clean up
Set objFile = Nothing
Set objShell = Nothing
Set objFSO = Nothing
WScript.Echo counter & " files were archived to " & DestShare
WScript.Echo failedCounter & " files FAILED during archiving process."
Function ProcessFolder(folderName)
Set objWMIService = GetObject("winmgmts:\\")
'==================================
' get files in folder
'==================================
strPathName = FormatPath(folderName)
Set colFiles = objWMIService.ExecQuery("Select * from CIM_DataFile where Path = '" & strPathName & "'")
'==================================
' loop through each file and process the ones older than n days
'==================================
For Each objFile in colFiles
If objFile.Extension = "log" Then
If WMIDateStringToDate(objFile.LastModified) < (Now() - LogFileAge) Then
ProcessFile objFile, folderName
End If
End If
Next
'=====================================================================================
' Connect to local WMI service and get a collection of subfolders for current folder
'=====================================================================================
Set colSubfolders = objWMIService.ExecQuery _
("Associators of {Win32_Directory.Name='" & folderName & "'} " _
& "Where AssocClass = Win32_Subdirectory " _
& "ResultRole = PartComponent")
'=============================================
' loop through the sub-folders
'=============================================
For Each objFolder in colSubfolders
' recursive call
ProcessFolder(objFolder.Name)
Next
End Function
Function ProcessFile(objFile, folderName)
'=================================================
' Check if current folder exists on remote system
' If not, create it
'=================================================
If Not objFSO.FolderExists(DestShare) Then
CreateDirs(DestShare)
End If
'========================================================
' Compress file
' chr(34) adds ASCII quotes in case path contains spaces
'========================================================
matchBasePathIdx = InStr(1,objFile.Name, folderName, 1)
' get the path and name of the file without the base directory path
relativeFilename = Mid(objFile.Name, matchBasePathIdx + Len(folderName) + 1)
' prepend the destination share path to build the archive file name
zipFilename = DestShare & "\" & relativeFilename & ".7z"
' build the command line
' Notice we are using 7zip and the PPMd compression algorithm (-m0=PPMd) that is superior by approx 30% over other modern compression algorithms FOR TEXT.
' If you are not compressing text files you may leave off or change this parameter.
' More Info on PPMd Compression: http://www.dotnetperls.com/ppmd
cmdText = chr(34) & PathFor7zip & chr(34) & " a -t7z " & chr(34) & zipFilename & chr(34) & " " & chr(34) & objFile.Name & chr(34) & " -m0=PPMd"
' execute the command we built up
compressReturn = objShell.run (cmdText,0,true)
'========================================================
' Make sure the current file was compressed successfully
' If so, delete the file from the source directory
'========================================================
If compressReturn = 0 Then
' Delete the file if it succeeded and we are configured to do so
If DeleteOriginalAfterArchive = True Then
' Check if file exists to prevent error
If objFSO.FileExists(objFile.Name) Then
objFSO.DeleteFile objFile.Name
WScript.Echo "Deleted: " & objFile.Name
End If
End If
counter = counter + 1
WScript.Echo "SUCCEEDED: " & objFile.Name
Else
failedCounter = failedCounter + 1
WScript.Echo "FAILED: " & objFile.Name & " -> " & zipFilename
End If
End Function
Function FormatPath(strFolderName)
'===========================================================================
' Formats strFolderName to add extra backslashes for CIM_DataFile WQL query
' Stolen from TechNet Script Center
'===========================================================================
arrFolderPath = Split(strFolderName, "\")
strNewPath = ""
For i = 1 to Ubound(arrFolderPath)
strNewPath = strNewPath & "\\" & arrFolderPath(i)
Next
FormatPath = strNewPath & "\\"
End Function
Function WMIDateStringToDate(dtmDate)
'===================================================
' Formats a WMI date string to a usable Date format
' Stolen from TechNet Script Center
'===================================================
WMIDateStringToDate = CDate(Mid(dtmDate, 5, 2) & "/" & _
Mid(dtmDate, 7, 2) & "/" & Left(dtmDate, 4) _
& " " & Mid (dtmDate, 9, 2) & ":" & Mid(dtmDate, 11, 2) & ":" & Mid(dtmDate,13, 2))
End Function
' Copied from: http://www.robvanderwoude.com/vbstech_folders_md.php
' Examples:
' UNC path
'CreateDirs "\\MYSERVER\D$\Test01\Test02\Test03\Test04"
' Absolute path
'CreateDirs "D:\Test11\Test12\Test13\Test14"
' Relative path
'CreateDirs "Test21\Test22\Test23\Test24"
Sub CreateDirs( MyDirName )
' This subroutine creates multiple folders like CMD.EXE's internal MD command.
' By default VBScript can only create one level of folders at a time (blows
' up otherwise!).
'
' Argument:
' MyDirName [string] folder(s) to be created, single or
' multi level, absolute or relative,
' "d:\folder\subfolder" format or UNC
'
' Written by Todd Reeves
' Modified by Rob van der Woude
' http://www.robvanderwoude.com
Dim arrDirs, i, idxFirst, objFSO, strDir, strDirBuild
' Create a file system object
Set objFSO = CreateObject( "Scripting.FileSystemObject" )
' Convert relative to absolute path
strDir = objFSO.GetAbsolutePathName( MyDirName )
' Split a multi level path in its "components"
arrDirs = Split( strDir, "\" )
' Check if the absolute path is UNC or not
If Left( strDir, 2 ) = "\\" Then
strDirBuild = "\\" & arrDirs(2) & "\" & arrDirs(3) & "\"
idxFirst = 4
Else
strDirBuild = arrDirs(0) & "\"
idxFirst = 1
End If
' Check each (sub)folder and create it if it doesn't exist
For i = idxFirst to Ubound( arrDirs )
strDirBuild = objFSO.BuildPath( strDirBuild, arrDirs(i) )
If Not objFSO.FolderExists( strDirBuild ) Then
objFSO.CreateFolder strDirBuild
End if
Next
' Release the file system object
Set objFSO= Nothing
End Sub
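For readers more comfortable outside VBScript, here is a minimal Python sketch (mine, not part of the original script) of the two TechNet helpers above: FormatPath's backslash doubling for the WQL query, and WMIDateStringToDate's CIM timestamp parsing. The sample path and timestamp are made up for illustration.

```python
from datetime import datetime

def format_path(folder_name):
    # Drop the drive component and double every backslash, as the
    # CIM_DataFile WQL query expects (mirrors the VBScript FormatPath).
    parts = folder_name.split("\\")
    return "".join("\\\\" + p for p in parts[1:]) + "\\\\"

def wmi_date_to_datetime(wmi_date):
    # A CIM_DATETIME string looks like "20120514093000.000000-300":
    # yyyymmddHHMMSS followed by fractional seconds and a UTC offset.
    return datetime.strptime(wmi_date[:14], "%Y%m%d%H%M%S")

print(format_path("C:\\inetpub\\logs\\LogFiles"))      # \\inetpub\\logs\\LogFiles\\
print(wmi_date_to_datetime("20120514093000.000000-300"))  # 2012-05-14 09:30:00
```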
You can control, monitor, and change Windows Scheduled Tasks using schtasks from the command line. You can use it on your local machine or on a remote machine such as a server.
In the examples below let’s assume that the Windows Scheduled Task you are interested in is called EventLogImport.
IMPORTANT: For any of the commands or examples below, you can add a /S and then the servername to execute the command against a remote machine running Windows.
schtasks /QUERY /TN EventLogImport
schtasks /QUERY /S serverNameHere /TN EventLogImport
schtasks /QUERY /V /TN EventLogImport
schtasks /QUERY /FO CSV /V /TN EventLogImport
NOTE: The other /FO format options are TABLE and LIST. The /V is for verbose output.
To remove the header row you can use the MORE command to start reading at the second line
schtasks /QUERY /FO CSV /V /TN EventLogImport | more +1
A single > will overwrite status.csv if it exists; otherwise it will create a new file.
schtasks /QUERY /FO CSV /V /TN EventLogImport > c:\temp\status.csv
To append to the file if you call two or more of these in a row, use the >> instead of >.
schtasks /QUERY /FO CSV /V /TN EventLogImport >> c:\temp\status.csv
Combining the two, you can strip the header row with MORE and append the result to a running CSV file:
schtasks /QUERY /FO CSV /V /TN EventLogImport | more +1 >> EventLogImportScheduledJobStatus.csv
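Once you have the CSV, it is easy to post-process. Here is a hedged Python sketch of one way to pull out tasks that are not in a healthy state; the column names ("TaskName", "Status") and status values are assumptions, so check the header row your version of schtasks actually emits.

```python
import csv
import io

def failing_tasks(csv_text):
    # Parse the verbose CSV that `schtasks /QUERY /FO CSV /V` emits and
    # return the names of tasks whose Status is not Ready or Running.
    # Column names here are assumptions; verify against your output.
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["TaskName"] for row in reader
            if row.get("Status") not in ("Ready", "Running")]

sample = '"TaskName","Status"\n"\\EventLogImport","Ready"\n"\\OldJob","Disabled"\n'
print(failing_tasks(sample))  # ['\\OldJob']
```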
schtasks /CHANGE /TN EventLogImport /DISABLE
schtasks /CHANGE /TN EventLogImport /ENABLE
You can also create, delete, run, or kill scheduled tasks. For information on how to do this, I recommend typing schtasks /? at any command prompt.
For help on a specific switch, try one of the following:
SCHTASKS
SCHTASKS /?
SCHTASKS /Run /?
SCHTASKS /End /?
SCHTASKS /Create /?
SCHTASKS /Delete /?
SCHTASKS /Query /?
SCHTASKS /Change /?
SCHTASKS /ShowSid /?
Searching for files whose names contain a given string sounds like a tall order at first, but it also sounds like something that should be built into Windows. In particular, I am using Windows 7, but I assume it works on XP as well.
The solution is actually quite simple and is indeed built into Windows 7.
dir /s /b *my string*
This will give you something similar to:
c:\temp\This is a test of my string.txt
c:\mydocs\My string could be found here.txt
As shown above, it will search the current directory and all directories below it for files whose names contain 'my string'. You can also specify the directory you want to start searching in, regardless of your current directory.
dir /s /b "c:\mydocs\*my string*"
You can use all the standard wild card syntax such as the following to search only .html files.
dir /s /b "c:\mydocs\*my string*.html"
In case you are wondering, the /s is for recursive and the /b is for bare output: just the path\filename, with no file info such as date or size.
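If you ever need the same recursive name search from a script rather than the command line, here is a small Python sketch of the equivalent of `dir /s /b root\pattern`; the function name and the case-insensitive matching (to mimic Windows behavior) are my own choices.

```python
import fnmatch
import os

def find_files(root, pattern):
    # Walk root recursively and return the full paths of files whose
    # names match the wildcard pattern, like `dir /s /b root\pattern`.
    # Matching is lowercased to mimic Windows' case-insensitive names.
    matches = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if fnmatch.fnmatch(name.lower(), pattern.lower()):
                matches.append(os.path.join(dirpath, name))
    return matches
```

For example, `find_files(r"c:\mydocs", "*my string*.html")` would return the same paths the second dir command above prints.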
I love Beyond Compare. It is so useful. It allows you to compare contents of files in a very nice way. It can also compare entire directories. It can even compare and copy binary files.
In many ways, I prefer it to the built-in Windows copy functionality, because if there is an error in Windows copy, you have to try again, and I never really know the state of the destination folder. When I use Beyond Compare, if there is an error all I have to do is look at the differences or simply try again. I also get that warm fuzzy when I see that there is no difference between the source and destination directories.
I also like to have the same warm fuzzy when it comes to "backups". I don't trust backup solutions in general. It isn't that they aren't good; it is that I don't trust that I am configuring them right. What I want is to be able to see the files on the destination drive after a backup. Then, if I can use a tool like Beyond Compare to verify the source and destination directories are the same, I feel confident that what I think I backed up is actually backed up.
One important point with this model is that I am not really talking about a true backup in the sense that most backup software provides. Most backup software saves different versions of a given file for a certain period of time. While what I am talking about here could be modified to do so, it would be a substantially different solution. What I am describing is really a way to snapshot my important data to another disk in the event that my disk fails. I am not worried about changing a file and needing a previous version of it; I use manual copies or version control for that type of stuff. I don't need this for most files, thus I only need the most recent snapshot of my important data.
You can actually use Beyond Compare to do what I just described. However, I want an automated way of doing this. I also want to be notified of any errors. There is nothing worse than thinking your backup is running, and then when you need it most, you realize it was broken. So, always periodically check that your backup process is working.
Below are the scripts I have created to automate the process. The only software needed is Beyond Compare, Microsoft LogParser, and Windows XP or greater. LogParser is only needed if you want to be shown any errors that occur; if you don't mind checking the log files by hand, you don't need it.
The scripts below will need to be tweaked to match your environment. My environment is such that I have an external hard drive that I back up to. You can also use UNC paths if you want to use a Windows share.
You can download a zip file of the same basic thing, or you can follow the instructions below.
Assume E:\ is the volume that you are backing up to.
This is the main file that you will run to do the backup. You can use Windows Scheduled Tasks to run it on a regular interval to automate the backup process. You need to modify this file to have a line for each of the saved sessions you created in the initial setup. If you are backing up a SQL Server database directory, be sure to stop SQL Server first; otherwise the files will be locked and cannot be backed up. You could also schedule a backup of the SQL Server databases and then back up the database backups. The choice is yours. The backup runs on my laptop, where I don't have enough disk space to make backups of the databases just to back them up, but I have no problem stopping SQL Server since I am the only one using it.
Below is a Beyond Compare script. It tells Beyond Compare to clear any read-only flags in the destination directory so that we won't get any errors when replacing files, and then makes the source and destination directories the same. There are other options; check the Beyond Compare docs on how to modify the sync command (see the last line) to change its behavior. The rest of the script is there to clear the read-only flags.
# Backup Directory
# Turn on logging
log verbose %LOGS_PATH%\backup-%BACKUP_NAME%.log
# Load the deployment directory on the server and the backup directory on the server
load %BACKUP_NAME%
#Must expand all folders so that select will work
expand all
# Select all files in the backup directory on server
select right
# Clear the read only flag on the backup directory on server
attrib -r
# Assume yes to all confirmation messages
option confirm:yes-to-all
# Copy all files from local directory to backup directory on the server
sync mirror:left->right
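To make the mirror behavior concrete, here is a rough Python stand-in for what the sync step does: copy new and changed files left to right, delete anything on the right that the left no longer has, and recurse into subfolders. This is my own sketch for illustration, not how Beyond Compare is implemented, and it skips details like read-only flags.

```python
import filecmp
import os
import shutil

def mirror(src, dst):
    # Make dst an exact copy of src: copy new and changed files from
    # src, delete anything in dst that src does not have, then recurse.
    os.makedirs(dst, exist_ok=True)
    cmp = filecmp.dircmp(src, dst)
    for name in cmp.left_only + cmp.diff_files:
        s, d = os.path.join(src, name), os.path.join(dst, name)
        if os.path.isdir(s):
            shutil.copytree(s, d)
        else:
            shutil.copy2(s, d)
    for name in cmp.right_only:
        d = os.path.join(dst, name)
        if os.path.isdir(d):
            shutil.rmtree(d)
        else:
            os.remove(d)
    for name in cmp.common_dirs:
        mirror(os.path.join(src, name), os.path.join(dst, name))
```

The point of spelling it out is the deletion step: a mirror sync is destructive on the right side, which is exactly why I want the backup target to be a dedicated drive or share.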
This file is very simple and doesn’t really need to be modified unless you changed the path to the Logs directory, or have installed Beyond Compare in a different path. This just tells Beyond Compare to run as quietly as possible so that it doesn’t disturb us. :)
SET LOGS_PATH=Logs
IF NOT EXIST %LOGS_PATH% md %LOGS_PATH%
set BACKUP_NAME=%1%
"C:\Program Files\Beyond Compare 2\bc2.exe" /silent @backupDirectory.bc
This file is optional. You really only need it if you want to scan the log files for errors and have them shown to you. It tells MS LogParser to query the logs we created, using the query defined in the CheckForErrors.sql file. Adjust the path to LogParser if needed or if a different version is used.
"C:\Program Files\Log Parser 2.2\LogParser" file:CheckForErrors.sql -i:TEXTLINE
As I said before, this file is optional. Adjust the path to your log files if you didn't use the same path as I did.
select * into DATAGRID from E:\backupScript\Logs\*.log where Text like '%Script Error%'
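If you would rather not install LogParser, the same filter is easy to script. Here is a minimal Python sketch of the query above; the function name and the tuple layout of the results are my own, and only the "Script Error" search string comes from the original query.

```python
import glob
import os

def find_script_errors(log_dir):
    # Scan every .log file in log_dir for lines mentioning "Script Error",
    # the same filter the CheckForErrors.sql LogParser query applies.
    hits = []
    for path in sorted(glob.glob(os.path.join(log_dir, "*.log"))):
        with open(path, encoding="utf-8", errors="replace") as f:
            for lineno, line in enumerate(f, 1):
                if "Script Error" in line:
                    hits.append((path, lineno, line.rstrip()))
    return hits
```

Calling `find_script_errors(r"E:\backupScript\Logs")` would list each offending file, line number, and line, instead of LogParser's DATAGRID view.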
When I move a project from my computer to the production server, I like to create a label in Visual SourceSafe (VSS) that notes the date and time and that something was deployed. This allows me and others to always know what is in production and how long it has been there. It also makes it easy to do a bug fix. It is nice to have this at the click of a batch file. Otherwise, I am just too lazy most of the time to wait for VSS (it is on a slow connection, so it takes a long time to load or do anything), remember to check in my code, find the project I am working on, create a label, and type in the date and time while trying to remember the format of my label so that it stays consistent. Here is the code I put in a batch file (just create a text file and change the extension to .bat) using my favorite text editor. For more information on how the NOW variable is populated and its limitations, see my other entry at: http://justgeeks.blogspot.com/2008/07/getting-formatted-date-into-variable-in.html. If you are not using the US English date format, you will definitely need to change the line that sets NOW to work with your locale. Also, you will need to adjust the paths in this script to match your environment.
@echo off
set NOW=%date:~10,4%-%date:~4,2%-%date:~7,2%--%time:~0,2%.%time:~3,2%
echo Please make sure all changes are checked in
pause
set SSDIR=\\myVssServer\someDirWhereIniFileIs
"C:\Program Files\Microsoft Visual SourceSafe\ss" Label $/MyVssProjectPath -LDEPLOYED-%NOW% -C-
echo A label has been created in VSS called: DEPLOYED-%NOW%
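The %date%/%time% substring slicing is the fragile, locale-sensitive part of that batch file. For comparison, here is a locale-independent sketch of the same DEPLOYED label in Python; the function name is mine, only the label format comes from the batch file.

```python
from datetime import datetime

def deployment_label(now=None):
    # Build the same "DEPLOYED-yyyy-MM-dd--HH.mm" label the batch file
    # assembles from %date% and %time%, without any locale assumptions.
    now = now or datetime.now()
    return now.strftime("DEPLOYED-%Y-%m-%d--%H.%M")

print(deployment_label(datetime(2012, 5, 14, 9, 30)))  # DEPLOYED-2012-05-14--09.30
```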