Archive for November, 2013

Rename multiple FILES using POWERSHELL

November 30, 2013

Requirement ::

I needed to upload my educational certificates to a particular website. After I was done scanning my docs, I realised that the site won't accept PDF (or any other format) documents that have special characters in the name, for example ( , & , * , @ , $ etc.

In my case the documents were named Document(1).PDF, so I needed the names changed to just Document1.PDF.

(Snapshot: the docs as they originally looked.)


To change the names of the PDF files, I used PowerGUI and the command below to set my location to the path containing the PDF files. SL stands for Set-Location and is equivalent to CD:

sl 'D:\GS\DOCS\TEST_scancopy'

Then I used the command below to get a count of files that had round brackets [ ( ) ] in their names:

get-childitem -Filter '*(*' | measure

This gave me an idea of how many docs needed to be renamed. I used a while loop running 29 times to ensure all the PDF files having *(* in their name are set to the required name; Rename-Item was used to set the new names of the PDF files.

Below is the PS script I used to rename the docs. As I have around 29 docs, I have used 30 as the number of times the loop will run.

___________________SCRIPT  STARTs __________

$int = 1

Write-Host $int

while ($int -ne 30)
{
    Write-Host 'within while loop'
    Write-Host $int

    # the name to which the next doc will be set
    $var = 'Document' + $int.ToString() + '.PDF'

    get-childitem -Filter '*(*' | select -first 1 | rename-item -NewName $var

    Write-Host 'renamed to ' $var

    $int++
}


___________________SCRIPT  ENDs __________
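The fixed loop count works for a known batch, but the same cleanup can be sketched without counting files at all. Here is the idea in Python, a minimal sketch under my own assumptions: the character class and the folder path are placeholders taken from this post, not a tested tool.

```python
import re

# Characters the upload site rejected; extend the class as needed.
SPECIAL = re.compile(r"[()&*@$,]")

def clean_name(name: str) -> str:
    """Strip special characters from a file name:
    'Document(1).PDF' -> 'Document1.PDF'."""
    return SPECIAL.sub("", name)

# Renaming every matching file in a folder would then look like:
#   import pathlib
#   for p in pathlib.Path(r"D:\GS\DOCS\TEST_scancopy").glob("*(*"):
#       p.rename(p.with_name(clean_name(p.name)))

print(clean_name("Document(1).PDF"))  # Document1.PDF
```

This avoids hard-coding 30 iterations: the loop simply ends when the glob finds no more files with brackets in their names.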

The script ran fine. You may use a similar approach to tackle any other requirement specific to your need. ENJOY 🙂





Running DBCC CHECKDB on all databases at once !!!

November 26, 2013


Issue ::

Sometimes it so happens that the storage assigned to SQL Servers takes a sudden, unexpected reboot, and in those moments we may have lost writes in databases, which can cause them to be marked as suspect or cause issues while accessing objects.

In those moments we have to ensure consistency and allocation are intact across all DBs. Running DBCC CHECKDB manually on each and every database is time-consuming (plus multiple windows have to be opened).

SCRIPT :: the script below will help us run DBCC CHECKDB on all DBs once we have a count of all DBs present on our instance. (The other option you have to achieve the same is to use a CHECKDB job, but with the script below you will be able to view the errors in Query Analyzer itself.)


1> Use sp_helpdb to check the total number of databases present on your instance. According to the total number of DBs, use dbid >= * and dbid <= * ranges to cover all DBs.

--Script to run in different windows, by commenting/uncommenting the dbid >= * and dbid <= * part of the where clause



—————————————-SCRIPT START ————————————————-

DECLARE @name VARCHAR(50)  -- database name
DECLARE @SQL VARCHAR(200)  -- DBCC command to run

DECLARE db_cursor CURSOR FOR
SELECT name
FROM master.dbo.sysdatabases
WHERE name NOT IN ('tempdb')
and dbid >= 1 and dbid <= 20     ---- window 1
--and dbid >= 21 and dbid <= 50  ---- uncomment and run from QA window 2
--and dbid >= 51 and dbid <= 100 ---- uncomment this and run from QA window 3

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQL = 'DBCC CHECKDB (' + @name + ')'
    PRINT @SQL

    EXEC (@SQL)

    FETCH NEXT FROM db_cursor INTO @name
END

CLOSE db_cursor
DEALLOCATE db_cursor



—————————————-SCRIPT END————————————————-
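The loop body only concatenates a string and executes it, so the command construction is easy to verify outside SQL Server. Here is the same construction sketched in Python (the database names are made-up examples; a real version would read them from sys.databases through a driver):

```python
def checkdb_commands(db_names):
    """Build one DBCC CHECKDB statement per database, skipping tempdb
    just like the WHERE clause in the cursor above."""
    return ["DBCC CHECKDB ([{0}])".format(name)
            for name in db_names
            if name != "tempdb"]

for cmd in checkdb_commands(["master", "tempdb", "model", "msdb"]):
    print(cmd)
# DBCC CHECKDB ([master])
# DBCC CHECKDB ([model])
# DBCC CHECKDB ([msdb])
```

Note the square brackets around the name: the T-SQL script concatenates the bare name, which works for simple names, but bracket-quoting (QUOTENAME() in T-SQL) also survives database names containing spaces or dashes.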




You should see DBCC output for each database. Go through the errors and see if allocation or integrity errors are reported for any DB; accordingly use the REPAIR_REBUILD or REPAIR_ALLOW_DATA_LOSS option of DBCC CHECKDB. Always prefer restoring from the latest good backup over using the repair options !!!!




Delete OLD SQL Backups using Powershell in one go !!!

November 23, 2013

Issue :

Sometimes the maintenance plans do not delete old backup files (based on their age) as we intended them to; the delete maintenance (sub)task may also skip files based on their creation date or age. Manually deleting the old .BAK (or any other) files is not really inspiring after a certain time.

Solution :

We can use PowerShell (PS) to help us achieve the same result with a few lines.

SETUP :: I'm backing up the databases into a test folder, but the PS script can be used to do the same task for any storage filer/server that you may be using to store backups -- JUST MAKE SURE YOU PASS the CORRECT PATH.

Use the script below to back up all databases to a specific path:

USE master

DECLARE @int varchar(15)
DECLARE @rand varchar(15)
DECLARE @path varchar(200)
DECLARE @cur varchar(128)

DECLARE curr CURSOR FOR
SELECT name FROM sys.databases WHERE database_id <> 2 -- skip tempdb

OPEN curr
FETCH NEXT FROM curr INTO @cur

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @cur

    SET @int = CONVERT(varchar(20), GETDATE(), 112)               -- date stamp, e.g. 20131123
    SET @rand = CONVERT(varchar(15), CAST(rand() * 1000 AS int))  -- random suffix
    SET @path = 'D:\POWERSHELL\TEST\New Folder\' + @cur + '_' + @int + '_' + @rand + '.BAK'
    --PRINT @path

    BACKUP DATABASE @cur TO DISK = @path

    FETCH NEXT FROM curr INTO @cur
END

CLOSE curr
DEALLOCATE curr

The script above sends full backups to the path --> D:\POWERSHELL\TEST\New Folder\ (choose any path that works for you !!!)

Once we have the backups in place (or we know which backups are to be deleted according to their age), start PowerShell:

START --> RUN --> POWERSHELL.EXE (works out of the box on all machines above Windows XP / Server 2003; for those two you need to download and install it)

You do not have to change your path to the folder from which you want to delete files (for example, an external storage location accessed like \\DATADOMAINSERVER01\SQLBACKUP\DAILY_Backups\); you can run the PowerShell script from whatever your current path is. In my case I'm switching to the location that holds the backups, because I need to show a few options.

The -Recurse option of DIR (Get-ChildItem) traverses all subfolders under the parent folder, hence with this option you will be able to delete old files in an external storage path too.

The results below show that DIR -Recurse is able to traverse through D:\POWERSHELL\TEST\ and a subfolder named New Folder (we excluded the .BAK files so as to minimize the output).

For this demo I filtered on files created only a few hours back using AddHours; in real life you will be looking for much older backups and will use AddDays, which works with days instead of hours. Use AddDays(-30) with the command below to get a count of files and folders whose creation date is older than 30 days:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | measure


NOW : we only want to work with .BAK files, and we may also like to exclude some files from the deletion command, so we will use -Include *.BAK (or *.SAFE) and -Exclude (to exclude any type of file, for example *.PS1). If you use SQL Safe or any other backup utility, use the required extension.

Basically, we are using -lt (the less-than comparison) to filter out files older than 30 days (in the demo it was older than a few hours). The command below will return only the first 2 files of type *.BAK:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | select -First 2


Use the command below to get an exact count of the *.BAK files you will be deleting:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | measure

If you are trying to get a count of files older than 30 days on a large file server, the command above will take some time to complete.

----------command to delete files older than a certain number of days----------

********** take care of the points below **************

1> Mention the correct path from which you want to delete backup files (even sub-folders will be taken into account).

2> Mention -Include *.BAK to make sure only backup files are deleted; also, if needed, exclude any application or config files that may possibly be on the mentioned path.

3> Make sure to mention the correct age or number of days of the files you want to delete: AddDays(-30) means you want to delete files a month old (use -60 for two months, etc.).

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | remove-item -force
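If you want to sanity-check the same age filter outside PowerShell, here is an equivalent sketch in Python. It is an illustration, not this post's script: the pattern, exclude, and dry_run parameters are my own additions, and dry_run=True only reports what would be removed, which is a safe first pass before deleting anything.

```python
import fnmatch
import os
import time

def delete_old_backups(root, pattern="*.BAK", exclude="*.PS1",
                       days=30, dry_run=True):
    """Recursively find (and optionally remove) backup files under `root`
    whose creation time is older than `days` days."""
    cutoff = time.time() - days * 86400
    victims = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not fnmatch.fnmatch(name, pattern):
                continue  # not a backup file
            if fnmatch.fnmatch(name, exclude):
                continue  # explicitly excluded
            full = os.path.join(dirpath, name)
            # getctime is creation time on Windows (matching CreationTime
            # in the PowerShell command); on Unix it is inode-change time.
            if os.path.getctime(full) < cutoff:
                victims.append(full)
                if not dry_run:
                    os.remove(full)
    return victims
```

Run it once with dry_run=True, eyeball the returned list, then flip the flag.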




The command above will delete backup files of type *.BAK (excluding files of type *.PS1) older than 30 days, under the path D:\POWERSHELL\TEST\New Folder and any subfolders it may have.

Try out this command and let me know your thoughts.