• The article you shared is very helpful; I am learning PowerShell to automate tasks. I'm from Mexico. Thank you!

  • Hi,
    can you please suggest how to grant local admin rights to an AD group in JEA?
    Can we set up a remote computer the same way as a DC?
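
    In case it helps later readers, here is a minimal sketch of a JEA session configuration that maps an AD group to a role. All names and paths (CONTOSO\ServerAdmins, ServerMaintenance, JEAMaintenance.pssc) are placeholders, not values from the article. Note that a JEA virtual account is, by default, a member of the local Administrators group on member servers:

    New-PSSessionConfigurationFile -Path .\JEAMaintenance.pssc `
        -SessionType RestrictedRemoteServer `
        -RunAsVirtualAccount `
        -RoleDefinitions @{ 'CONTOSO\ServerAdmins' = @{ RoleCapabilities = 'ServerMaintenance' } }

    Register-PSSessionConfiguration -Name 'JEAMaintenance' -Path .\JEAMaintenance.pssc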

  • The -ExecutionPolicy parameter should not be used at all in a company environment!
    Just make sure you have that option locked down by GPO, and use code signing to guarantee the integrity of the code.
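
    For anyone following that advice, signing a script can look roughly like this (a minimal sketch; it assumes a code-signing certificate already exists in the current user's store, and Deploy.ps1 is a placeholder):

    # Pick the first available code-signing certificate (assumption: one is installed).
    $cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1

    # Sign the script, then verify the signature status.
    Set-AuthenticodeSignature -FilePath .\Deploy.ps1 -Certificate $cert
    Get-AuthenticodeSignature -FilePath .\Deploy.ps1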

  • I have tested these PowerShell arguments to prevent any PowerShell pop-ups:

    powershell.exe -NoLogo -WindowStyle Hidden -File "C:\DeployFolder\xyz.ps1"

    Note that -File has to come last, because everything after it is passed to the script.

  • Hello, s31064 (wow, nice name, you could pretend to be one of Elon Musk's children and claim inheritance money :D)

    I'm glad you find this useful. Basically, I tried to make something that makes life easier. It's a short one-liner once you've referenced the function, so my scripts look clearer.
    I'm even happier when others find it useful, too.

    Cheers, Emanuel

  • Very nice, thank you. I was timing my scripts with

    $stopWatch = New-Object System.Diagnostics.Stopwatch
    $stopWatch.Start()
    # Script code goes here
    $CheckTime = $stopWatch.Elapsed
    if ($CheckTime.Days -gt 0)
    {
        Write-Verbose ("Script Completed in {0} days, {1} hours, {2} minutes`n" -f $CheckTime.Days, $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Hours -gt 0)
    {
        Write-Verbose ("Script Completed in {0} hours, {1} minutes`n" -f $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Minutes -gt 0)
    {
        Write-Verbose ("Script Completed in {0} minutes, {1} seconds`n" -f $CheckTime.Minutes, $CheckTime.Seconds)
    }
    else
    {
        Write-Verbose ("Script Completed in {0} seconds, {1} milliseconds, {2} ticks`n" -f $CheckTime.Seconds, $CheckTime.Milliseconds, $CheckTime.Ticks)
    }

    I still use stopwatch, but I’ve replaced all of the if, elseif, else at the end with a simple

    $CheckTime = $stopWatch.Elapsed
    "Script took $($CheckTime | Get-TimeSpanPretty)."
    $stopWatch.Stop()

    Much neater and pretty much exactly the same information. Good job.
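
    For anyone reading the comments without the article open, a rough stand-in for what a Get-TimeSpanPretty-style helper can look like (a sketch, not the article's actual function):

    function Get-TimeSpanPretty {
        param([Parameter(Mandatory, ValueFromPipeline)][TimeSpan]$TimeSpan)
        process {
            # Collect only the non-zero units, largest first.
            $parts = foreach ($unit in 'Days', 'Hours', 'Minutes', 'Seconds') {
                if ($TimeSpan.$unit -gt 0) { '{0} {1}' -f $TimeSpan.$unit, $unit.ToLower() }
            }
            if ($parts) { $parts -join ', ' } else { '{0} milliseconds' -f $TimeSpan.Milliseconds }
        }
    }

    # Example: (New-TimeSpan -Minutes 3 -Seconds 12) | Get-TimeSpanPretty  ->  3 minutes, 12 seconds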

  • Kudos for publishing this method in PowerShell.
    Years ago, I wrote the same app in Delphi, but used CRC32 as the hashing tool.

    Part of that app is a non-recursive file search that starts at the specified parent directory and then searches all child leaves below. Each filespec is recorded in a TOPAZ (DB4) flat-file database, which is a source-code-level toolbox for Delphi.

    On the first pass, I keep track of all three file create/modified/accessed time stamps, the fully qualified file name, and the file size in the database.

    The DB is indexed on file size and all identical-sized files are further examined.
    Those with different file extensions are skipped.

    Those with identical extensions then have their CRC32 calculated and stored in the database. This is a slow process when examining 30,000 to 50,000 files. I rewrote CRC32 in assembler but got no appreciable performance boost, as the Delphi compiler is very good at optimizing code. Profiling the code confirmed that CRC32 is indeed the slow point.

    Last, I don't delete duplicate files but rename them with a representation of their fully qualified file name, e.g., "root-mydocuments-pictures-vacation.img1234.jpg", and then move them all as unique file names to a common directory called !Duplicate Files for safekeeping until I'm certain they can be deleted.
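
    The same size-first, hash-second idea translates to PowerShell roughly like this (a sketch: $root is a placeholder, and SHA256 stands in for CRC32, which Get-FileHash does not offer):

    $root = 'C:\Data'   # hypothetical starting directory

    # Pass 1: group by size; only identically sized files can be duplicates.
    $sizeGroups = Get-ChildItem -Path $root -File -Recurse |
        Group-Object -Property Length |
        Where-Object { $_.Count -gt 1 }

    # Pass 2: hash only those candidates and group by hash.
    $duplicates = $sizeGroups.Group |
        Get-FileHash -Algorithm SHA256 |
        Group-Object -Property Hash |
        Where-Object { $_.Count -gt 1 }

    # Each remaining group holds the paths of files with identical content.
    $duplicates | ForEach-Object { $_.Group.Path }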

  • There are multiple techniques to find duplicates, but the most challenging part is scanning directories efficiently. Get-ChildItem is sequential and limited to the 260-character path length (solved in PS7). To scan a disk with millions of files, I created a C# app that P/Invokes FindFirstFile, thus solving the path length issue, and uses async functions and a channel. Scanning a 12TB disk for files over 1MB takes less than three minutes (including the file system owner!). Most of the time, file name, date, and size return 90%+ of the duplicates. Hashing is an expensive operation on millions of files. Putting a threshold on the size helps with performance, too.

    De-duplication can also be used so you don't need to care about duplicates. It has a 'cost', of course.
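
    The metadata-first shortcut described above might look like this in PowerShell (a sketch; $root and the 1MB threshold are placeholders):

    # Group by name, size, and last-write time; no hashing involved.
    Get-ChildItem -Path $root -File -Recurse |
        Where-Object { $_.Length -gt 1MB } |
        Group-Object -Property Name, Length, LastWriteTime |
        Where-Object { $_.Count -gt 1 } |
        ForEach-Object { $_.Group.FullName }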

  • Thank you for this. Excellent. I checked your blog, too. Enough there to keep me busy for the rest of my life. It looks like I won't need any other sources for what I want to know.
    I've got big problems with duplicate files, I know. But I also know from experience that fixing it is not a simple job. You have to be very careful, right?
    Blindly eliminating duplicates can wreck everything, as I think I have found in the past.
    But that may have been because my duplicate-elimination software was too simplistic, perhaps not checking file length and calling files 'duplicates' merely because they had the same name.
    Which can destroy all your config.txt or similar in a flash.
    Whatever it was, I know I created a real hell for myself in the past by trying to organise files and eliminate duplicates.
    'Organising' meant collecting like files together in one place. That didn't work. Moving files is the same as deleting them to software that expects to find them where they were.
    Part of the problem is hinted at here, where you caution to make backups before doing anything. Well, that's just it. We make backups and then get interrupted before the whole task is complete, which can take a long time when using only 'spare' time, with maybe as many as eight disks to work on, something greater than 10 terabytes. And there we are, partway through the process, with 'extra' duplicates created just for the sake of the exercise!
    I would very much like a workable technique for this.
    Currently I have lapsed into 'redundancy' mode: i.e., I never delete anything and just keep buying hard drives.
    There's gotta be a better way, but it's not simply running some duplicate finder/eliminator software, is it?
    There has to be some kind of rigorous, careful, and foolproof procedure.
    And I can't devise it.
    But this stuff is good. I like it very much.
    I thank you for it.

  • You can use PowerShell to find and remove duplicate files that are only wasting valuable storage. Once you identify duplicate files, you can move them to another location, or you might even want to permanently remove all duplicates.

  • Hi,
    in Azure Front Door Standard, if I want to implement rule sets in bulk, is it possible to automate this via PowerShell?
    If yes, can you please help me with it?
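
    For what it's worth, bulk creation of rule sets can be scripted with the Az.Cdn module along these lines (a sketch; the resource group, profile, and rule set names are placeholders, and Connect-AzAccount is assumed to have been run):

    $ruleSetNames = 'RuleSetA', 'RuleSetB', 'RuleSetC'
    foreach ($name in $ruleSetNames) {
        # Creates an empty rule set in the Front Door Standard/Premium profile.
        New-AzFrontDoorCdnRuleSet -ResourceGroupName 'MyRg' -ProfileName 'MyFrontDoorProfile' -Name $name
    }

    Populating the rules inside each set depends on the actions you need and is done with the corresponding rule cmdlets in Az.Cdn.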

  • OMG, I'm not the only one. I have a psychotic neighbor doing the same thing to my entire network. He's even hacked into my DVR units and IP cameras, and the local police are totally useless. No attempt to shut down access to my devices has been successful, and I've tried nearly everything. Has anyone here found a way to get rid of the pests in their network??

  • PsInfo, a command-line tool that is part of the Sysinternals suite, gets key information about Windows systems, such as the type of installation, available disk space, installed applications, kernel build number, uptime, registered owner, and the number of processors.
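
    Typical invocations look like this (Server01 is a placeholder; PsInfo must be on the PATH or run from the Sysinternals folder):

    psinfo                      # summary of the local system
    psinfo \\Server01 -s -d     # remote system, plus installed software (-s) and disk info (-d)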

  • I fully expected it would need tweaking; the main thing was to show the idea I had for how to put together that information.

    The stuff you show in your example is obviously different from what I had, even though I can't be 100% sure what I had was right (I was doing it on the fly, and I'm not a RegEx expert). The idea is to take the output from the two commands (qwinsta, quser), parse it, assemble a hash table of the results, and use that to run the third command (qprocess) to get the activity; a sketch follows below.

    David F.
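
    A rough sketch of that idea (hedged: quser's columns are fixed-width, vary by locale, and the SESSIONNAME column is empty for disconnected sessions, so real parsing needs more care):

    $sessions = @{}
    (quser) | Select-Object -Skip 1 | ForEach-Object {
        # Strip the '>' marking the current session, then split on runs of spaces.
        $fields = ($_ -replace '^[\s>]+', '') -split '\s{2,}'
        # $fields[0] is the user name; the ID column shifts left for disconnected sessions.
        $sessions[$fields[0]] = $fields
    }

    # Feed each user into qprocess to list their activity.
    foreach ($user in $sessions.Keys) { qprocess $user }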

  • Thanks Michael, great scripts. After checking whether the server needs a reboot or has any pending reboots, if I only need to restart services on a server rather than do a complete restart of the server, can I just use the cmdlet

    Restart-Service <service-name> -PassThru

    Please let me know?

    Thanks again
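
    For reference, Restart-Service does restart just the service without rebooting, and -PassThru returns the service object so you can confirm its Status afterwards. On a remote server, it might look like this (a sketch; Server01 and Spooler are placeholders):

    Invoke-Command -ComputerName Server01 -ScriptBlock {
        Restart-Service -Name 'Spooler' -PassThru
    }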

  • The final code is definitely more complicated than where I began. But the code itself isn't the point of the article. I was demonstrating different ways you can use $PSBoundParameters. You might use one of the concepts in your code to simplify something.
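
    One such concept in miniature (a hypothetical function, just to illustrate the mechanics):

    function Invoke-Demo {
        [CmdletBinding()]
        param(
            [string]$Name,
            [int]$Count
        )
        # $PSBoundParameters contains only the parameters the caller actually supplied,
        # so you can distinguish "not passed" from "passed a default-looking value".
        if ($PSBoundParameters.ContainsKey('Count')) {
            Write-Verbose "Caller explicitly supplied -Count $Count"
        }
        # The dictionary is also splattable, which is handy for forwarding arguments.
        $PSBoundParameters | Out-String | Write-Host
    }

    Invoke-Demo -Name 'demo' -Verbose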
