• It is not working. I don’t think the module is installed on my computer. The install command is not doing anything. I checked the GitHub link for it, and it makes no sense to me. What does this mean: “Drop the root folder in your PSModulePath”? I don’t see a root folder there, and where is the PSModulePath?
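
    For context, PSModulePath is an environment variable listing the folders PowerShell searches for modules, and the “root folder” is the module’s top-level folder (the one containing the .psd1/.psm1 files), which you copy into one of those locations. A quick way to list them (a generic sketch, not specific to any particular module):

    ```powershell
    # Show every folder PowerShell searches for modules
    $env:PSModulePath -split [IO.Path]::PathSeparator
    ```

    Copying the module’s folder into the per-user path (typically Documents\WindowsPowerShell\Modules) is usually enough for Import-Module to find it.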

  • For those looking for a way to do this in Exchange Online / Office 365:
    – Create the PST in a local Outlook client (we created a temp mailbox for this)
    – Use network upload to import your PST to Microsoft 365 (you can find step-by-step instructions from Microsoft here: https://learn.microsoft.com/en-us/microsoft-365/compliance/use-network-upload-to-import-pst-files?view=o365-worldwide)

    Good luck!
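
    As a rough sketch of the upload step from that Microsoft article: after you obtain the SAS URL from the compliance portal, the PST is copied to Azure blob storage with AzCopy (v10 syntax shown; the PST path and the SAS URL below are placeholders, not real values):

    ```powershell
    # Upload the exported PST to the Microsoft 365 ingestion container
    # Replace the PST path and paste the real SAS URL from the compliance portal
    azcopy.exe copy "C:\PSTs\tempmailbox.pst" "<SAS URL from the portal>"
    ```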

  • Unfortunately, the ComputerName parameter is not supported, but you can always use Invoke-Command, as shown below, to run these commands on a remote computer:

    Invoke-Command -ComputerName PC1 -ScriptBlock {Get-WiFiProfile -ProfileName MyWiFi}
  • How do I use this on a remote computer on my domain?

  • I was able to use this info to retrieve data not even present in our Varonis, so double awesome. Very good write-up!

  • Your shared article is very helpful, I am learning PowerShell to automate tasks. I’m from Mexico. Thank you

  • Hi,
    can you please suggest how to provide local admin rights for an AD group in JEA?
    Can we set up the remote computer the same way as a DC?

  • The -ExecutionPolicy parameter should not be used at all in a company environment!
    Just make sure you have that option disabled by GPO and use code signing to guarantee the security of the code.
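
    To expand on the code-signing suggestion, signing a script takes only a couple of lines once a certificate with the code-signing EKU is installed (the script path below is just an example):

    ```powershell
    # Pick a code-signing certificate from the current user's store and sign the script
    $cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
    Set-AuthenticodeSignature -FilePath 'C:\Scripts\Deploy.ps1' -Certificate $cert -TimestampServer 'http://timestamp.digicert.com'
    ```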

  • I have tested these PowerShell arguments to prevent any PowerShell pop-ups (note that -File has to come last, because everything after it is passed to the script as arguments):

    -NoLogo -WindowStyle Hidden -File "C:\DeployFolder\xyz.ps1"

  • Hello, s31064 (wow, nice name, you could pretend to be one of Elon Musk’s children and claim inheritance money :D)

    I’m glad you find this useful. Basically, I tried to make something that makes life easier. It’s a short one-liner once you’ve referenced the function, so my scripts look clearer.
    I’m even happier when others find it useful, too.

    Cheers. Emanuel

  • Very nice, thank you. I was timing my scripts with

    $stopWatch = [System.Diagnostics.Stopwatch]::StartNew()
    # Script code goes here
    $CheckTime = $stopWatch.Elapsed
    if ($CheckTime.Days -gt 0) {
        Write-Verbose ("Script completed in {0} days, {1} hours, {2} minutes`n" -f $CheckTime.Days, $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Hours -gt 0) {
        Write-Verbose ("Script completed in {0} hours, {1} minutes`n" -f $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Minutes -gt 0) {
        Write-Verbose ("Script completed in {0} minutes, {1} seconds`n" -f $CheckTime.Minutes, $CheckTime.Seconds)
    }
    else {
        Write-Verbose ("Script completed in {0} seconds, {1} milliseconds, {2} ticks`n" -f $CheckTime.Seconds, $CheckTime.Milliseconds, $CheckTime.Ticks)
    }

    I still use stopwatch, but I’ve replaced all of the if, elseif, else at the end with a simple

    $CheckTime = $stopWatch.Elapsed
    "Script took $($CheckTime | Get-TimeSpanPretty)."

    Much neater and pretty much exactly the same information. Good job.
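
    As a side note, for one-off timings Measure-Command returns the same kind of TimeSpan without managing a Stopwatch object by hand:

    ```powershell
    # Time a block of code; Measure-Command returns a TimeSpan
    $elapsed = Measure-Command { Start-Sleep -Milliseconds 200 }
    '{0:n0} ms' -f $elapsed.TotalMilliseconds
    ```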

  • Kudos for publishing this method in Powershell.
    Years ago, I wrote the same app in Delphi, but use CRC32 as the hashing tool.

    Part of that app is a non-recursive file search that starts at the specified parent directory, then searches all child leaves below. Each filespec is recorded in a TOPAZ (DB4) flat-file database, which is a source-code-level toolbox for Delphi.

    On the first pass, I keep track of all three file create/modified/accessed time stamps, the fully qualified file name, and the file size in the database.

    The DB is indexed on file size and all identical-sized files are further examined.
    Those with different file extensions are skipped.

    Those with identical extensions are then each calculated for CRC32 and stored in the database. This is a slow process when examining 30,000 to 50,000 files. I rewrote CRC32 into Assembler but got no appreciable boost in performance, as the Delphi compiler is very good at optimizing code performance. Profiling the code, the CRC32 is indeed the slow point.

    Last, I don’t delete duplicate files but rename them with a representation of their fully qualified file name, e.g., “root-mydocuments-pictures-vacation.img1234.jpg”, then move them all as unique file names to a common directory called !Duplicate Files for safekeeping until I’m certain I can delete them.
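
    That rename-and-quarantine step translates to PowerShell fairly directly; a minimal sketch, assuming hypothetical file and quarantine-folder paths:

    ```powershell
    # Rename a duplicate so its new name encodes the original path,
    # then park it in a quarantine folder instead of deleting it
    $file = Get-Item 'C:\Users\Me\Documents\Pictures\vacation\img1234.jpg'
    $flatName = ($file.FullName -replace '^[A-Za-z]:\\' -replace '\\', '-')
    Move-Item -LiteralPath $file.FullName -Destination (Join-Path 'C:\!Duplicate Files' $flatName)
    ```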

  • There are multiple techniques to find duplicates, but the most challenging part is scanning directories efficiently. Get-ChildItem is sequential and limited to the 260-character path length (solved in PS7). To scan a disk with millions of files, I created a C# app that P/Invokes FindFirstFile, thus solving the path-length issue, and uses async functions and a channel. Scanning a 12TB disk for files over 1MB takes less than three minutes (including the file system owner!). Most of the time, file name, date, and size return 90%+ of the duplicates. Hashing is an expensive operation on millions of files. Putting a threshold on the size helps with performance, too.

    De-duplication can also be used so you don’t need to care about duplicates. It has a ‘cost’ of course.

  • Thank you for this. Excellent. I checked your blog, too. Enough there to keep me busy for the rest of my life. I won’t need any other sources for what I want to know, looks like.
    I’ve got big problems with duplicate files, I know. But I also know from experience it is not a simple job fixing it. You have to be very careful, right?
    Blindly eliminating duplicates can wreck everything I think I have found in the past.
    But that may have been because my duplicate elimination software was too simplistic – perhaps not checking file length, calling files ‘duplicates’ merely because they had the same name.
    Which can destroy all your config.txt or similar in a flash.
    Whatever it was I know I created a real hell for myself in the past by trying to organise files and eliminate duplicates.
    ‘Organising’ meant collecting like files together in one place. That didn’t work. Moving them is the same as deleting them to software that expects to find them there.
    Part of the problem is hinted at here where you caution to make backups before doing anything. Well that’s just it. We make backups and then get interrupted before the whole task is complete – which can take a long time when using only ‘spare’ time – and with maybe as many as eight disks to work on, something greater than 10 Terabytes – and there we are during the process with now ‘extra’ duplicates created just for the sake of the exercise!
    I very much would like a workable technique for this.
    Currently I have lapsed into the ‘redundancy’ mode: i.e. I never delete anything and just keep buying hard drives.
    There’s gotta be a better way but it’s not simply running some duplicate finder/eliminator software is it?
    There has to be some kind of rigorous and careful and fool proof procedure.
    And I can’t devise it.
    But this stuff is good. I like it very much.
    I thank you for it.

  • You can use PowerShell to find and remove duplicate files that are only wasting valuable storage. Once you identify duplicate files, you can move them to another location, or you might even want to permanently remove all duplicates.
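
    One way to sketch this with built-in cmdlets: group files by size first (cheap), hash only the size collisions (the expensive step stays small), then move every duplicate except the first copy. The source and destination folders are just examples:

    ```powershell
    # Find duplicate files: group by length, then hash only the size collisions
    $dupes = Get-ChildItem 'C:\Data' -File -Recurse |
        Group-Object Length | Where-Object Count -gt 1 |
        ForEach-Object { $_.Group } |
        Get-FileHash -Algorithm SHA256 |
        Group-Object Hash | Where-Object Count -gt 1

    # Move every duplicate except the first copy to a holding folder
    foreach ($group in $dupes) {
        $group.Group | Select-Object -Skip 1 |
            ForEach-Object { Move-Item -LiteralPath $_.Path -Destination 'C:\DuplicateFiles' }
    }
    ```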

  • Hi,
    in Azure Front Door Standard, if I want to implement rule sets in bulk, is it possible to automate this via PowerShell?
    If yes, can you please help me with it?

© 4sysops 2006 - 2022