• https://www.microsoft.com/en-us/microsoft-365/blog/2015/04/15/office-365-now-supports-larger-email-messages-up-to-150-mb/

    “OWA, however, restricts the size of the message you can send to 25 percent less than the configured allowed maximum send size.”

    I don’t know if the above is relevant (especially considering that 110 MB is right at the edge of 150 MB minus 25 percent, i.e., 112.5 MB), but I tested this out myself and had to increase my limit because the file I was testing with was within that margin (see the sketch below).

    The “remote server” part of the response is probably throwing you off, but the important part to focus on is that it indicates the message is too large for the sender, not the recipient. Typically, if the recipient rejects the message, you’ll get:
    “Your message wasn’t delivered because the recipient’s email provider rejected it.”
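
    For anyone hitting the same margin, this is roughly how to raise the limit. A minimal sketch, assuming Exchange Online PowerShell (the ExchangeOnlineManagement module); the mailbox address is a placeholder:

    Connect-ExchangeOnline
    # Raise both limits so the OWA ceiling (about 25 percent below MaxSendSize)
    # clears the file: 150 MB * 0.75 = 112.5 MB, just above the 110 MB message.
    Set-Mailbox -Identity "user@contoso.com" -MaxSendSize 150MB -MaxReceiveSize 150MB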

  • “Obviously, the recipient’s mailbox suffers from the same limits.”

    It’s not the recipient’s mailbox that’s returning this message; it’s actually the sender’s.

  • Hi.
    Please note that using the GPO method to configure SACLs will also distribute DACLs!
    This means that registry permissions will be modified!
    There’s a reference to this problem here:
    https://helpcenter.netwrix.com/bundle/Auditor_10.5/page/Content/Auditor/Configuration/WindowsServer/WS_Registry.htm

    -> NOTE: Using Group Policy for configuring registry audit is not recommended, as registry DACL settings may be lost.

    Please mention this in the article, as it can very easily go unnoticed. A way to set the SACL directly, without touching the DACL, is sketched below.
    Thanks.
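
    A minimal sketch of adding a registry audit entry (SACL) directly with PowerShell, leaving the existing DACL untouched; the key path is a placeholder, and this needs an elevated session:

    $key = "HKLM:\SOFTWARE\MyApp"              # placeholder key
    $acl = Get-Acl -Path $key -Audit           # -Audit also reads the SACL
    $rule = [System.Security.AccessControl.RegistryAuditRule]::new(
        "Everyone",
        [System.Security.AccessControl.RegistryRights]"SetValue, Delete",
        [System.Security.AccessControl.InheritanceFlags]::ContainerInherit,
        [System.Security.AccessControl.PropagationFlags]::None,
        [System.Security.AccessControl.AuditFlags]::Success)
    $acl.AddAuditRule($rule)
    Set-Acl -Path $key -AclObject $acl         # writes the SACL; the DACL stays as-is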

    The -ExecutionPolicy parameter should not be used at all in a company environment!
    Just make sure that option is locked down by GPO and use code signing to guarantee the security of the code (a signing sketch follows).
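
    A minimal sketch of the signing step, assuming a code-signing certificate is already in the user’s store; the script path is a placeholder:

    # Pick a code-signing certificate and sign the deployment script
    $cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
    Set-AuthenticodeSignature -FilePath "C:\DeployFolder\xyz.ps1" -Certificate $cert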

  • I have tested these PowerShell arguments to prevent any PowerShell pop-ups:

    -NoLogo -WindowStyle Hidden -File "C:\DeployFolder\xyz.ps1"

    Note that -File has to be the last parameter; everything after it is passed to the script as arguments.
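
    If the script runs from Task Scheduler, the same arguments can go into the task action. A sketch, reusing the placeholder path above:

    # Wrap the arguments in a scheduled-task action so no console window is shown
    $action = New-ScheduledTaskAction -Execute "powershell.exe" `
        -Argument '-NoLogo -NonInteractive -WindowStyle Hidden -File "C:\DeployFolder\xyz.ps1"'
    Register-ScheduledTask -TaskName "DeployXyz" -Action $action -User "SYSTEM"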

  • Hi John,

    I need your help with the TFTP code, since you recently added TFTP commands. Can you share the edited code with TFTP? I want to take a configuration backup in config file format, i.e., have TFTP take the configuration backup and place it in the TFTP directory. Something like the sketch below is what I am after.
    I would appreciate it if you could help me.
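
    A rough sketch only, assuming the Posh-SSH module and a Cisco-style device; the host name and TFTP server address are placeholders:

    Import-Module Posh-SSH
    $cred = Get-Credential
    $session = New-SSHSession -ComputerName "switch01" -Credential $cred
    # Ask the device to push its running config into the TFTP directory
    Invoke-SSHCommand -SessionId $session.SessionId -Command "copy running-config tftp://10.0.0.5/switch01-confg"
    Remove-SSHSession -SessionId $session.SessionId | Out-Null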

    Regards,
    Kashif

  • How do you get a copy of this phonebook app?

  • What is your situation? How did that work out for you? Are you still using Proxmox?

  • What is your perspective on Proxmox? Could we have an article about that too?!

  • Very nice, thank you. I was timing my scripts with

    $stopWatch = New-Object System.Diagnostics.Stopwatch
    $stopWatch.Start()
    # Script code goes here
    $CheckTime = $stopWatch.Elapsed
    if ($CheckTime.Days -gt 0)
    {
        Write-Verbose $("Script Completed in {0} days, {1} hours, {2} minutes`n" -f $CheckTime.Days, $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Hours -gt 0)
    {
        Write-Verbose $("Script Completed in {0} hours, {1} minutes`n" -f $CheckTime.Hours, $CheckTime.Minutes)
    }
    elseif ($CheckTime.Minutes -gt 0)
    {
        Write-Verbose $("Script Completed in {0} minutes, {1} seconds`n" -f $CheckTime.Minutes, $CheckTime.Seconds)
    }
    else
    {
        Write-Verbose $("Script Completed in {0} seconds, {1} milliseconds, {2} ticks`n" -f $CheckTime.Seconds, $CheckTime.Milliseconds, $CheckTime.Ticks)
    }

    I still use stopwatch, but I’ve replaced all of the if, elseif, else at the end with a simple

    $CheckTime = $stopWatch.Elapsed
    "Script took $($CheckTime | Get-TimeSpanPretty)."
    $stopWatch.Stop()

    Much neater and pretty much exactly the same information. Good job.

  • My DymaxIO dashboard says it needs a reboot. I have rebooted the computer, but it still says it needs a reboot. What should I do? How should I reboot?

  • Kudos for publishing this method in PowerShell.
    Years ago, I wrote the same app in Delphi but used CRC32 as the hashing tool.

    Part of that app is a non-recursive file search that starts at the specified parent directory and then walks every child directory below it. Each filespec is recorded in a TOPAZ (DB4) flat-file database, which is a source-code-level toolbox for Delphi.

    On the first pass, I record all three file created/modified/accessed timestamps, the fully qualified file name, and the file size in the database.

    The DB is indexed on file size, and all identically sized files are examined further.
    Those with different file extensions are skipped.

    For those with identical extensions, a CRC32 is then calculated and stored in the database. This is a slow process when examining 30,000 to 50,000 files. I rewrote CRC32 in assembler but got no appreciable boost in performance, as the Delphi compiler is very good at optimizing code. Profiling the code confirmed that CRC32 is indeed the slow point.

    Last, I don’t delete duplicate files but rename them with a representation of their fully qualified file name, e.g., “root-mydocuments-pictures-vacation.img1234.jpg”, and then move them all, under their now-unique file names, to a common directory called !Duplicate Files for safekeeping until I’m certain they can be deleted. A PowerShell equivalent of the size-then-hash stages is sketched below.
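
    A minimal sketch, with a placeholder path; it groups by size first so that hashing only runs on candidates (Get-FileHash has no CRC32, so SHA256 stands in):

    # Stage 1: group by size; only files sharing a size move on.
    # Stage 2: hash the survivors and group on the hash.
    Get-ChildItem -Path "D:\Data" -File -Recurse |
        Group-Object Length |
        Where-Object Count -gt 1 |
        ForEach-Object { $_.Group } |
        Group-Object { (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash } |
        Where-Object Count -gt 1 |
        ForEach-Object { $_.Group.FullName }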

  • Thank you so much for this. I was about to rebuild my WSUS servers because I could not get the “Basket” add-in to appear.
    It is a pain to add each patch manually, but at least I have been able to add the out-of-band patches I need for this month.

    THANK YOU!

  • There are multiple techniques to find duplicates, but the most challenging part is scanning directories efficiently. Get-ChildItem is sequential and limited to a 260-character path length (solved in PS 7). In order to scan a disk with millions of files, I created a C# app that P/Invokes FindFirstFile, thus solving the path-length issue, and uses async functions and a channel. Scanning a 12 TB disk for files over 1 MB takes less than three minutes (including the file system owner!). Most of the time, file name, date, and size return 90%+ of the duplicates; a sketch of that cheap pre-filter follows below. Hashing is an expensive operation on millions of files. Putting a threshold on the size helps with performance, too.

    De-duplication can also be used so you don’t need to care about duplicates. It has a ‘cost’ of course.
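
    The pre-filter in PowerShell, as a sketch with a placeholder path and a 1 MB threshold:

    # Name + size + last-write time matches on files of at least 1 MB; no hashing involved
    Get-ChildItem -Path "D:\Data" -File -Recurse |
        Where-Object Length -ge 1MB |
        Group-Object Name, Length, LastWriteTime |
        Where-Object Count -gt 1 |
        ForEach-Object { $_.Group.FullName }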

  • Thank you for this. Excellent. I checked your blog, too. Enough there to keep me busy for the rest of my life. I won’t need any other sources for what I want to know, by the looks of it.
    I’ve got big problems with duplicate files, I know. But I also know from experience it is not a simple job fixing it. You have to be very careful, right?
    Blindly eliminating duplicates can wreck everything I think I have found in the past.
    But that may have been because my duplicate elimination software was too simplistic – perhaps not checking file length, calling files ‘duplicates’ merely because they had the same name.
    Which can destroy all your config.txt or similar in a flash.
    Whatever it was I know I created a real hell for myself in the past by trying to organise files and eliminate duplicates.
    ‘Organising’ meant collecting like files together in one place. That didn’t work. Moving them is the same as deleting them to software that expects to find them there.
    Part of the problem is hinted at here, where you caution to make backups before doing anything. Well, that’s just it. We make backups and then get interrupted before the whole task is complete (which can take a long time when using only ‘spare’ time, with maybe as many as eight disks to work on, something greater than 10 terabytes), and there we are, mid-process, with ‘extra’ duplicates created just for the sake of the exercise!
    I very much would like a workable technique for this.
    Currently I have lapsed into the ‘redundancy’ mode: i.e. I never delete anything and just keep buying hard drives.
    There’s gotta be a better way but it’s not simply running some duplicate finder/eliminator software is it?
    There has to be some kind of rigorous, careful, and foolproof procedure.
    And I can’t devise it.
    But this stuff is good. I like it very much.
    I thank you for it.

  • I had LAPS working on all our devices. However, I recently swapped all devices for new Surface Laptop 4 machines with Windows 11, and now LAPS doesn’t work at all. Any ideas? The DC is still Server 2019, and there have been no changes to policies whatsoever. The checks I have tried so far are sketched below.
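
    A sketch of those checks, assuming the legacy LAPS client with the AdmPwd.PS module; the computer name is a placeholder:

    Import-Module AdmPwd.PS
    # Does AD still return a password for one of the new devices?
    Get-AdmPwdPassword -ComputerName "SURFACE01" | Format-List
    # On the client: refresh policy, then look for AdmPwd events in the Application log
    gpupdate /force
    Get-WinEvent -LogName Application |
        Where-Object ProviderName -eq "AdmPwd" |
        Select-Object -First 5 TimeCreated, Id, Message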

  • Hi,
    In Azure Front Door Standard, if I want to implement rule sets in bulk, is it possible to automate this via PowerShell?
    If yes, can you please help me with it? Something like the sketch below is what I have in mind.
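
    A very rough sketch, assuming the Az.Cdn module; the cmdlet and parameter names should be double-checked against the installed module version, and the resource names and CSV layout are placeholders:

    Import-Module Az.Cdn
    $rg = "myResourceGroup"
    $afdProfile = "myFrontDoorProfile"
    $ruleSet = New-AzFrontDoorCdnRuleSet -ResourceGroupName $rg -ProfileName $afdProfile -Name "BulkRuleSet"
    # One rule per CSV row; other action/condition object cmdlets follow the same pattern
    Import-Csv .\rules.csv | ForEach-Object {
        $action = New-AzFrontDoorCdnRuleUrlRedirectActionObject -RedirectType Moved -DestinationProtocol Https
        New-AzFrontDoorCdnRule -ResourceGroupName $rg -ProfileName $afdProfile -RuleSetName $ruleSet.Name `
            -Name $_.RuleName -Order ([int]$_.Order) -Action $action
    }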

  • Can you please tell me how to exit from this Server Evaluation? I chose option 15 at the SConfig screen, and it takes me to PowerShell, but I can’t exit. Exit doesn’t work, and Ctrl+C doesn’t work.
