Tagged: group policy
- Mon, Apr 10 2017 at 8:57 am #106852
My team and I have been doing some research to find a tool that can report on the status of GPO deployments. We’re seeking this for one of our customers but can’t seem to find anything in the marketplace. To clarify precisely the type of reporting we’re after, I’ll share an analogy used by one of my team members and then show how it maps to Group Policy (GP).
A mother writes a letter to each of her children with instructions for completing an online survey for their upcoming family reunion. She puts the letters in envelopes and then goes to the post office to get them mailed out with delivery confirmation. She assumes that if she gets confirmation that the letters arrived, her children (since they’re such good kids) followed through with the request and completed the survey.
From a GP perspective, an IT administrator (mother) creates a GPO (envelope) and specifies the settings (instructions) that need to be delivered to Windows systems (children). The Group Policy infrastructure (post office) gets the GPO to the workstation.
What we’re looking for is something like delivery confirmation for GP. How do we know the GPO got to the systems? What time did it get there?
How have you been able to meet this requirement in your organization or for your customers?
If you’re familiar with Client Management Systems such as SCCM, we’re just looking for something similar to the reporting available in these tools to monitor the status of package deployments.
Indeed, we’ve been able to find tools in the market that can confirm whether settings were applied (if the survey got completed), but that’s not what we’re seeking.
If we can’t find something, we may be tasked to create a solution. If we were to go that route, we’d rather build something that is commercially available than something that aids only one organization. How many of you would be interested in something like this?
- Mon, Apr 10 2017 at 8:07 pm #106867
- Tue, Apr 11 2017 at 5:22 pm #108024
Thanks for commenting. The event log certainly has verbose GP information, but I don’t think it has what I need (i.e., RSoP information). I did some research and found a tool in the Windows 2003 Resource Kit called GPMonitor (https://db.tt/cV9qlbVWCt), but it has many limitations. Not to mention it won’t work with the latest releases of the Windows operating system.
- Wed, Apr 12 2017 at 5:37 am #108545 (Karim Buzdar, Moderator)
How about executing gpresult.exe? With its help, you can collect RSoP information, including the time the GPO was applied.
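For example, a couple of gpresult invocations (run from an elevated prompt on a domain-joined machine; `workstation01` and the output path are placeholders):

```powershell
# Text summary of applied GPOs for the computer, including the last refresh time
gpresult /r /scope:computer

# Full RSoP report as XML, suitable for archiving or later parsing
gpresult /x C:\Reports\rsop.xml /f

# Query a remote machine (requires admin rights on the target)
gpresult /s workstation01 /r /scope:computer
```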
- Wed, Apr 12 2017 at 9:54 am #109215
- Wed, Apr 12 2017 at 7:13 pm #109560
I wonder what exactly you are trying to accomplish. What would you do with all this data? Who has the time to analyze it? Usually you only need RSoP information for troubleshooting.
I recommend collecting this data from a small number of machines first and then see if you can do with the data what you want. You can also use the Get-GPResultantSetOfPolicy cmdlet (belongs to RSAT) to remotely collect RSoP information. Storing the XML files in a central database is probably not a big deal. Your real problem is to analyze all this data from 100k machines.
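A minimal sketch of that small-scale collection, assuming the GroupPolicy module from RSAT is installed and that the computer names and share path are placeholders you would substitute:

```powershell
# Requires the GroupPolicy module shipped with RSAT
Import-Module GroupPolicy

# Start with a handful of machines before scaling up (names are illustrative)
$computers = 'ws001', 'ws002', 'ws003'

foreach ($c in $computers) {
    try {
        # Writes each machine's RSoP report as XML to a central share
        Get-GPResultantSetOfPolicy -Computer $c -ReportType Xml `
            -Path "\\server\rsop$\$c.xml"
    } catch {
        Write-Warning "RSoP collection failed for ${c}: $_"
    }
}
```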
- Thu, Apr 13 2017 at 3:20 am #111529
Let me paint a picture that will hopefully make things a little clearer. Let’s assume I have an OU with 1,000 workstations in it. I create a GPO, then link it to the OU and enable it. The settings in the GPO are now being deployed to all 1,000 systems. How would you go about knowing the following?
- How many systems got the GPO?
- For those that got it, how many succeeded? … how many failed?
These are questions I need answered anytime I have a GPO being pushed to systems. And based on my research, there is nothing available that can get me these answers.
- Thu, Apr 13 2017 at 7:41 pm #112705
But this is exactly the information you get through event log analysis.
The only thing you don’t get this way is RSoP information. So the question you have to ask yourself is if you really want to collect RSoP information from 100k machines and then what exactly you intend to do with this massive database?
The point is that you don’t need RSoP information if you want to determine how many systems received a GPO and if errors occurred. I guess that is the reason why you can’t find such a tool.
By the way, one way to centrally retrieve RSoP information is triggering a script at user logon that writes this data to the event log and then collect the data with an event log tool. I would use Task Scheduler for this purpose because you need admin rights to read computer RSoP.
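A rough sketch of that idea, intended to run as SYSTEM via a scheduled task. The `GPAudit` source, event ID 9001, and file path are arbitrary choices for this example, and the element names follow the XML schema gpresult emits on recent Windows versions:

```powershell
# Register a custom event source once (requires admin rights)
$source = 'GPAudit'
if (-not [System.Diagnostics.EventLog]::SourceExists($source)) {
    New-EventLog -LogName Application -Source $source
}

# Dump RSoP to XML, then summarize the applied computer GPOs
$xmlPath = "$env:TEMP\rsop.xml"
gpresult /x $xmlPath /f | Out-Null
[xml]$rsop = Get-Content $xmlPath

$names = $rsop.Rsop.ComputerResults.GPO.Name -join ', '

# Write the summary where a central event log collector can pick it up
Write-EventLog -LogName Application -Source $source -EventId 9001 `
    -Message "Applied computer GPOs: $names"
```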
- Fri, Apr 14 2017 at 7:31 am #112740
Thanks again for commenting. And you’re right. I probably could get this information from the event logs.
I’m trying to stay away from event logs for two reasons:
- Any admin can purge the data (many users in my environment have admin rights).
- The max log size can be set (again, by an admin) to a value so low that important data would be overwritten.
The logon script approach is a reasonable option. But what if the user never logs off?
I may be wrong here, but I need something more reliable and timely.
That’s why the GPMonitor solution was so appealing to me. It simply runs after every GP refresh and dumps the RSoP data to a central location.
If only it dumped the info to a database (for aggregation, data mining, and analysis purposes) rather than to a file share (GPMonitor dumps a .cab file), I would definitely use it.
- Fri, Apr 14 2017 at 5:44 pm #112756
Users who have admin rights can interfere with any kind of logging you are trying to implement. Any third-party solution similar to GPMonitor would be just as unreliable as the event log method. If you can’t trust your event logs and your admins, you already have a serious security issue.
If users never log off, you can use one of the countless other triggers of Task Scheduler. The easiest way would be to schedule a task every 4 hours or so. This would ensure that a Group Policy refresh has happened. You can also manually trigger a refresh and then retrieve RSoP afterward. If you work with SCCM, you can trigger similar tasks remotely.
As to the database: Get-GPResultantSetOfPolicy gives you XML data, and a skilled scripter should be able to import it into a database.
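As a sketch of that import, assuming the SqlServer module, a reachable instance, and a pre-created table — the server, database, table, and column names here are all illustrative:

```powershell
# Assumes: SqlServer module installed, and a table like
#   CREATE TABLE RsopReports (ComputerName nvarchar(64),
#       CollectedAt datetime2, ReportXml xml)
Import-Module SqlServer

$files = Get-ChildItem '\\server\rsop$' -Filter '*.xml'
foreach ($f in $files) {
    # Naive quote escaping; a production version should use parameterized SQL
    $xml = (Get-Content $f.FullName -Raw) -replace "'", "''"
    $query = @"
INSERT INTO RsopReports (ComputerName, CollectedAt, ReportXml)
VALUES ('$($f.BaseName)', '$($f.LastWriteTime.ToString('s'))', '$xml')
"@
    Invoke-Sqlcmd -ServerInstance 'sqlserver01' -Database 'GPAudit' -Query $query
}
```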
However, I think the whole idea of logging RSoP information is kind of unusual. If you just want to ensure that certain crucial policies are active (maybe because you are afraid that your local admins messed with them?), you can just remotely query the machines and retrieve the corresponding registry values on a regular basis. Of course, a savvy local admin can interfere with this procedure too.
- Fri, Apr 14 2017 at 11:16 am #112749 (FrankTuckerCA)
Maybe handle this as a two-part solution. First: which computers at least received the GPO? Have the GPO create a simple txt file with a GPO preference, or even better, create a file (CSV, JSON, XML) with a PowerShell script run at startup under the computer configuration. With PowerShell you could include the time and the computer name along with the name of the GPO. Store the file on each computer or on a file share. PowerShell and/or SCCM could then check and report which computers are missing the file.
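A sketch of such a startup-script “delivery receipt” — the GPO name, share path, and file names are placeholders, and the GPO name is hard-coded because a script can’t easily discover which GPO launched it:

```powershell
# One record per machine: who got the GPO, and when the script ran
$record = [pscustomobject]@{
    ComputerName = $env:COMPUTERNAME
    GpoName      = 'Workstation-Baseline'   # illustrative GPO name
    Timestamp    = (Get-Date).ToString('s')
}

# Keep a local copy, and append to a central share for reporting
$record | Export-Csv "$env:ProgramData\GpoReceipt.csv" -NoTypeInformation
$record | Export-Csv '\\server\gporeceipts$\receipts.csv' `
    -NoTypeInformation -Append
```

A reporting script can then diff the receipts against the OU’s computer list to find machines that never checked in.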
The next part seems more difficult: did the GPO fully apply? Unless another GPO takes precedence, why wouldn’t the whole GPO apply along with the file created above? A Group Policy results report, along with a report on which computers had the file created, really should cover most of what you are looking for. For additional peace of mind, you could check a few ‘core’ items the GPO applies on each computer with PowerShell and/or SCCM and create an additional file.
Once you had the basics working, you could then implement Don Jones’ “Making Historical & Trend Reports in PowerShell” and get this data into SQL and run SQL reports. I guess you could technically skip creating ‘files’ and have PowerShell enter data into SQL directly. I think it’s nice to have two ways to check: run an SQL report, and if you wanted to double-check, manually inspect the files on a group of computers. If an admin wanted to check a single computer, or even a few hundred computers they are responsible for, they could.
I’m not in enterprise IT. I assumed a network with 100k computers would at least forward some basic event logs to a centralized solution that could also be used for this.
- Tue, Apr 18 2017 at 6:49 am #115153
@franktuckerca, thanks for chiming in. The picture you’re painting with GP preferences, scripting, or SCCM is definitely feasible. Rather than going that route, though, I’d prefer to just implement GPMonitor; it would make administration easier.
But as you’ve suggested, at the end of the day I’d need to get the data into SQL. The client has enterprise logging, but it’s used mostly for security. And given the amount of data it already accumulates, it wouldn’t be in my best interest to have it capture more, because it would be difficult to mine anyway.