Deploy the client-side script
In this step, we'll deploy the PowerShell script that runs on your workstations to gather software update-related data and send it to your Log Analytics workspace.
The client-side script executes on a schedule to regularly gather the latest software update-related data from your workstations and send it to the Log Analytics workspace via the Data Collector API. The script can send both delta and full updates; my personal recommendation is to send a full update at least once a day.
The script must be run in an administrative context. Intune Proactive remediations is an ideal mechanism to deploy this script and run it on a schedule. If you don't have access to this, the local Task Scheduler is an alternative.
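If you go the Task Scheduler route, a scheduled task running the script as SYSTEM achieves the same result. A minimal sketch, assuming the script has already been copied locally; the path, task name and 4-hour interval below are just placeholders to adapt:

```powershell
# Run the reporting script every 4 hours as SYSTEM (adjust path, task name and interval to suit)
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\ProgramData\Scripts\SoftwareUpdateReporting.ps1"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 4) -RepetitionDuration (New-TimeSpan -Days 3650)
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'Software Update Reporting' -Action $action -Trigger $trigger -Principal $principal
```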
Download the script from here:
Then configure the parameters at the top of the script.
This defines the minimum frequency, in days, at which a full data set should be uploaded. The recommended value is 1.
This defines the minimum frequency, in hours, at which a new delta data set can be uploaded. The actual frequency at which your script runs is defined by the schedule in Intune; this is more of a safeguard to prevent data inadvertently being sent too frequently.
The ID of your Log Analytics workspace. You can get this from the workspace itself in the Overview blade:
The primary key of your Log Analytics workspace. You can get this from the workspace itself under Settings > Agents management:
If you regenerate your keys, remember to update them in the script.
Enter a name, such as your company name. This is used to create a directory containing the exported update info and a registry key containing script execution data.
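Putting that together, the parameter block at the top of the script might look something like the following. The variable names here are illustrative placeholders; use the actual parameter names defined in the downloaded script:

```powershell
# Illustrative values only - match these to the actual parameter names in the downloaded script
$FullInventoryFrequencyDays   = 1                    # minimum frequency in days for a full data set upload
$DeltaInventoryFrequencyHours = 4                    # minimum frequency in hours for a delta upload
$WorkspaceID = '<your workspace ID>'                 # from the workspace Overview blade
$PrimaryKey  = '<your workspace primary key>'        # from Settings > Agents management
$CompanyName = 'Contoso'                             # used for the export directory and registry key names
```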
Here we'll use Proactive remediations in Intune to deploy the script.
In the Intune portal, go to Reports > Endpoint Analytics > Proactive remediations
Click Create script package
Enter at least a name, then click Next
For the Detection script file, browse for and add the script. There is no need to add a remediation script.
Be sure to set Run script in 64-bit PowerShell to Yes.
Click Next, add a scope tag if desired
Click Next and assign the script to your target workstations
For the Schedule, set a frequency that reflects how fresh you want your data to be. For example, every 4 hours typically means data is sent about twice a day from workstations that are online during normal business hours.
Bear in mind that the frequency affects how much data is sent to the Log Analytics workspace and how much data the Kusto queries need to process. If you have a very large number of workstations, avoid sending a full data set too frequently. Find out what works for you.
Once your assignments and schedule are set, click Next and Create to finish.
The script first checks whether a delta or full 'inventory' is required, based on the parameters you have set in the script.
The script then gathers software update-related data from various places on the client, including WMI, the registry, the Windows Update agent and the Windows event log.
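To give a feel for the kind of sources involved, the snippet below shows some typical ways this data can be pulled on a client. These are illustrative queries, not the script's exact code:

```powershell
# Installed updates from WMI
$hotfixes = Get-CimInstance -ClassName Win32_QuickFixEngineering

# Update history from the Windows Update agent COM API
$searcher  = (New-Object -ComObject 'Microsoft.Update.Session').CreateUpdateSearcher()
$wuHistory = $searcher.QueryHistory(0, $searcher.GetTotalHistoryCount())

# Pending reboot flag from the registry
$rebootPending = Test-Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired'

# Recent update installation events from the Windows event log (event ID 19 = installation succeeded)
$installEvents = Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Microsoft-Windows-WindowsUpdateClient'
    Id           = 19
} -MaxEvents 20
```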
If a full inventory is required, the script converts the inventory data to JSON and caches it on the local disk at %ProgramData%\<YourCompanyName>\SoftwareUpdateReporting. If a delta is required, it calculates the delta by comparing the current inventory with the most recent full inventory and generates a delta file.
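Conceptually, a delta is just the difference between the current inventory and the cached full one. A simplified, self-contained illustration; the property names are made up for the example and are not the script's actual schema:

```powershell
# In the real script, $lastFull would be read from the cached full-inventory JSON on disk
$lastFull = @(
    [pscustomobject]@{ UpdateId = 'KB5034123'; Status = 'Installed' }
    [pscustomobject]@{ UpdateId = 'KB5034765'; Status = 'Pending'   }
)
$current = @(
    [pscustomobject]@{ UpdateId = 'KB5034123'; Status = 'Installed' }
    [pscustomobject]@{ UpdateId = 'KB5034765'; Status = 'Installed' }   # changed since the last full upload
)

# Only items that are new or have changed end up in the delta
$delta = Compare-Object -ReferenceObject $lastFull -DifferenceObject $current -Property UpdateId, Status |
    Where-Object SideIndicator -eq '=>' |
    Select-Object UpdateId, Status
$delta | ConvertTo-Json -Depth 5
```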
The file is then posted directly to the Log Analytics workspace via the Data Collector API.
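For reference, a post to the Data Collector API follows the pattern in Microsoft's documented PowerShell sample: build an HMAC-SHA256 signature from the workspace primary key and send the JSON with a SharedKey authorization header. A trimmed-down sketch; the log type name and inventory object here are placeholders, not the script's exact code:

```powershell
$WorkspaceID = '<your workspace ID>'
$PrimaryKey  = '<your workspace primary key>'
$LogType     = 'SoftwareUpdateInventory'      # becomes the custom log table name, with _CL appended

# Placeholder inventory; the real script sends the full or delta data set it just generated
$inventory = @([pscustomobject]@{ ComputerName = $env:COMPUTERNAME; UpdateId = 'KB5034123'; Status = 'Installed' })
$body      = [System.Text.Encoding]::UTF8.GetBytes(($inventory | ConvertTo-Json -Depth 5))

# Build the SharedKey signature required by the Data Collector API
$date         = [DateTime]::UtcNow.ToString('r')
$stringToSign = "POST`n$($body.Length)`napplication/json`nx-ms-date:$date`n/api/logs"
$hmac         = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key     = [Convert]::FromBase64String($PrimaryKey)
$signature    = [Convert]::ToBase64String($hmac.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($stringToSign)))

$headers = @{
    'Authorization' = "SharedKey ${WorkspaceID}:${signature}"
    'Log-Type'      = $LogType
    'x-ms-date'     = $date
}
Invoke-RestMethod -Uri "https://$WorkspaceID.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" `
    -Method Post -ContentType 'application/json' -Headers $headers -Body $body
```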
The script also logs statistics to the registry at HKLM:\Software\<YourCompanyName>\SoftwareUpdateReporting.
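As a rough idea of what those statistics look like, they could be written like this. The value names below are illustrative; the script defines its own:

```powershell
# Illustrative only - the value names are placeholders, not the script's actual statistics
$regPath = 'HKLM:\Software\Contoso\SoftwareUpdateReporting'
if (-not (Test-Path $regPath)) { New-Item -Path $regPath -Force | Out-Null }
Set-ItemProperty -Path $regPath -Name 'LastFullInventoryDate' -Value (Get-Date).ToString('s')
Set-ItemProperty -Path $regPath -Name 'LastUploadStatus'      -Value 'Success'
```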