PCI DSS 4.0 point 8.3.6: Deploy the Windows Guest Configuration extension to enable Guest Configuration assignments on Windows VMs
While working on PCI DSS 4.0 compliance, I saw a VM flagged under point 8.3.6, 'Deploy the Windows Guest Configuration extension to enable Guest Configuration assignments on Windows VMs'. I identified that the Guest Configuration extension was not installed on the VM. To fix this, open Cloud Shell from the top-right corner of the screen, next to the search bar, as shown in the screenshot below.

Make sure Bash is selected, then run the command below in the shell, replacing the VM name and resource group name with your own.
az vm extension set --publisher Microsoft.GuestConfiguration --name ConfigurationforWindows --extension-instance-name AzurePolicyforWindows --resource-group '<ResourceGroupName>' --vm-name '<VMName>' --enable-auto-upgrade true


Once the command completes, allow up to 24 hours; the point will then be closed in the PCI DSS 4.0 dashboard.
Thanks and Regards,
Inderjeet Singh Jaggi
Unable to back up a virtual machine in Azure after disk encryption
I recently encountered an issue that I believe is worth sharing. We encrypted our disks as required for PCI DSS 4.0 compliance. However, after the encryption process completed, we started receiving the error code 'UserErrorEncryptedVmNotSupportedWithDiskEx', which indicates that disk exclusion is not supported for encrypted virtual machines.


When we reviewed the backend logs, we understood that selective disk backup had been configured for the encrypted disk. This is not supported with the standard backup policy, although it can be configured with the enhanced policy. I had only an OS disk, so the article below wasn't much help in my case.
https://learn.microsoft.com/en-us/azure/backup/selective-disk-backup-restore#limitations

On the backup screen, I could see the OS disk listed under included disks.

But from the log I could see that backup had been enabled without the selective disk backup option, and it failed with the error UserErrorKeyVaultPermissionsNotConfigured.

After a lot of research, I found that running the command below fixes the issue by resetting the disk exclusion settings.
az backup protection update-for-vm --resource-group {resourcegroup} --vault-name {vaultname} -c {vmname} -i {vmname} --disk-list-setting resetexclusionsettings

After running the above command, I was able to start the backup and could see the completed status for the VMs.


Thanks and regards,
Inderjeet Singh Jaggi
Cloud Architect – Golden Five Consulting
[email protected]
Download a SharePoint Online Page
I encountered an issue where I needed to download a page and upload it to a new site collection. Although this issue may not be common, if you are experiencing it, this article could be of great assistance.

Unfortunately, SharePoint Online doesn't allow us to copy pages from one site collection to another, although we can copy list items and documents. To copy a file or item from one document library to another, select the three dots next to the item, select 'Copy to', and choose the destination site/library. Since we are unable to do this with pages, I have found a workaround, which I share below.


To download a SharePoint page, you can use the following URL format and the file will be downloaded: https://tenant.sharepoint.com/_layouts/download.aspx?SourceUrl=https://tenant.sharepoint.com/SitePages/test.aspx
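The URL format above is just the page's own URL wrapped in the tenant's download.aspx handler. A minimal sketch of composing it (the tenant and page names here are placeholders, not real sites):

```shell
# Build the download.aspx URL for a SharePoint page.
# "tenant" and "test.aspx" are placeholder values - substitute your own.
site="https://tenant.sharepoint.com"
page="/SitePages/test.aspx"
url="${site}/_layouts/download.aspx?SourceUrl=${site}${page}"
echo "$url"
# -> https://tenant.sharepoint.com/_layouts/download.aspx?SourceUrl=https://tenant.sharepoint.com/SitePages/test.aspx
```

Opening the resulting URL in a browser session that is signed in to the tenant triggers the download.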

Now you can upload this file to the destination using SharePoint classic view.

Article by: Inderjeet Singh
Cloud Architect
Golden Five Consulting
How to get rid of the error 'The email address is already being used. Please choose another email address'
Today, I mistakenly created a user account instead of a shared mailbox, so I wanted to delete it and recreate it as a shared mailbox, but I kept getting the error 'The email address is already being used. Please choose another email address.'

After some research, I found that if we want to use the same email address as an alias on another account, we have to wait 30 days from the date it was deleted. However, there is a workaround: we can hard-delete the mailbox using the command below.
Remove-Mailbox -Identity <Identity> -PermanentlyDelete
So, I opened PowerShell in Cloud Shell from the top-right navigation, as shown below.

If you are running it for the first time, it will prompt you to create a storage account to persist your Cloud Shell files, as shown below.



If you don't have at least Contributor access on the subscription, you will get the error '{"error":{"code":"AuthorizationFailed","message":"The client 'UserID' with object id 'xxx' does not have authorization to perform action 'Microsoft.Resources/subscriptions/resourcegroups/read' over scope '/subscriptions/subscriptionnumber/resourcegroups/cloud-shell-storage-eastus' or the scope is invalid. If access was recently granted, please refresh your credentials."}}'

Note: Microsoft is updating some policies, so a few commands, including this one, may not work for all tenants. If that happens, use the alternate method mentioned below.
Or
You can also run PowerShell from your local machine, as shown below:


Now you can use the command below to remove the mailbox, after which you should be able to create the shared mailbox without any error.
Remove-Mailbox -Identity <Identity> -PermanentlyDelete
Article from:
Inderjeet Singh Jaggi
Publish a user/shared mailbox calendar and add it to a SharePoint page, or share it with someone outside the organization
1. Make sure you have full access permission on the mailbox, granted from the Admin center.
2. Sign-in to Outlook web app (https://outlook.office365.com/owa) with your Office 365 account or open mailbox.
3. If it is a shared mailbox, click your name in the top-right corner of the screen, select 'Open another mailbox', and type the shared mailbox details.
4. Navigate to the calendar and right-click the calendar you want to share. Select Share > Publish this calendar > Share this calendar as shown below:


5. Now select 'Can view all details' (or another option) and select Publish.

Now open the SharePoint page and add a Script Editor web part, then add the code below to it. I have hidden the GUID in the screenshot above.
<iframe src="https://outlook.office365.com/owa/calendar/SOMEGUID/calendar.html" height="400px" width="400px"></iframe>


Article from:
Inderjeet Singh Jaggi
How to access PowerBI anonymous site using SharePoint 2013 (Authenticated) without exposing the anonymous URL

Publish the PowerBI report for anonymous users, then browse the PowerBI dashboard to make sure it is working fine.

Configure the PerformancePoint Service Application. To do so, follow the article below.
Configure PerformancePoint Services – SharePoint Server | Microsoft Learn

Now create a new site collection. To do so, click Application Management > Create site collections > select the appropriate web application > provide details such as title, URL, and administrator > make sure the site collection template is 'Business Intelligence Center' under the Enterprise tab > click OK.


Once the site collection is ready, browse the site > select Site contents > select Dashboards.


Select PerformancePoint at the top and then select 'Dashboard Designer'. This downloads and opens the Dashboard Designer application (if IE doesn't work, try the Chrome browser).


Select ‘PerformancePoint Content’ > Other Reports > Web Page > Click OK

Enter the PowerBI URL in the URL field. You can also rename the report from the left navigation to something easier to recognize.

Right click ‘PerformancePoint Content’ > New > Dashboard

Select the 1 Zone template and click OK.

Now rename the dashboard from the left navigation > name the page something easy to understand > from the right-hand navigation expand Reports > PerformancePoint Content > BIdemo > click Add at the bottom of the screen.

Now right click on Dashboard and select ‘Deploy to SharePoint’

Click OK

Once published, you will be routed to the page with the newly published dashboard, with PowerBI running in the background.

When we check the page source here, we cannot find the path of the anonymous site published in PowerBI.
Access to this SharePoint page can be managed using SharePoint permissions.
Article from:
Inderjeet Singh Jaggi
Cloud Architect – Golden Five Consulting
Delete items from Preservation hold library in SharePoint online
Hi All,
I struggled for a few days to remove/delete content from the Preservation Hold library in SharePoint Online. If you search online or ask Bing Chat (ChatGPT), you will get the response shown at the end of this page, but on its own it doesn't help. I have still shared those details, as they must be followed to make sure no hold remains in the Admin center.
In addition to releasing all holds from the Admin center, you need to follow the steps below, which are not documented in any article I could find.

2. Then on this window search for ‘Unable to delete SharePoint site’




$SiteURL = "https://customersite.sharepoint.com/sites/site"
$ListName = "Preservation Hold Library"
#Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -Interactive
#Move all files in the library to the recycle bin
Get-PnPList -Identity $ListName | Get-PnPListItem -PageSize 100 -ScriptBlock {
    Param($items) Invoke-PnPQuery
} | ForEach-Object {
    $_.Recycle() | Out-Null
}

Get-PnPListItem : This library contains items that have been modified or deleted but must remain
available due to eDiscovery holds. Items cannot be modified or removed.
At line:8 char:35
To remove the hold on the Preservation Hold library in SharePoint Online, you can follow these steps:
- Go to Microsoft Purview Compliance Portal as a global admin.
- If you have any Retention Policies – Exclude the site from the retention policy.
- If you have eDiscovery cases – Close all eDiscovery cases.
- If you have Data loss prevention policies – Disable them.


To exclude a site from the retention policy in SharePoint Online, you can follow these steps:
- Go to Microsoft Purview Compliance Portal as a global admin
- Select “Exclude Sites”
- Enter the URL of the site that you want to exclude, and then select the plus (+) button
- Select the check box for the site. You can add other sites if you want
- After you enter all the sites that you want to exclude, select “Exclude” at the bottom of the window to confirm the changes
- Select “Save”
SharePoint PowerShell to identify all OneDrive sites and migrate content to them
I wanted to migrate some of my users' content to their respective OneDrive sites. To do so, I used the PowerShell below to identify the URLs of all users who already have a OneDrive site, and then used the file migration wizard to import their content into the respective OneDrive sites.
$TenantUrl = Read-Host "Enter the SharePoint admin center URL (e.g. https://tenantname-admin.sharepoint.com)"
$LogFile = [Environment]::GetFolderPath("Desktop") + "\OneDriveSites.log"
Connect-SPOService -Url $TenantUrl
Get-SPOSite -IncludePersonalSite $true -Limit all -Filter "Url -like '-my.sharepoint.com/personal/'" | Select -ExpandProperty Url | Out-File $LogFile -Force
Write-Host "Done! File saved as $($LogFile)."
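The heart of the script above is the URL filter: personal OneDrive sites always live under the tenant's `-my.sharepoint.com/personal/` path. The same filtering can be sketched locally (the site URLs below are made up for illustration):

```shell
# Sample site list - URLs are hypothetical.
cat > sites.txt <<'EOF'
https://contoso.sharepoint.com/sites/TeamA
https://contoso-my.sharepoint.com/personal/user1_contoso_com
https://contoso-my.sharepoint.com/personal/user2_contoso_com
EOF

# Keep only personal (OneDrive) site URLs, mirroring the -Filter passed to Get-SPOSite.
grep -- '-my.sharepoint.com/personal/' sites.txt > OneDriveSites.log
cat OneDriveSites.log
```

Only the two `-my.sharepoint.com/personal/` URLs end up in the log file; the team site is dropped.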

Note: the migration agent should be installed on a system that has access, through a network drive, to the files we want to migrate.








PowerShell to find duplicate files on SharePoint online site
You can use the PowerShell below to find all duplicate files (across all document libraries) on a SharePoint Online site.
Make sure you update the parameters SiteURL and ReportOutput. This may take time depending on the number of items in the SharePoint site; I kept it running for 3 days and it worked like a charm. I have seen other PowerShell scripts that don't work if you have MFA enabled, whereas this script works regardless of whether MFA is enabled or disabled.
#Before running the script below, kindly follow these steps:
#1. Open PowerShell ISE on your system and run it as administrator
#2. Install the new PnP PowerShell module using the command below:
Install-Module PnP.PowerShell
#3. Register a new Azure AD Application and Grant Access to the tenant
Register-PnPManagementShellAccess
#Then paste and run below pnp script:
#Parameters
$SiteURL = "https://tenantname.sharepoint.com/sites/Sitename"
$Pagesize = 2000
$ReportOutput = "C:\Temp\Sitename.csv"
#Connect to SharePoint Online site
Connect-PnPOnline $SiteURL -Interactive
#Array to store results
$DataCollection = @()
#Get all Document libraries
$DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -Notin("Site Pages","Style Library", "Preservation Hold Library")}
#Iterate through each document library
ForEach($Library in $DocumentLibraries)
{
    #Get all documents from the library
    $global:counter = 0
    $Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock {
            Param($items) $global:counter += $items.Count
            Write-Progress -PercentComplete ($global:counter / $Library.ItemCount * 100) -Activity "Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:counter of $($Library.ItemCount)"
        } | Where-Object {$_.FileSystemObjectType -eq "File"}

    $ItemCounter = 0
    #Iterate through each document
    Foreach($Document in $Documents)
    {
        #Get the File from Item
        $File = Get-PnPProperty -ClientObject $Document -Property File
        #Get the File Hash
        $Bytes = $File.OpenBinaryStream()
        Invoke-PnPQuery
        $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
        $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))
        #Collect data
        $Data = New-Object PSObject
        $Data | Add-Member -MemberType NoteProperty -Name "FileName" -Value $File.Name
        $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -Value $HashCode
        $Data | Add-Member -MemberType NoteProperty -Name "URL" -Value $File.ServerRelativeUrl
        $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -Value $File.Length
        $DataCollection += $Data
        $ItemCounter++
        Write-Progress -PercentComplete ($ItemCounter / $Library.ItemCount * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
    }
}
#Get Duplicate Files by Grouping Hash code
$Duplicates = $DataCollection | Group-Object -Property HashCode | Where {$_.Count -gt 1} | Select -ExpandProperty Group
Write-host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-table -AutoSize
#Export the duplicates results to CSV
$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation
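The script's duplicate detection rests on one idea: two files whose contents produce the same MD5 hash are duplicates, regardless of their names or locations. The same grouping can be sketched locally with md5sum (the demo files are invented for illustration):

```shell
# Create three demo files: two with identical content, one different.
mkdir -p demo
printf 'same content' > demo/a.txt
printf 'same content' > demo/b.txt
printf 'other content' > demo/c.txt

# Hash every file, then print only the hashes that occur more than once -
# each printed hash marks a group of duplicate files.
md5sum demo/*.txt | awk '{print $1}' | sort | uniq -d
```

Here exactly one hash is printed, the one shared by a.txt and b.txt; mapping hashes back to file paths gives the duplicate report, which is what the Group-Object step above does.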
What is a digital twin? – Part 2
Now let's understand when you should use a digital twin:
- When you have a large number of digital twins to manage
- When the relationships between entities are complex
- When you can reuse existing models instead of creating new ones
- When you can benefit from a flexible schema
A few things have popularized digital twins in recent years:
- Cheap IoT devices
- An increased number of IoT devices in use
- Affordable internet/connectivity
- The boom in cloud platforms
If you want to implement a digital twin, make sure your project has the characteristics below:
- The ability to establish some sort of connection between devices
- Queries that can, and should, be executed based on device parameters
- Secure integration
Let's try to build a digital twin of a building to understand the temperature of each room.
In Azure Digital Twins, an interface can contain 4 types of elements:
- Component – temperature and humidity
- Relationship – a building has floors, and each floor has rooms
- Telemetry – input data
- Property – the state of an entity
You can download the JSON models from the Microsoft links below:
Building : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Building.json
Floor : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Floor.json
Room : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Room.json


Here the JSON needs to start with a dtmi identifier, with the display name 'Building'; the only property of this interface is Date, and it has a relationship with Floor. Most important is the display name.

Here the JSON needs to start with a dtmi identifier, with the display name 'Floor'; the properties of this interface are OwnershipUser and OwnershipDepartment, and it has a relationship with Room.

Here the JSON needs to start with a dtmi identifier, with the display name 'Room'; the properties of this interface are Temperature and Humidity, and it has no further relationships.
Make sure you use the DTDL validator while creating these JSON files:
https://learn.microsoft.com/en-us/samples/azure-samples/dtdl-validator/dtdl-validator/
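To make the structure concrete, here is a sketch of what a minimal DTDL v2 interface for the Room model might look like. The `@id` value and property schemas are assumptions for illustration, not the contents of the downloadable sample files; validate any real model with the DTDL validator linked above.

```shell
# Write a hypothetical minimal DTDL v2 interface for a Room
# (the dtmi ID and "double" schemas are assumed, not from the sample files).
cat > room.json <<'EOF'
{
  "@id": "dtmi:example:Room;1",
  "@type": "Interface",
  "@context": "dtmi:dtdl:context;2",
  "displayName": "Room",
  "contents": [
    { "@type": "Property", "name": "Temperature", "schema": "double" },
    { "@type": "Property", "name": "Humidity", "schema": "double" }
  ]
}
EOF

# Sanity-check that the file is at least well-formed JSON
# (this does not replace DTDL validation).
python3 -m json.tool room.json > /dev/null && echo "well-formed JSON"
```

Note the pieces called out in the text: the `@id` starting with dtmi, the display name, and the two properties; a relationship to another interface would be an extra entry in `contents` with `"@type": "Relationship"`.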
Using this JSON there are multiple ways to create our scenarios, but we will use the xlsx file below, which has sample data to start with. You can download the Excel file for the building scenario from the path below:
https://github.com/Azure-Samples/digital-twins-explorer/raw/main/client/examples/buildingScenario.xlsx



Microsoft understands it takes a lot of effort to create these models and relationships, so to ease the process it has published a few models, such as space, event, document, and building, that you can reuse as a starting point:
https://github.com/WillowInc/opendigitaltwins-building
Here is the basic structure of Azure Digital Twins:

Input: input can come from workflows, IoT Hub, or any REST API. You can also enter parameter data manually.
Processing: the Azure Digital Twins platform processes the input based on business logic, understands the connections, and processes the data accordingly. Digital twins of real-world sensors can be created to predict and test results based on changes in inputs.
Output: using Time Series Insights, Azure Data Explorer, and analytics, we can better visualize the data, and actionable items can be derived from predictions.
The main challenges with ADT are:
- Modelling is difficult and time-consuming
- Delays in processing data in real time
- Interoperability between different types of IoT devices and services
- Hard and expensive to integrate with existing systems
- Privacy and confidentiality