How to access PowerBI anonymous site using SharePoint 2013 (Authenticated) without exposing the anonymous URL

Publish the PowerBI site for anonymous users and browse the PowerBI dashboard to make sure it is working fine before embedding it in the SharePoint site.

Configure the PerformancePoint Service Application. To do so, follow the article below:
Configure PerformancePoint Services – SharePoint Server | Microsoft Learn

Now create a new site collection. To do so, click Application Management > Create site collections > select the appropriate web application > provide details such as title, URL, and administrator > make sure the site collection template selected is ‘Business Intelligence Center’ under the Enterprise tab > click OK. (A PowerShell alternative follows below.)
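If you prefer PowerShell to Central Administration, the same site collection can be created from the SharePoint 2013 Management Shell. This is a minimal sketch; the URL, owner, and title below are placeholders you need to replace:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
#Create a Business Intelligence Center site collection (template BICenterSite#0)
New-SPSite -Url "http://webapp/sites/bicenter" -OwnerAlias "DOMAIN\spadmin" -Name "Business Intelligence Center" -Template "BICenterSite#0"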


Once the site collection is ready, browse the site > select Site Contents > select Dashboards


Select PerformancePoint at the top and then select ‘Dashboard Designer’. This downloads and opens the Dashboard Designer application on the screen (if IE doesn’t work, try the Chrome browser)


Select ‘PerformancePoint Content’ > Other Reports > Web Page > Click OK

Enter the PowerBI URL in the URL field. You can also rename the report from the left navigation to something easier to recognize.

Right click ‘PerformancePoint Content’ > New > Dashboard

Select the ‘1 Zone’ template and click OK

Now rename the Dashboard from the left navigation > name the page something easy to understand > from the right-hand navigation, expand Reports > PerformancePoint Content > BIdemo > click Add at the bottom of the screen

Now right-click the Dashboard and select ‘Deploy to SharePoint’

Click OK

Once published, you will be routed to the page with the newly published Dashboard, with PowerBI running in the background

Here, when we check the source of the page, we are not able to find the path of the anonymous site published in PowerBI.
Access to this SharePoint page can be managed using SharePoint permissions.
Article from:
Inderjeet Singh Jaggi
Cloud Architect – Golden Five Consulting
Delete items from Preservation hold library in SharePoint online
Hi All,
I struggled for a few days to remove/delete content from the Preservation Hold library in SharePoint Online. If you search online or ask the Bing chat engine (ChatGPT), you will get the response shared at the end of this post, but on its own it doesn’t actually help. I have still included those details, as they need to be followed to make sure we don’t have any holds left in the admin center.
You need to follow the steps below, which are not documented in any article I could find, in addition to releasing all holds from the admin center.

2. Then, in this window, search for ‘Unable to delete SharePoint site’




$SiteURL = "https://customersite.sharepoint.com/sites/site"
$ListName = "Preservation Hold Library"

#Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -Interactive

#Send every file in the library to the recycle bin
Get-PnPList -Identity $ListName | Get-PnPListItem -PageSize 100 -ScriptBlock {
    Param($items) Invoke-PnPQuery
} | ForEach-Object {
    $_.Recycle() | Out-Null
}
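Recycle() only moves the files to the site recycle bin. If you also want to purge them from there, here is a follow-up sketch using the same PnP connection (this is permanent deletion, so use it with care):

#Empty both stages of the site recycle bin
Clear-PnPRecycleBinItem -All -Force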

If any hold is still in place, the Get-PnPListItem call above fails with an error like the following:

Get-PnPListItem : This library contains items that have been modified or deleted but must remain
available due to eDiscovery holds. Items cannot be modified or removed.
At line:8 char:35
To remove the hold on the Preservation Hold library in SharePoint Online, you can follow these steps:
- Go to the Microsoft Purview compliance portal as a global admin.
- If you have any retention policies – exclude the site from the retention policy.
- If you have eDiscovery cases – close all eDiscovery cases (see the sketch after this list).
- If you have data loss prevention policies – disable them.
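For the eDiscovery step, the cases can also be closed in bulk from Security & Compliance PowerShell. This is a minimal sketch, not an official procedure; Connect-IPPSSession comes from the ExchangeOnlineManagement module.

#Connect to Security & Compliance PowerShell
Connect-IPPSSession
#Close every eDiscovery (compliance) case that is still active
Get-ComplianceCase | Where-Object { $_.Status -eq "Active" } | ForEach-Object {
    Set-ComplianceCase -Identity $_.Identity -Close
}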


To exclude a site from a retention policy in SharePoint Online, you can follow these steps (a PowerShell sketch follows the list):
- Go to Microsoft Purview Compliance Portal as a global admin
- Select “Exclude Sites”
- Enter the URL of the site that you want to exclude, and then select the plus (+) button
- Select the check box for the site. You can add other sites if you want
- After you enter all the sites that you want to exclude, select “Exclude” at the bottom of the window to confirm the changes
- Select “Save”
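The same exclusion can also be scripted. A minimal sketch using Security & Compliance PowerShell; the policy name and site URL below are placeholders you need to replace:

Connect-IPPSSession
#Add the site as an exception to an existing retention policy
Set-RetentionCompliancePolicy -Identity "Retention Policy" -AddSharePointLocationException "https://customersite.sharepoint.com/sites/site"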
SharePoint PowerShell to identify all OneDrive sites and migrate content to SharePoint sites
I wanted to migrate some of my users’ content to their respective OneDrive sites. To do so, I used the PowerShell below to identify the URLs of all users who already have a OneDrive site, and then used the File Migration wizard to import their content into the respective OneDrive sites.
$TenantUrl = Read-Host "Enter the tenant admin URL (e.g. https://tenantname-admin.sharepoint.com)"
$LogFile = [Environment]::GetFolderPath("Desktop") + "\OneDriveSites.log"
Connect-SPOService -Url $TenantUrl
#Write every personal (OneDrive) site URL to the log file
Get-SPOSite -IncludePersonalSite $true -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'" | Select-Object -ExpandProperty Url | Out-File $LogFile -Force
Write-Host "Done! File saved as $($LogFile)."

Note: The migration agent should be installed on a system that has access, through a network drive, to the files we want to migrate.








PowerShell to find duplicate files on SharePoint online site
You can use the PowerShell below to find all duplicate files (regardless of document library) on a SharePoint Online site.
Make sure you update the parameters such as SiteURL and ReportOutput. This may take time depending on the number of items in the SharePoint site; I kept it running for 3 days and it worked like a charm. I have seen other PowerShell scripts that don’t work if you have MFA enabled, whereas this script works regardless of whether MFA is enabled or disabled.
#Before running the script below, kindly follow these steps:
#1. Open PowerShell ISE on your system and run it as administrator
#2. Install the new PnP PowerShell module using the command below:
Install-Module PnP.PowerShell
#3. Register a new Azure AD Application and Grant Access to the tenant
Register-PnPManagementShellAccess
#Then paste and run below pnp script:
#Parameters
$SiteURL = "https://tenantname.sharepoint.com/sites/Sitename"
$Pagesize = 2000
$ReportOutput = "C:\Temp\Sitename.csv"

#Connect to SharePoint Online site
Connect-PnPOnline $SiteURL -Interactive

#Array to store results
$DataCollection = @()

#Get all visible, non-empty document libraries, excluding system libraries
$DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -notin ("Site Pages","Style Library","Preservation Hold Library")}

#Iterate through each document library
ForEach($Library in $DocumentLibraries)
{
    #Get all documents from the library
    $global:counter = 0
    $Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock {
            Param($items)
            $global:counter += $items.Count
            Write-Progress -PercentComplete ($global:Counter / $Library.ItemCount * 100) -Activity "Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:Counter of $($Library.ItemCount)"
        } | Where-Object {$_.FileSystemObjectType -eq "File"}

    $ItemCounter = 0
    #Iterate through each document
    ForEach($Document in $Documents)
    {
        #Get the file from the item
        $File = Get-PnPProperty -ClientObject $Document -Property File

        #Compute the MD5 hash of the file contents
        $Bytes = $File.OpenBinaryStream()
        Invoke-PnPQuery
        $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
        $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))

        #Collect data
        $Data = New-Object PSObject
        $Data | Add-Member -MemberType NoteProperty -Name "FileName" -Value $File.Name
        $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -Value $HashCode
        $Data | Add-Member -MemberType NoteProperty -Name "URL" -Value $File.ServerRelativeUrl
        $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -Value $File.Length
        $DataCollection += $Data

        $ItemCounter++
        Write-Progress -PercentComplete ($ItemCounter / $Library.ItemCount * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
    }
}

#Get duplicate files by grouping on hash code
$Duplicates = $DataCollection | Group-Object -Property HashCode | Where-Object {$_.Count -gt 1} | Select-Object -ExpandProperty Group
Write-Host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-Table -AutoSize

#Export the duplicate results to CSV
$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation
What is digital twin? – Part 2
Now let’s understand when you should use a digital twin:
- You should use it when you have a large number of digital twins to manage
- When there are complex relationships between entities
- It is best if you can reuse existing models instead of creating new ones
- You benefit if it has a flexible schema
A couple of things have popularized digital twins in recent years:
- Cheap IoT devices
- An increased number of IoT devices in use
- Affordable internet/connectivity
- The boost in cloud platforms
If you want to implement a digital twin, you need to make sure your project has the below characteristics:
- The ability to establish some sort of connection between devices
- Queries that can (and should) be executed based on device parameters
- Secure integration
Let’s try to build a digital twin of a building to understand the temperature of each room.
In Azure Digital Twins, we have 4 types of interface members or classes:
- Component – temperature and humidity
- Relationship – a building has floors, and each floor has rooms
- Telemetry – input data
- Properties – the state of an entity
You can download the sample JSON files from the Microsoft links below:
Building : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Building.json
Floor : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Floor.json
Room : https://raw.githubusercontent.com/Azure-Samples/digital-twins-explorer/main/client/examples/Room.json


Here the JSON needs to start with a dtmi identifier, with displayName ‘Building’; the only property of this interface is Date, and it has a relationship with Floor. The displayName is the most important part.

Here the JSON needs to start with a dtmi identifier, with displayName ‘Floor’; the properties of this interface are OwnershipUser and OwnershipDepartment, and it has a relationship with Room.

Here the JSON needs to start with a dtmi identifier, with displayName ‘Room’; the properties of this interface are Temperature and Humidity, and it has no further relationships.
Make sure you use the DTDL validator while creating these JSON files:
https://learn.microsoft.com/en-us/samples/azure-samples/dtdl-validator/dtdl-validator/
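For illustration, here is a minimal sketch of what such an interface can look like in DTDL v2. The model id and property names are simplified assumptions of mine; the Room.json linked above remains the authoritative sample.

{
  "@id": "dtmi:example:Room;1",
  "@type": "Interface",
  "@context": "dtmi:dtdl:context;2",
  "displayName": "Room",
  "contents": [
    { "@type": "Property", "name": "Temperature", "schema": "double" },
    { "@type": "Property", "name": "Humidity", "schema": "double" }
  ]
}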
Using this JSON we have multiple ways to build our scenario, but we will use the xlsx file below, which has predefined data to start with. You can download the Excel file for the building scenario from the path below:
https://github.com/Azure-Samples/digital-twins-explorer/raw/main/client/examples/buildingScenario.xlsx



Microsoft understands it takes a lot of effort to create these models and relationships, so to ease the process Microsoft has published a few models, such as space, event, document, and building, that you can reuse as your starting point:
https://github.com/WillowInc/opendigitaltwins-building
Here is the basic structure of Azure Digital Twins:

Input: Input can come from workflows, IoT Hub, or any REST API. You can also enter parameter data manually.
Processing: The Azure Digital Twins platform processes the input based on business logic, understands connections, and processes data accordingly. Digital twins of real-world sensors can be created to predict and test results based on changes in inputs.
Output: Using Time Series Insights, Azure Data Explorer, and analytics, we can better visualize the data, and actionable items can be derived from predictions.
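Once the twin graph is populated, the SQL-like Azure Digital Twins query language drives the kind of insight described above. A sketch, assuming the simplified Room model id from my earlier example; you can run it from the Query Explorer pane in Azure Digital Twins Explorer:

SELECT room
FROM DIGITALTWINS room
WHERE IS_OF_MODEL(room, 'dtmi:example:Room;1')
AND room.Temperature > 25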
The main challenges with ADT are:
- Modelling is difficult and time consuming
- Delays in processing data in real time
- Interoperability between different types of IoT devices and services
- Hard and expensive to retrofit onto existing systems
- Privacy and confidentiality
ChatGPT is back with a new avatar, try the new Bing
If you are looking for a new way to search the web, you should check out the new Bing. The new Bing is more than just a search engine. It’s a chat partner that can answer your questions, provide complete information and even inspire your creativity.
The new Bing uses ChatGPT AI technology to give you better results for simple things like sports scores, stock prices and weather. But it also shows you more detailed answers if you want them. You can chat with Bing in natural language, do complex searches, follow up and make refinements. You’ll be surprised by how well Bing understands you and responds to your needs.
But that’s not all. The new Bing also helps you unleash your creative potential with features like poems, stories, code, essays, songs and celebrity parodies. You can ask Bing to create any of these for you based on your keywords or preferences. You can also edit and share them with others.
The new Bing also works seamlessly with Microsoft Edge, your copilot for the web. Edge gives you access to exclusive features like Collections, Immersive Reader and Vertical Tabs that help you organize your browsing experience. You can also sync your bookmarks, passwords and settings across devices with Edge.
Don’t miss out on the new Bing. Go to bing.com/new and start exploring the possibilities. The new Bing is here to help you find what you need and have fun along the way.
What is digital twin? – Part 1
What is digital twin?
A virtual representation of a real-world item or process.
A digital twin platform enables us to collect this data, process it, gain insights, and then take actions based on the data; more specifically, to take proactive measures instead of reactive ones. Generally it is used for items, but we can also use it for processes. I will explain how we can use a digital twin for processes.
What are the types of twins?
- Digital Model/Simulator: Here we manually feed data into an existing model to see its results. If you have seen the movie ‘The Day After Tomorrow’, the scientist inputs data from various locations into an existing model and can predict what the world will go through in the next few days.
- Digital Shadow: Here real-world devices/sensors send IoT data to a virtual platform. Let’s assume we have a physical car, and a wheel in the car has a sensor which is sending data to a cloud-based digital twin of the wheel (a virtual representation of the car). This helps us understand when the car’s tires need to be replaced or repaired. A real-life example is Michelin Fleet Solutions.
- Digital Twin: Here we have two-way data sync from the sensor to the digital twin, and the physical system may receive some action based on the input provided. An example is a multistory warehouse which has sensors on all of its most critical components: water supply, electrical circuits, elevators, CO2 levels, AC, etc. Based on the sensors’ feedback, the temperature in one section of the warehouse (cold storage) is kept different from another section (meeting room).
Typical Digital Twin functions
- Visualize current state of item or process
- Trigger alerts under certain conditions
- Review historic data
- Simulate scenarios
A few benefits:
- Helps save money by avoiding going live/to production with major defects, or avoiding downtime altogether.
- Lets us learn from past scenarios.
- Helps create a safer work environment.
- Increases the quality, efficiency, and life of products and organizations.
How are digital twins used in the real world?
As I have already shared, Michelin Fleet Solutions now sells kilometers instead of tyres to make a profit, and a sensor in each car wheel sends data to a cloud-based digital twin of the wheel (a virtual representation of the car). This helps understand when the car tyres need to be replaced or repaired. They also give customers feedback on which routes to take and what times to travel to avoid reducing the lifespan of the tyres.
Another example is a supermarket that monitors all its customers: which section they visit most, which section is most visible for ads, how many people buy which kinds of items at the checkout counters, etc. This helps the store place new products and high-margin products in areas with high foot traffic, and also plan a longer exit path, because psychologically, the more time a person spends in the store, the more they purchase.
Origins of digital twin
John Vickers of NASA, a principal technologist, coined the term ‘digital twin’ back in 2010.
Michael Grieves then worked with Vickers to create digital twins of a few real-life items. The concept got a boost after IoT growth accelerated.
One of the best examples is Digital Twin Victoria (Australia).
Permission or URL issue when you use Azure Digital Twin explorer for the 1st time



Error: ‘Double check URL and directory’ or ‘You don’t have permission on the Digital Twin URL’
RestError: Sub-domain not found.
at new t (https://explorer.digitaltwins.azure.net/static/js/465.196a83e5.chunk.js:2:21103219)
at https://explorer.digitaltwins.azure.net/static/js/465.196a83e5.chunk.js:2:21108505
at https://explorer.digitaltwins.azure.net/static/js/465.196a83e5.chunk.js:2:21109190
To resolve these errors, follow the steps below (a PowerShell alternative follows the list):
- In the Azure portal, navigate to the Digital Twins resource
- Make sure you use the URL shown on your Digital Twins Overview screen next to Host name, as shown above, and make sure it starts with https://
- Also give appropriate permission to yourself by clicking on the Access Control (IAM) menu
- Select Add role assignment
- Select Azure Digital Twins Data Owner in the Role dropdown listbox
- For the Assign access to setting, select User, group, or service principal
- Now select the user that you are currently using in the Azure portal, and that you’ll use on your local machine to authenticate with the sample application
- Click Save to assign the Role
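If you prefer scripting the role assignment, here is a minimal sketch with the Az PowerShell module; the user, subscription, resource group, and instance names below are placeholders:

Connect-AzAccount
#Assign the Azure Digital Twins Data Owner role on the instance
New-AzRoleAssignment -SignInName "user@contoso.com" -RoleDefinitionName "Azure Digital Twins Data Owner" -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DigitalTwins/digitalTwinsInstances/<instance-name>"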




Quick overview of Azure Data Explorer
Azure Data Explorer is a big data analytics platform that makes it simple to evaluate large amounts of data quickly. With the Azure Data Explorer toolset, you get a complete end-to-end solution for data ingestion, query, visualization, and administration.
Azure Data Explorer makes it simple to extract critical insights, identify patterns and trends, and develop forecasting models by analyzing structured, semi-structured, and unstructured data over time series. Azure Data Explorer is helpful for log analytics, time series analytics, IoT, and general-purpose exploratory analytics. It is completely managed, scalable, secure, resilient, and enterprise ready.
Azure Data Explorer can ingest terabytes of data in minutes, in batch or streaming mode, and query petabytes of data with millisecond response times. Data can be ingested in a variety of structures and formats, and it can come from a number of channels and sources.
Azure Data Explorer uses the Kusto Query Language (KQL), an open-source language created by the team. The language is straightforward to learn and comprehend, and it is quite effective: both basic operators and sophisticated analyses are available. Microsoft makes extensive use of it (Azure Monitor – Log Analytics and Application Insights, Microsoft Sentinel, and Microsoft 365 Defender). KQL is designed with fast, varied big data exploration in mind, and a query can reference any other tabular expression, including views, functions, and tables; this can span clusters, or tables from various databases.
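As a small illustration of that reach, the following KQL runs against the Samples database on Microsoft’s public help cluster; a sketch you can paste into the Azure Data Explorer web UI:

// Query a table in another cluster/database, then aggregate
cluster('help').database('Samples').StormEvents
| where DamageProperty > 0
| summarize EventCount = count() by State
| top 5 by EventCount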
Each table’s data is kept as data shards, commonly referred to as ‘extents’. All data is automatically indexed and partitioned based on ingestion time. Unlike a relational database, there are no primary/foreign key requirements or other constraints, such as uniqueness. As a result, you can store a wide variety of data and query it quickly thanks to the way it is stored.
Azure Data Explorer
ADX Dashboards is a newly released feature from Microsoft that provides the ability to create various types of data visualizations in one place, in an easy-to-digest form. Dashboards can be shared broadly and allow multiple stakeholders to view dynamic, real-time, fresh data while easily interacting with it to gain the desired insights. With ADX Dashboards, you can build a dashboard, share it with others, and empower them to continue their data exploration journey. You could say this is a cheap alternative to PowerBI.
1. Create an Azure Data Explorer cluster


*Create a new resource group if required.



2. Add the Help database to the visualization dashboard




// The 10 most recent storm events
StormEvents
| sort by StartTime desc
| take 10

// The 10 most recent storm events, projecting only selected columns
StormEvents
| sort by StartTime desc
| project StartTime, EndTime, State, EventType, DamageProperty, EpisodeNarrative
| take 10

// States with more than 1,800 events, rendered as a column chart
StormEvents
| summarize event_count=count(), mid = avg(BeginLat) by State
| sort by mid
| where event_count > 1800
| project State, event_count
| render columnchart
3. Finally, we are ready to create the visualization dashboard








StormEvents
| summarize event_count=count(), mid = avg(BeginLat) by State
| sort by mid
| where event_count > 1800
| project State, event_count
| render columnchart




