
Import-Module : Function Remove-MgSiteTermStoreSetParentGroupSetTermRelation cannot be created because function capacity 4096 has been exceeded for this scope

Explained everything in the Video : https://youtu.be/7-btaMI6wJI

Today I got the error below when I tried to import the Microsoft Graph module into PowerShell (Import-Module Microsoft.Graph).

Import-Module : Function Remove-MgSiteTermStoreSetParentGroupSetTermRelation cannot be created because function capacity 4096 has been exceeded for this scope.
At line:1 char:1
+ Import-Module Microsoft.Graph
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Remove-MgSiteTe...SetTermRelation:String) [Import-Module], SessionStateOverflowException
    + FullyQualifiedErrorId : FunctionOverflow,Microsoft.PowerShell.Commands.ImportModuleCommand

After a lot of research I identified that this is a known issue that occurs when importing the Microsoft.Graph module in PowerShell 5.1: the module exposes more than 4,096 functions, and PowerShell 5.1 limits the number of functions that can be created in a single scope.

There are some possible workarounds that you can try to resolve this error:

  • Upgrade to PowerShell 7 or later as the runtime version (highly recommended). PowerShell 7+ does not enforce the function capacity limit and can import the module without errors.
  • Set the $MaximumFunctionCount variable to its maximum value, 32768, before importing the module. This increases the function capacity limit for your scope, but it may not be enough if you have other modules loaded or if the Microsoft.Graph module adds more functions in the future.

    To check all the maximum-count limits currently set in PowerShell:
    gv Max*Count

    To raise the maximum function count before importing the module:
    $MaximumFunctionCount = 32768
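
Putting it all together, here is a minimal sketch of the workaround for Windows PowerShell 5.1. The sub-module names below are only examples of my own choosing; import whichever Microsoft.Graph.* sub-modules contain the cmdlets you actually need instead of the full meta-module.

# Confirm which PowerShell you are running; 7+ does not enforce the 4096-function limit
$PSVersionTable.PSVersion

# On Windows PowerShell 5.1, raise the per-scope function limit before importing
$MaximumFunctionCount = 32768

# Importing only the sub-modules you need keeps the function count well under the limit
Import-Module Microsoft.Graph.Users
Import-Module Microsoft.Graph.Sites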

I hope this helps you understand how to fix the error and import the Microsoft.Graph module successfully.

Classic application insights will be retired on 29 February 2024—migrate to workspace-based application insights

I recently got an email which says ‘Classic application insights will be retired on 29 February 2024—migrate to workspace-based application insights’.

It further adds that classic Application Insights in Azure Monitor will be retired, and you’ll need to migrate your resources to workspace-based Application Insights by that date.

Moving to workspace-based Application Insights offers improved functionality, such as support for all the capabilities of Log Analytics, including Customer-Managed Keys and Commitment Tiers.

Now let’s log on to Azure and see what this means and how to migrate to workspace-based Application Insights.

Open the Application Insights resource and click the banner that reads ‘Classic Application Insights is deprecated and will be retired in February 2024. Migrate this resource to Workspace-based Application Insights to gain support for all of the capabilities of Log Analytics, including Customer-Managed Keys and Commitment Tiers. Click here to learn more and migrate in a few clicks’.

You will now see the subscription and the new Log Analytics workspace that will be created. Just click Apply at the bottom of the screen.

Once done, you will see the new workspace created, with all the content of the classic Application Insights resource in it.

You can also watch the whole process in my YouTube video:
https://www.youtube.com/watch?v=gvQp_33ezqg

Get duplicate files in all SharePoint sites using file HASH

Explained everything in the Video : https://youtu.be/WHk2tIav-sQ

The task was to find a way to identify all duplicate files in all SharePoint sites. I searched online for a solution, but none of the scripts I found were accurate. They used different criteria to detect duplicates, such as Name, modified date, size, etc.

After extensive research, I developed the following script, which generates a hash for each file on the SharePoint sites. The hash is a fingerprint of the file content: two files produce the same hash only if their content is identical, and even a single-character difference produces a different hash.
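
As a quick local illustration of why hashing works for this (a minimal sketch; the file paths are just examples), PowerShell's built-in Get-FileHash returns the same value only for byte-identical content:

# Two byte-identical copies of a document produce the same MD5 hash
Get-FileHash -Algorithm MD5 -Path "C:\Temp\Report.docx", "C:\Temp\Report-Copy.docx"

# A copy that differs by even one character produces a completely different hash
Get-FileHash -Algorithm MD5 -Path "C:\Temp\Report-Edited.docx"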

If you want to do the same for only one SharePoint site, you can use the link below: Get duplicate files in a SharePoint site using file HASH (itfreesupport.com).

I hope this script will be useful for many people.

Register a new Azure AD Application and Grant Access to the tenant

Register-PnPManagementShellAccess

Then paste and run the PnP script below:

Parameters

$TenantURL = "https://tenant-admin.sharepoint.com"
$Pagesize = 2000
$ReportOutput = "C:\Temp\DupSitename.csv"

Connect to SharePoint Online tenant

Connect-PnPOnline $TenantURL -Interactive
Connect-SPOService -Url $TenantURL

Array to store results

$DataCollection = @()

Get all site collections

$SiteCollections = Get-SPOSite -Limit All -Filter "Url -like '/sites/'"

Iterate through each site collection

ForEach ($Site in $SiteCollections)
{
#Get the site URL
$SiteURL = $Site.Url

#Connect to SharePoint Online site
Connect-PnPOnline $SiteURL -Interactive

#Get all Document libraries
$DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -Notin ("Site Pages","Style Library", "Preservation Hold Library")}

#Iterate through each document library
ForEach ($Library in $DocumentLibraries)
{
    #Get All documents from the library
    $global:counter = 0;
    $Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock `
    { Param ($items) $global:counter += $items.Count; Write-Progress -PercentComplete ($global:Counter / ($Library.ItemCount) * 100) -Activity `
    "Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:Counter of $($Library.ItemCount)";} | Where {$_.FileSystemObjectType -eq "File"}
    $ItemCounter = 0

    #Iterate through each document
    Foreach ($Document in $Documents)
    {
        #Get the File from Item
        $File = Get-PnPProperty -ClientObject $Document -Property File

        #Get The File Hash
        $Bytes = $File.OpenBinaryStream()
        Invoke-PnPQuery
        $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
        $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))

        #Collect data
        $Data = New-Object PSObject
        $Data | Add-Member -MemberType NoteProperty -name "FileName" -value $File.Name
        $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -value $HashCode
        $Data | Add-Member -MemberType NoteProperty -Name "URL" -value $File.ServerRelativeUrl
        $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -value $File.Length
        $DataCollection += $Data
        $ItemCounter++
        Write-Progress -PercentComplete ($ItemCounter / ($Library.ItemCount) * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" `
        -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
}

}
}

Get Duplicate Files by Grouping Hash code

$Duplicates = $DataCollection | Group-Object -Property HashCode | Where {$_.Count -gt 1} | Select -ExpandProperty Group
Write-Host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-table -AutoSize

Export the duplicate results to CSV

$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation

Get duplicate files in a SharePoint site using file HASH

Explained everything in the Video : https://youtu.be/WHk2tIav-sQ

The task was to find a way to identify all duplicate files in a SharePoint site. I searched online for a solution, but none of the scripts I found were accurate. They used different criteria to detect duplicates, such as Name, modified date, size, etc.

After extensive research, I developed the following script, which generates a hash for each file on the SharePoint site. The hash is a fingerprint of the file content: two files produce the same hash only if their content is identical, and even a single-character difference produces a different hash.

If you want to do the same for all SharePoint sites, you can use the link below:

Get duplicate files in all SharePoint sites using file HASH (itfreesupport.com)

I hope this script will be useful for many people.

Register a new Azure AD Application and Grant Access to the tenant

Register-PnPManagementShellAccess

Then paste and run the PnP script below:

$SiteURL = "https://tenant.sharepoint.com/sites/sitename"
$Pagesize = 2000
$ReportOutput = "C:\Temp\DupSitename.csv"

Connect to SharePoint Online site

Connect-PnPOnline $SiteURL -Interactive

Array to store results

$DataCollection = @()

Get all Document libraries

$DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -Notin ("Site Pages","Style Library", "Preservation Hold Library")}

Iterate through each document library

ForEach ($Library in $DocumentLibraries)
{
#Get All documents from the library
$global:counter = 0;
$Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock { Param ($items) $global:counter += $items.Count; Write-Progress -PercentComplete ($global:Counter / ($Library.ItemCount) * 100) -Activity `
"Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:Counter of $($Library.ItemCount)";} | Where {$_.FileSystemObjectType -eq "File"}
$ItemCounter = 0

#Iterate through each document
Foreach ($Document in $Documents)
{
    #Get the File from Item
    $File = Get-PnPProperty -ClientObject $Document -Property File

    #Get The File Hash
    $Bytes = $File.OpenBinaryStream()
    Invoke-PnPQuery
    $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
    $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))

    #Collect data
    $Data = New-Object PSObject
    $Data | Add-Member -MemberType NoteProperty -name "FileName" -value $File.Name
    $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -value $HashCode
    $Data | Add-Member -MemberType NoteProperty -Name "URL" -value $File.ServerRelativeUrl
    $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -value $File.Length
    $DataCollection += $Data
    $ItemCounter++
    Write-Progress -PercentComplete ($ItemCounter / ($Library.ItemCount) * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" `
    -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
}

}

Get Duplicate Files by Grouping Hash code

$Duplicates = $DataCollection | Group-Object -Property HashCode | Where {$_.Count -gt 1} | Select -ExpandProperty Group
Write-Host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-table -AutoSize

Export the duplicate results to CSV

$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation

Download content from other users' OneDrive using the Graph API

Explained the same on my YouTube channel: https://www.youtube.com/watch?v=YG19odJN94Q

If you want to download content from other users’ OneDrive accounts, you have a few options. The simplest is to create a OneDrive link for each user from the Office 365 admin center, which lets you access and download their files through a web browser. However, this is not very efficient if you have to do it for many users. A better option is to use the Graph API calls from a custom application, which can download the files for multiple users at once without requiring you to create individual links. To take that route, follow these steps to work out the API calls for the custom application.

Open the Graph Explorer URL: https://developer.microsoft.com/en-us/graph/graph-explorer

Sign in and grant consent to Graph Explorer using the user icon in the top-right corner of the screen.

Enter the credentials of the user you want to use and grant consent.

Now run the initial API call and confirm you get a 200 OK status and your own details in the response view.

The first API URL, which lets you query another user's OneDrive content, is https://graph.microsoft.com/v1.0/users/[email protected]/drive/root/children

Sometimes, when you try to download content from other users’ OneDrive accounts, you may encounter an access denied error. This means that you do not have permission to access their files. To fix this error, do one of the following:

  • Make sure that you are a Global Admin for the organization. This role gives you the highest level of access to all the resources in the organization, including other users’ OneDrive accounts.
  • Grant admin access to other users’ OneDrive accounts by following these steps:
  1. In the left pane, select Admin centers > SharePoint. (You might need to select Show all to see the list of admin centers.)
  2. If the classic SharePoint admin center appears, select Open it now at the top of the page to open the SharePoint admin center.
  3. In the left pane, select More features.
  4. Under User profiles, select Open.
  5. Under People, select Manage User Profiles.
  6. Enter the target user’s name and select Find.
  7. Right-click the user, and then choose Manage site collection owners.
  8. Add the user to Site collection administrators and select OK.

This option allows you to specify which users’ OneDrive accounts you want to access, without requiring you to be a Global Admin.
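
If you prefer PowerShell over the portal steps above, here is a minimal sketch using the SharePoint Online Management Shell; the tenant name, personal-site URL, and account name are placeholders of my own, not values from this post:

# Connect to the SharePoint Online admin endpoint
Connect-SPOService -Url https://tenant-admin.sharepoint.com

# Grant an admin account site collection admin rights on the target user's OneDrive
# (OneDrive personal sites follow the pattern https://tenant-my.sharepoint.com/personal/<user>_<domain>_<tld>)
Set-SPOUser -Site "https://tenant-my.sharepoint.com/personal/user_contoso_com" -LoginName "[email protected]" -IsSiteCollectionAdmin $true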

After you successfully execute the above API call, you will get a response listing all the folders and files in the root of the other user's OneDrive account. Each folder and file has a unique ID that you can use to access its contents. To get downloadable links for all the files in a specific folder, use the following API call:

https://graph.microsoft.com/v1.0/users/[email protected]/drive/items/FOLDERGUID/children?$select=content.downloadUrl,ID,name

This call returns a response that contains a download link for each file in the folder. You can download any file by opening its link; the file is saved to your local device.
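
If you want to script these calls instead of clicking through Graph Explorer, here is a minimal PowerShell sketch. It assumes you already have a delegated access token in $AccessToken (for example, copied from Graph Explorer's Access token tab), it reuses the placeholder user and FOLDERGUID from the URLs above, and the local export folder is just an example.

# Build the authorization header from the existing token
$Headers = @{ Authorization = "Bearer $AccessToken" }

# List the root of the other user's OneDrive (same call as the first API URL above)
$Root = Invoke-RestMethod -Method Get -Headers $Headers -Uri "https://graph.microsoft.com/v1.0/users/[email protected]/drive/root/children"
$Root.value | Select-Object name, id

# Get download links for every file in a specific folder (same call as above)
$Items = Invoke-RestMethod -Method Get -Headers $Headers -Uri "https://graph.microsoft.com/v1.0/users/[email protected]/drive/items/FOLDERGUID/children?`$select=content.downloadUrl,id,name"

# Each file item carries a pre-authenticated '@microsoft.graph.downloadUrl' that can be fetched directly
New-Item -ItemType Directory -Path "C:\Temp\OneDriveExport" -Force | Out-Null
foreach ($Item in $Items.value) {
    $DownloadUrl = $Item.'@microsoft.graph.downloadUrl'
    if ($DownloadUrl) {
        Invoke-WebRequest -Uri $DownloadUrl -OutFile (Join-Path "C:\Temp\OneDriveExport" $Item.name)
    }
}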

Office 365: get all sites with their size in GB

Explained everything in the Video : https://youtu.be/rqn1KNYKlhE

Recently I was asked for a script or command that can get all sites in Office 365 with their size in GB. I looked for ways to do this but could not find anything that helped. I then tried ChatGPT and Bard, but the scripts they produced didn't work for me and kept throwing errors. One of the non-working commands and its error are shown below:

THIS COMMAND IS NOT WORKING

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name=”Size (GB)”;Expression={ [math]::round ($_.length/1GB,4)}} Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name=”Size (GB)”;Expression={ [math]::round ($_.length/1GB,4)}} | Export-CSV “C:\SharePoint-Online-Sites.csv” -NoTypeInformation -Encoding UTF8

At line:1 char:115
+ ... {Name="StorageUsageCurrent (GB)";Expression={ [math]::Round ($_.Stora ...
+                                                                 ~
Unexpected token '(' in expression or statement.
At line:1 char:203
+ ... 2)}}, @{Name="StorageQuota (GB)";Expression={ [math]::Round ($_.Stora ...
+                                                                 ~
Unexpected token '(' in expression or statement.
    + CategoryInfo          : ParserError: (:) [], ParentContainsErrorRecordException
    + FullyQualifiedErrorId : UnexpectedToken

I did a lot of research, found the issue in the above command, and worked out a way to get all sites with their size in GB.

THIS IS WORKING COMMAND

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ ($_.StorageUsageCurrent/1024)}}

If you also want to export the results to a CSV file, use the command below:

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ ($_.StorageUsageCurrent/1024)}} | Export-CSV "C:\SharePoint-Online-Sites.csv" -NoTypeInformation -Encoding UTF8

You will also need to connect to your Office 365 tenant first; use the commands below to do so:

Install-Module -Name Microsoft.Online.SharePoint.PowerShell

Connect-SPOService -Url https://YOURTENANTNAME-admin.sharepoint.com

Note: if you are using GCC High, connect to your .us admin URL (https://YOURTENANTNAME-admin.sharepoint.us) and add '-Region ITAR' to the Connect-SPOService command above.

Estimating Physical Resources in Quantum Computing with Microsoft Azure

Resource estimation is a vital part of quantum computing, and Microsoft Azure offers a dedicated tool, the Azure Quantum Resource Estimator, to support it. Resource estimation evaluates the computational needs of quantum algorithms and helps optimize how computing resources are used. With the Azure Quantum Resource Estimator, users gain insight into the demands of their algorithms and can make informed decisions about resource allocation, ultimately improving the efficiency and performance of quantum computing tasks.

The Importance of Resource Estimation

Resource estimation is the process of gauging what is required to execute a quantum program: the number of qubits, the depth of the quantum circuit, and the number of quantum operations.

Estimating these resources is crucial for several reasons:

  1. It helps quantum programmers write efficient code by showing how resources are allocated and where their use can be streamlined.
  2. It shows whether a specific quantum program is feasible on a given piece of quantum hardware.
  3. Because quantum computing is expensive, resource estimation also supports cost management by providing an empirical basis for predicting what running a program will cost.

Using the Azure Quantum Resource Estimator

The Azure Quantum Resource Estimator is a tool provided by Microsoft Azure that helps in estimating the resources required to run a quantum program. Here’s how you can use it:

  1. Write your quantum program in Q#, Microsoft's quantum programming language.
  2. Run the program through the Azure Quantum Resource Estimator, which produces an estimate of the resources needed to execute it.
  3. Analyze the estimation results to understand where the program can be optimized.

Conclusion

Resource estimation is foundational to quantum computing, and the Azure Quantum Resource Estimator gives you a practical way to quantify those resources and design efficient, cost-effective quantum algorithms.

Interacting with Azure Quantum Cloud Service

The field of quantum computing is advancing rapidly, and Microsoft Azure is at the forefront of this transformation. Through Azure Quantum, you can write your quantum algorithms and execute them on real quantum processors. The process unfolds as follows:

Step 1: Writing Your Quantum Programs

Firstly, you compose your quantum algorithms. This is achievable through Q#, an exclusive programming language crafted by Microsoft for articulating quantum computations. Q# is encompassed within the Quantum Development Kit (QDK), comprising an array of open-source utilities that facilitate developers in coding, assessing, and debugging quantum algorithms.

Step 2: Accessing Azure Quantum

Once your quantum algorithms are written, the next step is to access Azure Quantum. Open the Azure Quantum website in your web browser; if you don't have an Azure account, you'll need to create one. Once authenticated, you gain access to the Azure Quantum service.

Step 3: Running Your Quantum Programs

Once you have access to Azure Quantum, you can run your quantum programs on real quantum processors. Navigate to the ‘Jobs’ section of your Azure Quantum workspace and submit your quantum programs as jobs to be executed on the quantum hardware. You can choose from a range of quantum hardware providers, so you have the resources you need to explore the power of quantum computing.

Step 4: Monitoring Your Jobs

After submitting your jobs, you can monitor their progress directly from your Azure Quantum workspace. Azure Quantum shows detailed information about each job, including its status, the quantum hardware it is running on, and its results.

Conclusion

Interacting with Azure Quantum Cloud Service is a straightforward process. With just a browser and an Azure account, you can start exploring the exciting world of quantum computing. Whether you’re a seasoned quantum computing professional or a curious beginner, Azure Quantum provides the tools and resources you need to dive into quantum computing.

Quantum Intermediate Representation (QIR) in Azure Quantum

In quantum computing, interoperability and hardware independence are paramount. Microsoft Azure Quantum achieves both by leveraging Quantum Intermediate Representation (QIR), a common format that lets the same code target different quantum hardware platforms.

Understanding QIR

Quantum Intermediate Representation (QIR) is a hardware-agnostic intermediate representation for quantum programs. It is based on LLVM, a widely used open-source compiler project, and provides a language-agnostic way to express program code in a common intermediate form.

QIR in Azure Quantum

Azure Quantum uses QIR to target diverse quantum machines. Developers can therefore write their quantum program once and execute it on different quantum hardware without rewriting code. This is a notable advantage: developers can concentrate on formulating their quantum algorithms without worrying about the specific details of the underlying hardware.

Benefits of QIR

The use of QIR in Azure Quantum has several benefits:

  1. Hardware Independence: Quantum Intermediate Representation (QIR) facilitates hardware independence, allowing quantum algorithms to be authored once and executed on any quantum processor compatible with QIR’s specifications.
  2. Interoperability: QIR fosters seamless interoperability across diverse quantum programming languages, streamlining collaboration among developers and facilitating the exchange of code.
  3. Optimization: QIR empowers the implementation of sophisticated optimization methodologies, enhancing the efficiency and efficacy of quantum algorithms.

Conclusion

Using Quantum Intermediate Representation (QIR) in Azure Quantum is a pivotal advancement for quantum computing. QIR keeps code compatible across diverse quantum hardware, fosters interoperability, and frees developers from hardware constraints, so they can focus on the essential task: writing powerful and effective quantum algorithms.

Writing Quantum Code with Microsoft Azure

The field of quantum computing is advancing rapidly, and Microsoft Azure is at the forefront of this transformation. With Azure Quantum, you can write your Q# program using the hosted Jupyter Notebooks in your Azure Quantum workspace.

Step 1: Accessing Azure Quantum

The first step is to access Azure Quantum itself. Visit the Azure Quantum website in your web browser; if you don't have an Azure account, you will need to create one. After logging in, you can enter the Azure Quantum service.

Step 2: Navigating to Jupyter Notebooks

Once inside Azure Quantum, the next step is to navigate to the Jupyter Notebooks section. Jupyter Notebooks is an open-source web application for creating and sharing documents that combine live code, equations, visualizations, and narrative text.

Step 3: Writing Your Q# Program

Within Jupyter Notebooks, you can start writing your Q# program. Q# is a specialized programming language for expressing quantum algorithms, and it is used together with the Quantum Development Kit (QDK), which includes the Q# language, quantum simulators, libraries, and a range of supporting tools.

Step 4: Using Libraries

To keep your code high level, you can use libraries. Libraries in Q# provide reusable pieces of code that can be called from your Q# program. They include operations (the basic unit of quantum execution in Q#), functions (code that helps to process information within the quantum algorithm), and types (abstractions that are used to represent and manipulate quantum states).

Step 5: Running Your Quantum Code

After writing your Q# program, you can run it directly in the Jupyter Notebook. You can also debug and test your quantum code using the Quantum Development Kit.

Conclusion

Writing quantum code with Microsoft Azure is a straightforward and accessible process. With just a web browser and an Azure account, you can start exploring the exciting world of quantum computing. Whether you are a seasoned quantum computing professional or a curious beginner, Azure Quantum provides the tools and resources you need.