
Read Microsoft Teams messages using the Graph API

Explained the same in a YouTube video: https://youtu.be/Uez9QrBNNS0

Today I had a requirement to create a custom app that can read all Microsoft Teams messages and use AI to reply to them.

The first thing we need is a way to read all messages in Microsoft Teams using the Graph API. I looked for options and finally found a Teams beta API named ‘messages (without replies) in a channel’, but it needs a group ID and a channel ID to read the messages.

So I first ran the Graph API call below to get all the teams I am a member of; the response includes both the name and the ID of each team.

https://graph.microsoft.com/v1.0/me/joinedTeams

Next, we use the group ID in the Graph query below to get all channels in that team.

https://graph.microsoft.com/beta/teams/{group-id-for-teams}/allChannels

Finally, once we have both the group ID and the channel ID, we use the Graph API below to get all the messages in that channel.

Note: I was only able to see the unread messages.

https://graph.microsoft.com/beta/teams/{group-id-for-teams}/channels/{channel-id}/messages

Below, I can see the same messages in the Teams client itself.
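
If you want to chain these three calls together, below is a minimal PowerShell sketch using Invoke-RestMethod. It assumes you already have an OAuth access token with a permission that can read channel messages (such as ChannelMessage.Read.All); the token value is a placeholder.

# Placeholder access token with ChannelMessage.Read.All (or similar) permission
$token   = "YOUR-ACCESS-TOKEN"
$headers = @{ Authorization = "Bearer $token" }

# 1. Get all teams I am a member of (returns the name and ID of each team)
$teams = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me/joinedTeams" -Headers $headers

# 2. Get all channels of the first team
$teamId   = $teams.value[0].id
$channels = Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/teams/$teamId/allChannels" -Headers $headers

# 3. Get all messages in the first channel
$channelId = $channels.value[0].id
$messages  = Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/teams/$teamId/channels/$channelId/messages" -Headers $headers
$messages.value | ForEach-Object { $_.body.content }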

Ways to add an item to SharePoint lists

Explained the same in a YouTube video: https://youtu.be/xioKl4KrlLo

To use Graph API to add content to a SharePoint list, you need to have the appropriate permissions, the site ID, the list ID, and the JSON representation of the list item you want to create. You can use the following HTTP request to create a new list item:

POST https://graph.microsoft.com/v1.0/sites/{site-id}/lists/{list-id}/items
Content-Type: application/json

{
  "fields": {
    // your list item properties
  }
}

To get the site ID, use the URL below:
https://TenantName.sharepoint.com/sites/SiteName/_api/site

In the XML response, the ID that appears right after the HubSiteId tag is your site ID.
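
If you prefer to stay inside the Graph API, you can also resolve the site ID by path; below is a minimal sketch, with the tenant and site names as placeholders and an access token assumed.

# Placeholder access token with Sites.Read.All (or similar) permission
$headers = @{ Authorization = "Bearer YOUR-ACCESS-TOKEN" }
$site = Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/sites/TenantName.sharepoint.com:/sites/SiteName"
$site.id   # composite ID: hostname,site-collection-GUID,web-GUID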

For example, if you want to create a list item with the title “Item2”, the color “Pink”, and the Number 1, you can use this JSON body:

{
  "fields": {
    "Title": "Item2",
    "Color": "Pink",
    "Number": 1
  }
}

If the request is successful, you will get a response with the created list item and its properties.
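
If you would rather send the request from a script than from Graph Explorer, below is a minimal PowerShell sketch that posts the same JSON body with Invoke-RestMethod; the token, site ID, and list ID are placeholders you must supply.

# Placeholder values
$headers = @{ Authorization = "Bearer YOUR-ACCESS-TOKEN" }
$siteId  = "YOUR-SITE-ID"
$listId  = "YOUR-LIST-ID"

# Build the list item fields and post them to the list
$body = @{
    fields = @{
        Title  = "Item2"
        Color  = "Pink"
        Number = 1
    }
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/sites/$siteId/lists/$listId/items" `
    -Headers $headers -ContentType "application/json" -Body $body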

You can also use the Graph SDKs to simplify the process of creating list items in different programming languages. For example, here is how you can create a list item in C# using the Microsoft Graph SDK:

using Microsoft.Graph;
using Microsoft.Identity.Client;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Create a public client application (clientId and authority come from your Azure AD app registration)
var app = PublicClientApplicationBuilder.Create(clientId)
    .WithAuthority(authority)
    .Build();

// Get the access token
var result = await app.AcquireTokenInteractive(scopes)
    .ExecuteAsync();

// Create a Graph client
var graphClient = new GraphServiceClient(
    new DelegateAuthenticationProvider((requestMessage) =>
    {
        requestMessage
            .Headers
            .Authorization = new AuthenticationHeaderValue("Bearer", result.AccessToken);

        return Task.CompletedTask;
    }));

// Create a dictionary of fields for the list item
var fields = new FieldValueSet
{
    AdditionalData = new Dictionary<string, object>()
    {
        {"Title", "Item"},
        {"Color", "Pink"},
        {"Number", 1}
    }
};

// Create the list item
var listItem = await graphClient.Sites[siteId].Lists[listId].Items
    .Request()
    .AddAsync(fields);
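
Note that clientId, authority, scopes, siteId, and listId in the snippet above are placeholders from your own Azure AD app registration and SharePoint site, and the signed-in user needs a Graph permission that allows writing to the list, such as Sites.ReadWrite.All.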

Import-Module : Function Remove-MgSiteTermStoreSetParentGroupSetTermRelation cannot be created because function capacity 4096 has been exceeded for this scope

Explained everything in the video: https://youtu.be/7-btaMI6wJI

Today I got the error below when I tried to import the Microsoft Graph module into PowerShell (Import-Module Microsoft.Graph).

Import-Module : Function Remove-MgSiteTermStoreSetParentGroupSetTermRelation cannot be created because function capacity 4096 has been exceeded for this scope.
At line:1 char:1
+ Import-Module Microsoft.Graph
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Remove-MgSiteTe...SetTermRelation:String) [Import-Module], SessionSta
   teOverflowException
    + FullyQualifiedErrorId : FunctionOverflow,Microsoft.PowerShell.Commands.ImportModuleCommand

After a lot of research, I identified that this is a known issue when importing the Microsoft.Graph module in Windows PowerShell 5.1: the module has more than 4096 functions, and PowerShell 5.1 limits the number of functions that can be created in a scope.

There are some possible workarounds that you can try to resolve this error:

  • Upgrade to PowerShell 7 or later (highly recommended). PowerShell 7+ does not have this function capacity limit and can import the module without errors.
  • Set the $MaximumFunctionCount variable to its maximum value, 32768, before importing the module. This increases the function capacity limit for your scope, but it may not be enough if you have other modules loaded or if the Microsoft.Graph module adds more functions in the future (see the sketch after this list).

    To check the maximum counts currently set in PowerShell:
    gv Max*Count

    To set the maximum function count:
    $MaximumFunctionCount = 32768
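
Because Microsoft.Graph is a rollup of many smaller sub-modules, a further workaround is to import only the sub-module you actually need instead of the whole rollup. A minimal sketch, assuming the Users cmdlets are what you need:

# Load only the sub-modules required, which stays well under the 4096-function limit
Import-Module Microsoft.Graph.Authentication
Import-Module Microsoft.Graph.Users

# Sign in and run a quick test query
Connect-MgGraph -Scopes "User.Read.All"
Get-MgUser -Top 5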

I hope this helps you understand how to fix the error and import the Microsoft.Graph module successfully.

Classic application insights will be retired on 29 February 2024—migrate to workspace-based application insights

I recently got an email that says ‘Classic application insights will be retired on 29 February 2024—migrate to workspace-based application insights’.

It further adds that classic Application Insights in Azure Monitor will be retired and you’ll need to migrate your resources to workspace-based Application Insights by that date.

Moving to workspace-based Application Insights offers improved functionality, such as support for the full capabilities of Log Analytics, including customer-managed keys and commitment tiers.

Now let’s log on to Azure and see what this means and how to migrate to workspace-based Application Insights.

Open your Application Insights resource, then click the banner that reads ‘Classic Application Insights is deprecated and will be retired in February 2024. Migrate this resource to Workspace-based Application Insights to gain support for all of the capabilities of Log Analytics, including Customer-Managed Keys and Commitment Tiers. Click here to learn more and migrate in a few clicks’.

You will now see the subscription and the new Log Analytics workspace to be created. Just click Apply at the bottom of the screen.

Once done, you will see the new workspace created with all the content of the classic Application Insights resource.
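
I did the migration through the portal, but if you prefer to script it, the Az PowerShell module exposes the same operation. Below is a minimal sketch, assuming the Az.ApplicationInsights module; the resource names and workspace resource ID are placeholders.

# Placeholder names: sign in, then point the classic resource at a Log Analytics workspace
Connect-AzAccount

$workspaceId = "/subscriptions/SUB-ID/resourceGroups/RG-NAME/providers/Microsoft.OperationalInsights/workspaces/WORKSPACE-NAME"

Update-AzApplicationInsights -ResourceGroupName "RG-NAME" -Name "APPINSIGHTS-NAME" `
    -IngestionMode "LogAnalytics" -WorkspaceResourceId $workspaceId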

You can also watch the whole process in my YouTube video:
https://www.youtube.com/watch?v=gvQp_33ezqg

Get duplicate files in all SharePoint sites using file HASH

Explained everything in the video: https://youtu.be/WHk2tIav-sQ

The task was to find a way to identify all duplicate files in all SharePoint sites. I searched online for a solution, but none of the scripts I found were accurate. They used different criteria to detect duplicates, such as Name, modified date, size, etc.

After extensive research, I developed the following script, which generates a hash for each file in the SharePoint sites. The hash acts as a content fingerprint: two files produce the same hash only when their content is identical, and even a one-character difference yields a completely different hash.
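
As a quick standalone illustration of that property (unrelated to SharePoint), the two inputs below differ by one character yet produce completely different MD5 hashes:

$md5 = [System.Security.Cryptography.MD5]::Create()
[BitConverter]::ToString($md5.ComputeHash([Text.Encoding]::UTF8.GetBytes("Hello World")))
[BitConverter]::ToString($md5.ComputeHash([Text.Encoding]::UTF8.GetBytes("Hello World!")))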

If you want to do the same for only one SharePoint site, use the link below: Get duplicate files in SharePoint site using file HASH (itfreesupport.com).

I hope this script will be useful for many people.

Register a new Azure AD Application and Grant Access to the tenant

Register-PnPManagementShellAccess

Then paste and run the PnP script below:

Parameters

$TenantURL = "https://tenant-admin.sharepoint.com"
$Pagesize = 2000
$ReportOutput = "C:\Temp\DupSitename.csv"

Connect to SharePoint Online tenant

Connect-PnPOnline $TenantURL -Interactive
Connect-SPOService $TenantURL

Array to store results

$DataCollection = @()

Get all site collections

$SiteCollections = Get-SPOSite -Limit All -Filter "Url -like '/sites/'"

Iterate through each site collection

ForEach ($Site in $SiteCollections)
{
    #Get the site URL
    $SiteURL = $Site.Url

    #Connect to the SharePoint Online site
    Connect-PnPOnline $SiteURL -Interactive

    #Get all visible, non-empty document libraries, excluding system libraries
    $DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -Notin ("Site Pages","Style Library","Preservation Hold Library")}

    #Iterate through each document library
    ForEach ($Library in $DocumentLibraries)
    {
        #Get all documents from the library
        $global:counter = 0
        $Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock `
            { Param ($items) $global:counter += $items.Count; Write-Progress -PercentComplete ($global:Counter / ($Library.ItemCount) * 100) -Activity `
            "Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:Counter of $($Library.ItemCount)";} | Where {$_.FileSystemObjectType -eq "File"}
        $ItemCounter = 0

        #Iterate through each document
        ForEach ($Document in $Documents)
        {
            #Get the file from the item
            $File = Get-PnPProperty -ClientObject $Document -Property File

            #Compute the MD5 hash of the file content
            $Bytes = $File.OpenBinaryStream()
            Invoke-PnPQuery
            $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
            $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))

            #Collect data
            $Data = New-Object PSObject
            $Data | Add-Member -MemberType NoteProperty -Name "FileName" -Value $File.Name
            $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -Value $HashCode
            $Data | Add-Member -MemberType NoteProperty -Name "URL" -Value $File.ServerRelativeUrl
            $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -Value $File.Length
            $DataCollection += $Data
            $ItemCounter++
            Write-Progress -PercentComplete ($ItemCounter / ($Library.ItemCount) * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" `
                -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
        }
    }
}

Get Duplicate Files by Grouping Hash code

$Duplicates = $DataCollection | Group-Object -Property HashCode | Where {$_.Count -gt 1} | Select -ExpandProperty Group
Write-Host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-table -AutoSize

Export the duplicates results to CSV

$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation

Get duplicate files in SharePoint site using file HASH

Explained everything in the video: https://youtu.be/WHk2tIav-sQ

The task was to find a way to identify all duplicate files in a SharePoint site. I searched online for a solution, but none of the scripts I found were accurate. They used different criteria to detect duplicates, such as Name, modified date, size, etc.

After extensive research, I developed the following script, which generates a hash for each file in the SharePoint site. The hash acts as a content fingerprint: two files produce the same hash only when their content is identical, and even a one-character difference yields a completely different hash.

If you want to do the same for all SharePoint sites, you can use below link:

Get duplicate files in all SharePoint sites using file HASH (itfreesupport.com)

I hope this script will be useful for many people.

Register a new Azure AD Application and Grant Access to the tenant

Register-PnPManagementShellAccess

Then paste and run the PnP script below:

$SiteURL = "https://tenant.sharepoint.com/sites/sitename"
$Pagesize = 2000
$ReportOutput = "C:\Temp\DupSitename.csv"

Connect to SharePoint Online site

Connect-PnPOnline $SiteURL -Interactive

Array to store results

$DataCollection = @()

Get all Document libraries

$DocumentLibraries = Get-PnPList | Where-Object {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $false -and $_.ItemCount -gt 0 -and $_.Title -Notin ("Site Pages","Style Library","Preservation Hold Library")}

Iterate through each document library

ForEach ($Library in $DocumentLibraries)
{
    #Get all documents from the library
    $global:counter = 0
    $Documents = Get-PnPListItem -List $Library -PageSize $Pagesize -Fields ID, File_x0020_Type -ScriptBlock `
        { Param ($items) $global:counter += $items.Count; Write-Progress -PercentComplete ($global:Counter / ($Library.ItemCount) * 100) -Activity `
        "Getting Documents from Library '$($Library.Title)'" -Status "Getting Documents data $global:Counter of $($Library.ItemCount)";} | Where {$_.FileSystemObjectType -eq "File"}
    $ItemCounter = 0

    #Iterate through each document
    ForEach ($Document in $Documents)
    {
        #Get the file from the item
        $File = Get-PnPProperty -ClientObject $Document -Property File

        #Compute the MD5 hash of the file content
        $Bytes = $File.OpenBinaryStream()
        Invoke-PnPQuery
        $MD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
        $HashCode = [System.BitConverter]::ToString($MD5.ComputeHash($Bytes.Value))

        #Collect data
        $Data = New-Object PSObject
        $Data | Add-Member -MemberType NoteProperty -Name "FileName" -Value $File.Name
        $Data | Add-Member -MemberType NoteProperty -Name "HashCode" -Value $HashCode
        $Data | Add-Member -MemberType NoteProperty -Name "URL" -Value $File.ServerRelativeUrl
        $Data | Add-Member -MemberType NoteProperty -Name "FileSize" -Value $File.Length
        $DataCollection += $Data
        $ItemCounter++
        Write-Progress -PercentComplete ($ItemCounter / ($Library.ItemCount) * 100) -Activity "Collecting data from Documents $ItemCounter of $($Library.ItemCount) from $($Library.Title)" `
            -Status "Reading Data from Document '$($Document['FileLeafRef'])' at '$($Document['FileRef'])'"
    }
}

Get Duplicate Files by Grouping Hash code

$Duplicates = $DataCollection | Group-Object -Property HashCode | Where {$_.Count -gt 1} | Select -ExpandProperty Group
Write-Host "Duplicate Files Based on File Hashcode:"
$Duplicates | Format-table -AutoSize

Export the duplicates results to CSV

$Duplicates | Export-Csv -Path $ReportOutput -NoTypeInformation

Download content from other users’ OneDrive using the Graph API

Explained the same on my YouTube channel: https://www.youtube.com/watch?v=YG19odJN94Q

If you want to download content from other users’ OneDrive accounts, you have a few options. The simplest is to create a OneDrive link for each user from the Office 365 admin center, which lets you access and download their files through a web browser. However, this is not very efficient if you have to do it for many users. A better option is to build a custom application on top of the Graph API that can download files for multiple users at once, without requiring you to create individual links. To do that, follow these steps to work out the API calls the custom application needs.

Open the Graph Explorer URL: https://developer.microsoft.com/en-us/graph/graph-explorer

Sign in and give consent to Graph Explorer using the user icon in the right corner of the screen.

Enter the credentials of the user you want to use and give consent.

Run the default API call and confirm you get a 200 OK with your details in the response view.

The first API URL, which lets you query another user’s OneDrive content, is: https://graph.microsoft.com/v1.0/users/[email protected]/drive/root/children

Sometimes, when you try to download content from other users’ OneDrive accounts, you may encounter an access denied error. This means you do not have permission to access their files. To fix this error, do one of the following:

  • Make sure that you are a Global Admin for the organization. This role gives you the highest level of access to all the resources in the organization, including other users’ OneDrive accounts.
  • Grant admin access to other users’ OneDrive accounts by following these steps:
  1. In the left pane, select Admin centers > SharePoint. (You might need to select Show all to see the list of admin centers.)
  2. If the classic SharePoint admin center appears, select Open it now at the top of the page to open the SharePoint admin center.
  3. In the left pane, select More features.
  4. Under User profiles, select Open.
  5. Under People, select Manage User Profiles.
  6. Enter the former employee’s name and select Find.
  7. Right-click the user, and then choose Manage site collection owners.
  8. Add the user to Site collection administrators and select OK.

This option allows you to specify which users’ OneDrive accounts you want to access, without requiring you to be a Global Admin.

After you successfully execute the API calls above, you will get a response listing all the folders and files at the root of the other user’s OneDrive account. Each folder and file has a unique ID that you can use to access its contents. To get downloadable links for all the files in a specific folder, use the following API call:

https://graph.microsoft.com/v1.0/users/[email protected]/drive/items/FOLDERGUID/children?$select=content.downloadUrl,ID,name

This call returns a response containing a link for each file in the folder. You can download any file via its link, and the file will be saved to your local device.
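
To turn those links into actual downloads from a script, below is a minimal PowerShell sketch. It assumes you already have an access token with enough permission (for example via one of the options above); the token, folder ID, and output folder are placeholders.

# Placeholder access token and target user (replace with the target user's UPN)
$headers = @{ Authorization = "Bearer YOUR-ACCESS-TOKEN" }
$user    = "[email protected]"

# List the files in the folder and select the short-lived download URL for each
$items = Invoke-RestMethod -Headers $headers -Uri `
    "https://graph.microsoft.com/v1.0/users/$user/drive/items/FOLDERGUID/children?`$select=content.downloadUrl,id,name"

foreach ($item in $items.value) {
    $url = $item.'@microsoft.graph.downloadUrl'
    if ($url) {
        # The download URL is pre-authenticated, so no Authorization header is needed
        Invoke-WebRequest -Uri $url -OutFile (Join-Path "C:\Temp" $item.name)
    }
}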

Office 365: get all sites with their size in GB

Explained everything in the video: https://youtu.be/rqn1KNYKlhE

Recently I was asked for a script or command that can get all sites in Office 365 with their size in GB. I looked for ways to do this but could not find anything helpful. After some research I tried ChatGPT and Bard, but the scripts they produced didn’t work for me and kept throwing errors. A sample non-working command and its error are shown below:

THIS COMMAND IS NOT WORKING

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ [math]::round ($_.length/1GB,4)}}

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ [math]::round ($_.length/1GB,4)}} | Export-CSV "C:\SharePoint-Online-Sites.csv" -NoTypeInformation -Encoding UTF8

At line:1 char:115
+ ... {Name="StorageUsageCurrent (GB)";Expression={ [math]::Round ($_.Stora ...
+                                                                 ~
Unexpected token '(' in expression or statement.
At line:1 char:203
+ ... 2)}}, @{Name="StorageQuota (GB)";Expression={ [math]::Round ($_.Stora ...
+                                                                 ~
Unexpected token '(' in expression or statement.
    + CategoryInfo          : ParserError: (:) [], ParentContainsErrorRecordException
    + FullyQualifiedErrorId : UnexpectedToken

After a lot of research I found the issues in the command above: PowerShell parses ‘[math]::Round (’ with a space before the parenthesis as an unexpected token, and Get-SPOSite does not expose a byte-length property. The size is returned in MB via StorageUsageCurrent, so dividing by 1024 gives GB.

THIS IS THE WORKING COMMAND

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ ($_.StorageUsageCurrent/1024)}}

If you also want to export the results to a CSV file, use the command below:

Get-SPOSite | Select Title, Url, Owner, SharingCapability, LastContentModifiedDate, @{Name="Size (GB)";Expression={ ($_.StorageUsageCurrent/1024)}} | Export-CSV "C:\SharePoint-Online-Sites.csv" -NoTypeInformation -Encoding UTF8

You will also have to connect to your Office 365 subscription first; use the commands below to do so:

Install-Module -Name Microsoft.Online.SharePoint.PowerShell

Connect-SPOService -Url https://YOURTENANTNAME-admin.sharepoint.com

Note: if you are using GCC High, add ‘-Region ITAR’ to the Connect-SPOService command above (the -ExchangeEnvironmentName parameter belongs to the Exchange Online cmdlets, not Connect-SPOService).

Estimating Physical Resources in Quantum Computing with Microsoft Azure

Resource estimation is a vital part of quantum computing, and Microsoft Azure offers a tool, the Azure Quantum Resource Estimator, to support it. Resource estimation evaluates the computational needs of a quantum algorithm, so that you can make informed decisions about resource allocation, fine-tune resource usage, and improve the efficiency and performance of quantum computing tasks.

The Importance of Resource Estimation

Resource estimation gauges what is required to execute a quantum program: the number of qubits, the depth of the quantum circuit, and the number of quantum operations.

Estimating these resources is crucial for several reasons:

  1. Resource estimation helps quantum programmers write efficient code by tuning how resources are allocated and used.
  2. It also clarifies whether a specific quantum program is feasible to run on a given piece of quantum hardware.
  3. Given how expensive quantum computing is, resource estimation provides an empirical basis for forecasting the cost of running quantum programs.

Using the Azure Quantum Resource Estimator

The Azure Quantum Resource Estimator is a tool provided by Microsoft Azure that helps estimate the resources required to run a quantum program. Here’s how you can use it:

  1. First, write the quantum program in Q#, Microsoft’s quantum programming language.
  2. Next, submit the program to the Azure Quantum Resource Estimator, which returns an estimate of the resources the program needs to execute.
  3. Finally, analyse the estimation results to understand where the program can be optimized.

Conclusion

Resource estimation is foundational to quantum computing, and with the Azure Quantum Resource Estimator you can quantify those resources and design efficient, cost-effective quantum algorithms.

Interacting with Azure Quantum Cloud Service

Quantum computing is advancing rapidly, and Microsoft Azure is at the forefront. Through Azure Quantum, you can write quantum algorithms and execute them on real quantum processors. The process works as follows:

Step 1: Writing Your Quantum Programs

First, you write your quantum algorithms. You can do this in Q#, a programming language Microsoft created for expressing quantum computations. Q# ships with the Quantum Development Kit (QDK), a set of open-source tools that help developers write, test, and debug quantum algorithms.

Step 2: Accessing Azure Quantum

Once your quantum algorithms are written, the next step is to sign in to Azure Quantum. Navigate to the Azure Quantum website in your web browser; if you don’t have an Azure account, you’ll need to create one. Once authenticated, you have access to the Azure Quantum service.

Step 3: Running Your Quantum Programs

With access to Azure Quantum, you can run your quantum programs on real quantum processors. Navigate to the ‘Jobs’ section of your Azure Quantum workspace and submit your quantum programs as jobs to be executed on the quantum hardware. You can choose from a range of quantum hardware providers, so you have the resources you need to explore the power of quantum computing.

Step 4: Monitoring Your Jobs

After submitting your jobs, you can monitor their progress directly from your Azure Quantum workspace. Azure Quantum provides detailed information on each job, including its status, the quantum hardware it is running on, and its results.

Conclusion

Interacting with Azure Quantum Cloud Service is a straightforward process. With just a browser and an Azure account, you can start exploring the exciting world of quantum computing. Whether you’re a seasoned quantum computing professional or a curious beginner, Azure Quantum provides the tools and resources you need to dive into quantum computing.