(this post was written using a SharePoint 2010 environment)
A while ago I was trying to find a way to get a list of all user profiles that didn't have a profile picture uploaded. I searched around online and could only find examples that involved writing PowerShell, which I thought was way too complicated for the task!
So this post will show how you can easily export user profile data from SharePoint and apply some Excel filters to find all users without a profile picture.
This method applies to all on-premises versions of SharePoint and will also work for SharePoint Online.
Get user profile data
To get started, open the Excel desktop app
Press Get Data > From Other Sources > SharePoint List
Enter the root site collection URL for your SharePoint environment
This will open a navigator window, which will display all the available lists within the root site collection.
Scroll down until you find the UserInformationList
Click on it and a preview will load, scroll across the preview and make sure you can see the Picture column
NOTE: at first I was using the Get Data > From Web option but that only brought back the first 30 rows from the UserInformationList.
I didn't want to increase the view limit as there are thousands of rows to display, but the method described above doesn't have that issue.
Now this will take a little while to download all the rows if you have a fair few users, but you'll see when it's finished in the right-hand Queries and Connections pane, as it shows the number of rows downloaded once complete.
Filter your data
The first thing to note about this data is that it also contains some entries you won't need. In my list data I noted that I had SharePoint security and domain groups, service accounts and other general-use/non-user accounts listed. Here are some of the filters and logic I applied:
Filter the Picture column to only show blanks
Filter the Content Type column to only show Person
Or filter the ContentTypeID to only show the corresponding Person content type ID
Filter SIPAddress to only show rows with an organisational email address
That’s all there is to it, a pretty quick and super easy way to get a list of all user profiles without a picture 🙂
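If you ever want to repeat the exercise without Excel, the same filter logic is easy to script against a CSV export of the UserInformationList. This is just a sketch: the column names (Picture, ContentType, SIPAddress) and the sample data are assumptions based on my environment, so check them against your own export first.

```python
import csv
import io

# Stand-in for a CSV export of the UserInformationList. The column names
# (Picture, ContentType, SIPAddress) are assumptions from my environment.
sample = """Name,Picture,ContentType,SIPAddress
Jane Doe,,Person,jane.doe@contoso.com
John Smith,https://contoso.com/photos/jsmith.jpg,Person,john.smith@contoso.com
SP Service,,Person,
Domain Admins,,SharePointGroup,
"""

def users_without_picture(csv_text, domain="contoso.com"):
    """Apply the same filters as the Excel method: blank Picture,
    ContentType of Person, and an organisational SIP address."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row["Name"]
        for row in reader
        if not row["Picture"].strip()
        and row["ContentType"] == "Person"
        and row["SIPAddress"].strip().endswith("@" + domain)
    ]

print(users_without_picture(sample))  # ['Jane Doe']
```

In practice you would read the CSV from disk rather than a string, but the filtering is identical.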
Azure CDN video caching for SharePoint and OneDrive
Microsoft have introduced improved streaming performance for videos stored in SharePoint and OneDrive. Frequently accessed videos will be streamed from the Azure Content Delivery Network (CDN) closest to the user to optimize the playback experience. At all times data will remain within the Microsoft 365 compliance boundary.
Site owners, members and visitors will now see a Share command when they are viewing a list, even when they do not have any list item selected.
Site owners can grant other users access to the list and can specify which permissions to grant. Other users (e.g. site members and visitors) cannot directly grant other users access to the list, but they can use the dialog to send an approval request to site owners if the site is set up to allow access requests (enabled by default).
Several improvements have been made to the image column, including the ability to add an image to a list or library using the list itself or a library form. Also, when users are browsing the list on a mobile device, they will be able to upload an image – including taking a photo with the device’s camera.
Made available through the Microsoft Look Book, these SharePoint site templates are designed for schools and universities. Each brings together news, events, highlighted content, quick links and more, pre-configured and designed with a specific scope and audience for a variety of Education scenarios.
Microsoft have added new features to Whiteboard in Teams to make it easier to keep the creative process moving forward virtually. You can now quickly add sticky notes to a canvas, making it easier to contribute if you’re using a device without a digital pen.
You can now also move and re-order objects on the canvas through a simple drag and drop gesture.
Stay focused throughout the day with Microsoft Teams. Whether you are free for a quick chat or presenting in a meeting, real time presence increases the accuracy of your status so others know when to reach out.
Meeting recording storage for areas where Stream is not available
A new admin setting will allow you to turn on meeting recordings if Microsoft Stream data residency is not yet in country. If this setting is turned on, Teams meeting recordings will be saved in the data center closest to the region.
Optimised Microsoft Teams Experience for VMware Horizon 8
VMware Horizon 8 now offers an enhanced audio and video experience for Microsoft Teams. The optimization pack helps provide a better user experience and improved productivity for those leveraging Microsoft Teams across Horizon virtual desktop (VDI) and published application environments.
Skype for Business Online connector consolidating to Teams module
Microsoft are simplifying the Teams administration experience with a single PowerShell module that includes complete management capabilities for Microsoft Teams and Skype for Business Online, by introducing full functionality of Skype for Business Online Connector into the Teams Module.
This means that the Skype for Business Online Connector, previously a separate product containing many of the cmdlets needed to manage Microsoft Teams, has now been consolidated into a single PowerShell module.
Microsoft have announced that the Teams App Submission API is now generally available. This new Graph API allows all users at any organization to develop on the platform of their choice and submit their apps into Teams with zero friction.
Roster updates from School Data Sync automatically update the Class Notebook roster
Now, School Data Sync (SDS) updates automatically flow to the OneNote Class Notebook! Previously, SDS roster updates would not be applied until the educator clicked on the “Class Notebook” tab in the Class Team.
Set channel specific collaboration and content spaces
Available now, educators can set Channels in class teams to create sections in the Collaboration Space or Content Library, which is especially helpful for educators who like to use channels as separate units. While in a class team, go to the Class Notebook and click “Manage Notebook” to choose which channel a new section will go into.
This allows the educator to create a “read-only” Content Library or student editable “Collaboration Space” specific to each unit and channel.
IT admins can leverage communication compliance policies and AI models to automatically detect inappropriate content, then review and choose to delete.
Messages containing offensive or harassing language and adult, racy, or gory images can be automatically flagged and then removed from the Teams chat or channel by the IT admin. This is not supported in private channels or in communications sent by guest users.
Build an interactive classroom with education apps
There are several third party apps that can be used within Teams to assist in finding ways to keep students engaged and learning throughout the day. All the available apps for engagement, communication, content creation and content & curriculum can be found below.
New lobby setting: only the organizer joins the meeting directly for GCC
A new lobby setting is coming to Teams Meeting Options. We are adding “Only me” as an option to the “Who can bypass lobby?” setting. Once enabled, only the organizer will be able to join the meeting directly. Everyone else, including people from within the same organization, will be sent to the lobby.
Improved Teams meeting join launcher experience for GCC
When you click on a Teams meeting join link, you will now see an optimized and improved join experience. You will be prompted with an option to join on the web, download the Teams client, or join with the native Teams client. This should make joining meetings faster and more reliable.
New policy to prevent upload of profile picture for GCC, GCC High, and DoD
Teams desktop and web experiences will honor the Outlook on the Web mailbox policy setting that controls whether users are able to change their profile pictures. This applies to GCC, GCC High, and DoD tenants.
You can now get information about issues you encounter when working with related tables, entities, controls and components on a form by accessing the Monitoring Tool inside your model driven Power App.
The tool can help identify whether the issue you are seeing is by design out of the box or is due to a customization in the application, and provide details that can help you understand why the issue occurs. To launch the tool, just log into your Dynamics or model-driven app and add &monitor=true to the end of your URL. This will add an icon onto the app header toolbar; click it to open the tool.
After making improvements to the cascade assign and delete options last year to reduce timeouts, the Power Apps team are making similar improvements to merge operations.
Just as with assign and delete, merges will happen in the background. When a merge operation is run on build 9.1.0000.20463 or greater, instead of waiting on the submit screen until the job is completed (which can take several minutes), you will be informed that it will be handled in the background.
Power Apps checker now analyzes modern web syntax (up to ES9)
To ensure Power Apps checker continues to broaden analysis coverage as languages evolve, we recently migrated our web language rules to an ESLint plugin and now offer support for ECMAScript 2018 (ES9) syntax plus ES6 globals.
Power Platform Build Tools now generally available
Now generally available, Power Platform Build Tools allows anyone to set up DevOps for low-code and pro-code application development for Power Apps, Power Automate, Power Virtual Agents and other components supported by CDS solutions.
The GA release also introduces support for Multi Factor Authentication with the introduction of Service Principal Authentication.
On-premises data gateway August 2020 update is now available
The August update for the On-premises data gateway (version 3000.54.8) includes an updated version of the Mashup Engine, which will match the Power BI Desktop August update.
This will ensure that the reports that you publish to the Power BI Service and refresh via the Gateway will go through the same query execution logic/run-time as in the latest Power BI Desktop version.
AutoML now supports applying models in PQO Function Browser
Analysts can now apply any Power BI AutoML model to entities in any dataflow in the same workspace using PQO function browser. With this new capability, users don’t need to be an owner of the dataflow that has the model. They can use models created by others in the same workspace.
Header/ navigation improvements in the Power BI Mobile apps
The Power BI team have announced improvements to the navigation in the Power BI mobile apps to make it easier for users to understand their content hierarchy and to navigate between items quickly, as well as making it easier to share relevant content with colleagues.
The Power BI team have added a new tenant setting for Power BI admins to choose if users can create classic workspaces. This helps organizations control workspace creation more effectively and prevent unwanted workspaces from appearing in Power BI when Office 365 groups are created.
Multiple data lakes support inside Dataflows in Power BI
The Power BI team have announced improvements to Azure Data Lake Storage Gen2 (ADLS Gen2) support inside Dataflows in Power BI. This includes support for workspace admins to bring their own ADLS Gen2 account, improvements to the Dataflows connector in Power BI, take ownership support for dataflows using ADLS Gen2 and also minor improvements to detaching from ADLS Gen2.
Share and collaborate on your bot with subject matter experts
Power Virtual Agents now allows you to share your bot with your teammates, allowing multiple contributors to edit and manage it. Subject matter experts in a team can collaborate on a bot. For example, a customer support bot can now have different subject matter experts working on refund-related topics vs. topics related to membership.
One of the most common requests for Power Automate is to enable a way to trigger a flow when a column is modified in SharePoint. That wish has now been answered as the Power Automate team have announced the new “When an item or file is modified” trigger.
The new “When an item or file is modified” trigger lets you filter to just the list or file modifications you care about, making it much simpler to create a flow that's tuned to the right events.
Provision users to Azure AD from SAP SuccessFactors
With the integration between Azure AD and SAP SuccessFactors, you can now automate user access to applications and resources so that a new hire can be up and running with full access to the necessary applications on day one. The integration also helps you reduce dependencies on IT helpdesk for onboarding and offboarding tasks.
Streamlined process for FastTrack for Microsoft 365
Microsoft have streamlined the process for requesting assistance from FastTrack for Microsoft 365. FastTrack is a benefit that comes with your Microsoft 365 subscription, at no additional cost, for customers with eligible plans of 150+ licenses.
Recently I came across an issue with a SharePoint 2010 publishing site. The site had a page that was being edited, and after a series of web parts were added, it crashed and would no longer load. An additional issue was that there wasn't another, recent version of the page to restore to.
So, the steps below detail how I was able to access the page using web part maintenance mode and delete the problem web part:
Navigate to the problem page’s URL
At the end of the URL add ?contents=1
This will then open the problem page up in web part maintenance mode. From here you are able to close, restore defaults or delete web parts from your page
NOTE: make sure your page is checked out before trying this, otherwise you won't be able to make any changes.
Select the web part(s) which you think are causing the issue
Now select to either close, reset or delete the web part. I chose delete
A warning message will appear > press OK
When writing this post I wondered if this method of accessing web part maintenance mode still worked for modern SharePoint – the answer was no! When you try to open a modern page using ?contents=1 you get this:
However, after reading this handy article from Microsoft about maintenance for client-side web parts in SharePoint Online I just switched my query to ?maintenancemode=true and it worked!
Unlike the classic example, modern web parts in maintenance mode show Summary, Manifest and Data tabs with information about each web part.
If you wish to delete a web part from this view you will need to edit the page and delete it from there, then republish like in the example below:
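As an aside, if you're checking lots of pages it's easy to mangle URLs that already carry a query string when appending ?contents=1 or ?maintenancemode=true by hand. A small generic helper (nothing SharePoint-specific, just standard URL handling) does it safely:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def with_maintenance_flag(url, modern=True):
    """Append the web part maintenance mode flag to a page URL.
    Classic pages use ?contents=1, modern pages ?maintenancemode=true.
    Any existing query string parameters are preserved."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    if modern:
        query["maintenancemode"] = "true"
    else:
        query["contents"] = "1"
    return urlunsplit(parts._replace(query=urlencode(query)))

print(with_maintenance_flag("https://contoso.sharepoint.com/sites/hr/SitePages/Home.aspx"))
# https://contoso.sharepoint.com/sites/hr/SitePages/Home.aspx?maintenancemode=true
```

The site URL above is a placeholder; swap in your own page URL.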
There are loads of URLs, listed here, that I can either never remember or haven't come across before. However, I wanted to keep a list of them on my site just for reference:
So the good news is, if you have a Microsoft 365 subscription then you also have the Migration Manager too! The Migration Manager went into general availability in June 2020 for all customers as part of the SharePoint Online offering.
The Migration Manager is a part of the SharePoint admin center and is really simple and easy to get set up and begin using. Microsoft have this handy image to demonstrate how easy the process is:
Currently, the Migration Manager only supports file share migrations. This means that if you are planning to migrate any other content you hold, for example on-premises SharePoint content or cloud-based content, you will need to use a separate tool.
Microsoft also offer several other tools that can facilitate the migration of that content too. Microsoft provide a table which recommends which tool to use here, but I’ve whittled them down to the main options below:
Migration Manager: used for network and local file share migrations, easy to set up via the SharePoint admin center
SharePoint Migration Tool: used for SharePoint Server 2010, 2013 & 2016 (Public Preview), network & local file shares, requires some prerequisites configuring before installation
Mover: Service for cloud to cloud migration (Dropbox, Google Drive etc.), easy to set up via web platform
Another limitation of the Migration Manager is that it currently does not support third party multi-factor authentication.
So as the above breakdown of each Microsoft migration tool suggests, the Migration Manager is easy to get going. The key consideration when using this tool to migrate file share content is around the volume of data you are migrating.
If you are using the Microsoft FastTrack service, they will recommend that you set up a number of migration machines relative to the total number of files you are migrating.
The recommendation they offer for scoping this is below:
Total file count | Migration machines required
Less than 200,000 |
Between 200,000 and 500,000 |
Between 500,000 and 1,000,000 |
More than 1,000,000 | To be discussed with FastTrack
Once you decide how many migration machines will be required, you will need to set up the migration agents on each machine, check that the prerequisites and required endpoints have been reviewed and met, and give the accounts used in the migration process access to the file share content and the relevant SharePoint admin roles.
File and folder permissions
When prepping for your file share migration, another consideration will be the permissions of a file once it is migrated. For most organisations it comes down to the following scenarios:
1. We want all file/ folder permissions preserved when they migrate to SharePoint Online
2. We only want specific file/ folder permissions mapped into SharePoint Online
3. We don’t want any of the permissions migrating, we want to start again!
Yes, I know the third option is highly unlikely, but I'll include it anyway. When using the Migration Manager, the synchronization between your on-premises Active Directory and Azure Active Directory is key to how permissions migrate across.
It is important to note that if your organisation uses security groups in Active Directory to manage permissions for their file shares, they must be synchronized with Azure Active Directory in order for the user permissions to map across like-for-like.
If not, then you will be required to create a user mapping file to map user permissions for the relevant files or folders.
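If you do end up needing a user mapping file, generating it by script keeps it consistent. The headerless, three-column layout below (source identity, target identity, group flag) follows the SharePoint Migration Tool's mapping file format as I understand it; verify against Microsoft's documentation before relying on it, and treat the identities here as placeholders:

```python
import csv

# Hypothetical on-premises identities mapped to Azure AD UPNs; the third
# column is TRUE for security groups and FALSE for individual users.
mappings = [
    ("CONTOSO\\jdoe", "jane.doe@contoso.com", "FALSE"),
    ("CONTOSO\\FinanceTeam", "finance@contoso.com", "TRUE"),
]

def write_user_mapping(path, rows):
    """Write a headerless three-column user mapping CSV."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

write_user_mapping("usermap.csv", mappings)
```

You would then supply the file via the User mapping file setting in the task configuration.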
The Migration Manager also takes into account the same permissions conditions and results as the SharePoint Migration Tool. This table lists all the conditions and the corresponding results.
Demo: create a scan only task
Before you run any sort of migration task you will first want to get a handle on the current situation of your file share content, what areas will migrate easily and which will require remediation.
Both the scan and migration actions sit within Tasks in the Migration Manager and the initial setup for both actions is the same:
Press add task > under Method select the default single source and destination (unless you wish to scan multiple sources)
Under Source, enter the file share path that you wish to scan using the correct format
Under Destination, leave the SharePoint site URL and location as the default as we are only performing a scan
Under Settings, give your task a name > under common settings ensure perform scan only is checked
Press Run Now
Demo: create a migration task
So once you have run a pre-scan of your file share source and you are happy with the results, it is now time to create a migration task! This is very similar to how we approached creating the pre-scan task, but here are the steps:
Press add task > under Method select the default single source and destination (unless you wish to scan multiple sources)
If you wish to do a bulk upload, you will need to provide a CSV or JSON file which is well documented here
Under Source > enter the file share path that you wish to migrate using the correct format
Under Destination > select the application where you are migrating the data to. Press next
Enter the URL of the location, then select the library or channel you wish to migrate to
Under Settings > enter a name for your task and configure your migration task based on the following check boxes:
Preserve file share permissions
Migrate hidden files
Migrate files created/modified after specified date
Do not migrate files with specific extensions
Migrate files and folders with invalid characters
Migrate OneNote folder as OneNote notebook
Azure Active Directory lookup
User mapping file
Automatically rerun failed tasks up to 4 times
Press Run Now
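For the bulk upload option mentioned above, a short script can generate the CSV for you. The exact column layout is defined in the Microsoft documentation linked earlier; the simple source/destination pairing below is a hypothetical sketch, and the paths and URLs are placeholders:

```python
import csv

# Hypothetical file share sources paired with SharePoint destinations.
# Check Microsoft's documentation for the exact bulk upload column layout
# before feeding this into Migration Manager.
tasks = [
    (r"\\fileserver\shares\hr", "https://contoso.sharepoint.com/sites/hr"),
    (r"\\fileserver\shares\finance", "https://contoso.sharepoint.com/sites/finance"),
]

with open("bulk_tasks.csv", "w", newline="") as f:
    csv.writer(f).writerows(tasks)
```

Generating the file this way is handy when you have dozens of shares to queue up rather than a handful.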
Whether you have performed a pre-scan or migration in Migration Manager, each task once complete will provide a “task report” zip folder that is available to download. Microsoft break the reports down into Summary, Task level and Performance reports, but in fact all reports are included as part of the task report download.
The Microsoft breakdown of each report is very thorough, so I won’t bother adding any more detail, however I will highlight the reports that I found useful when using the Migration Manager:
Summary Report: contains a single row of data that gives the total picture; including total size, number of files migrated, duration
Item Failure Report: contains any errors found resulting in a file being unable or failing to migrate
Item Report R1: detailed report with data on each file within a task. If large number of files migrated, split into separate, sequential reports (R1, R2, R3 etc.)
An interesting aside: on the task details pane for a pre-scan I ran, “files scanned with issues” showed as 0, but within the task report ZIP there were in fact several files and folders listed as having failed for a variety of reasons. So I'm not really sure how accurate the “files scanned with issues” figure is, or what it actually defines as an issue.
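Since the reports are plain CSV files, one way to sanity-check that “files scanned with issues” figure is to tally the failure rows yourself. The column names in this sketch (Path, Message) are assumptions and may not match the actual report schema:

```python
import csv
import io
from collections import Counter

# Stand-in for an Item Failure Report; the Path and Message column names
# are assumptions and may differ from the real report schema.
report = """Path,Message
shares/hr/a.docx,Invalid characters in file name
shares/hr/b.xlsx,File size exceeds limit
shares/hr/c.docx,Invalid characters in file name
"""

def failure_summary(csv_text):
    """Count failure rows per error message to see what needs remediating."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["Message"] for row in reader)

print(failure_summary(report).most_common())
```

Grouping by message like this also gives you a quick remediation checklist per error type.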
Microsoft documentation includes a page on troubleshooting Migration Manager issues and errors here, but during a recent migration I experienced issues that weren't included in it.
Error: the source file share does not exist (but it does)
I received this error recently during a home drive to OneDrive migration. I was using Windows virtual machines with the migration agents installed previously, and had logged into the machine and the SharePoint admin center with accounts holding the relevant permissions and roles, as described here by Microsoft.
When I set up my migration tasks as demo’ed above they were instantly failing. The only errors I would receive were these:
Microsoft have this listed as an agent error message – which was the clue, but their action was to:
Make sure the source file share is an existing network file share. Confirm that the Windows account associated with the agent has read permissions to the file share you want to migrate.
What I found in my case was that even though the migration agents were showing as enabled, because I had originally installed and configured them some time ago (1-2 months prior), they needed to be repaired and re-authenticated:
Download the migration agent setup file
Microsoft documentation hasn’t yet been updated, but to get to the agent download you need to open Migration Manager > Press Agents > + Add
Run the agent setup file
Press continue to reinstall the agent
The installer will then begin the installation prerequisites
Enter account credentials for the SharePoint Admin, then Windows account
The installer will authenticate, then prompt you to test permissions with a file share, press OK to close
Now, when I re-ran my migration task with my agents repaired and re-authenticated, the tasks completed successfully.
Error: failed to duplicate token file due to error ‘There is not enough space on the disk’
This error occurred for me during a pre-scan of file server locations. After kicking the scans off, I returned to find many of them had failed. When looking at the task error details from the logs I found the following:
Error, Failed to duplicate token file due to error ‘There is not enough space on the disk.
This error was pretty much what it says on the tin: the VM had run out of disk space. As referenced before, looking through Microsoft's issues and errors post didn't prove helpful as the error above is not listed.
So, through investigation of my own I found the following:
There are a bunch of MTHost text files that appear to be log files of actions completed when tasks are run within the Migration Manager.
There is a Migration folder; within its MigrationTool subfolder you will find another folder with your tenant name. In here will be folders with names beginning “WF_”, one for each task created within the Migration Manager.
Each “WF_” folder contains a Log and Report folder, with the report folder containing all the migration reports we detailed earlier in this post.
There is also a MigrationToolStorage folder which also contains a mirrored “WF_” series of folders related to tasks created within Migration Manager.
What's crucial to understand here is that you are able to delete the “WF_” folders from these locations without it affecting the tasks in Migration Manager, or the ability to download the corresponding task reports.
I've tested deleting all the “WF_” folders from both of the above locations, then refreshing and downloading the task reports for the tasks in the Migration Manager, and they download perfectly!
What I would question is what the point is of these “WF_” files downloading locally in the first place if they are also stored somewhere in the M365 cloud as well!
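If you want to script the clean-up, a small sketch like the following will remove the per-task “WF_” folders. The example path is an environment-specific assumption, and as noted above you should confirm deleting these folders is safe in your own tenant first:

```python
import shutil
from pathlib import Path

def clean_wf_folders(root):
    """Delete the per-task 'WF_' folders under a migration agent folder
    and return the names removed. Deleting them did not affect my tasks
    or report downloads, but verify in your own environment first."""
    removed = []
    for folder in sorted(Path(root).glob("WF_*")):
        if folder.is_dir():
            shutil.rmtree(folder)
            removed.append(folder.name)
    return removed

# Example path is an assumption; point this at your own tenant folder, e.g.:
# clean_wf_folders(r"C:\Users\svc_migration\AppData\Roaming\Migration\MigrationTool\contoso")
```

Run it against both the MigrationTool and MigrationToolStorage tenant folders to reclaim the disk space.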
You cannot migrate to a deeper destination than first sub-folder
When migrating files from source to destination, in some cases you may want to migrate specific data that sits within a larger, nested folder structure. For example, let's say you have a folder that sits in a file share like this:
You only want to migrate folder B and nothing else except its associated folder structure. In Migration Manager, you are only able to select the first, top-level folder from a library. This applies to OneDrive, SharePoint or Teams, and the tool doesn't offer any other sub-folders to be selected.
If you try to create migration tasks via CSV and include the sub-folders as additional columns in the spreadsheet, the Migration Manager ignores them and just migrates whatever is in the source to the destination library.
I would say generally this isn't an issue, as you can skip the files you do not want to re-migrate and just set the source/destination to be at the top levels. This specific scenario came about because I was trying to remediate some files and folders that failed a prior migration, and once remediated, migrate only those folders via Migration Manager into the corresponding folder structure in SharePoint. It becomes an issue if, like me, you don't want to run another large migration task, because even when it skips most of the files the task will still take a while to complete.
The first thing to say about this is it’s quite literally a click of a button to actually provision the M365 learning pathways solution from the SharePoint look book, but please make sure you do read the prerequisites as that may well catch you out.
The prerequisites mention being a tenant administrator; the account I used was an O365 global administrator. A quick Google search shows that the tenant administrator is now the global admin role.
The solution requires an app catalog site to be created in order to work. To do this you'll need to navigate back to the old SharePoint admin center, or the classic site collection page to be more in line with the official verbiage!
To get there, press More features > under Apps, Open.
Web Site Address suffix: enter your preferred suffix for the app catalog; for example: apps
Administrator: enter your username, and then select the resolve button to resolve the username
Check global admin is app catalog site collection admin
You also need to make sure the provisioning account is a site collection administrator for the app catalog site. To do this, just select your app catalog site, press Owners > Manage Administrators, and make sure the account is listed either as the primary or as one of the other site collection administrators.
Initialize the CustomConfig List & assign owners
Once the learning pathways site has been provisioned, the account used will receive an email to confirm. In the email there will be a link to the CustomConfig list, which needs to be run to set up the site for first use.
If you don't receive an email from the PnP provisioning service, then just navigate to your learning pathways site and add /SitePages/CustomLearningAdmin.aspx to the end of the URL:
Next, you will need to add owners to the learning pathways site. Owners will have admin privileges on the site, but will also be able to hide and show content delivered through the learning pathways web part. In addition, they'll have the ability to build custom playlists and assign them to custom subcategories.
From the SharePoint Settings menu, click Site Permissions.
Click Advanced Permission Settings.
Click Microsoft 365 learning pathways Owners.
Click New > Add Users to this group, and then add the people you want to be Owners.
What I found was if you don’t click the link to initialize the CustomConfig List, all of the learning pathways content that’s delivered from the web part won’t work. I also then tried to go back and open the link to see if the problem would correct itself. It didn’t and the CustomLearningAdmin.aspx page just hung and wouldn’t respond.
What worked for me in the end was to permanently delete the learning pathways site, delete the learning pathways solution from the app catalog site, wait 24 hours, then provision again (this way I could use the same URL).
Naturally this time around I initialized the CustomConfig List from the URL before sharing it!
2. Delete sites from recycle bin in order to provision again
So stemming from issue number 1 above, I also noticed that unless you permanently delete your learning pathways site, you cannot create one with the same name. You will get a message similar to the below:
“Unfortunately your site provisioning at least partially failed!”
To permanently delete a site, all you need to do is delete it from Active sites (if not connected to an O365 group), then under Delete sites select the site and press Permanently delete.
3. Multi factor authentication enabled for the provisioning account
I had an issue where I kept receiving a generic message from the provisioning service page saying:
“Unfortunately your site provisioning at least partially failed!”
The global admin account I used to run the provisioning service had multi factor authentication enabled, more specifically using the authentication app. What I found was that when I changed my 2-step verification from the authentication app to text, the provisioning service completed successfully.
4. App catalog site takes longer than 30 minutes to allow provisioning to complete
If you don’t already have an app catalog created, you will receive an error from the provisioning service similar to this:
“In order to provision the template you need to have an App Catalog in your tenant. Please, create one (for instructions you can read this document: https://go.microsoft.com/fwlink/?linkid=2087251), wait up to 30 minutes, and try again.“
When you then create an app catalog site, I found it took well over 2 hours before the provisioning service recognized it as such.
ID numbers, reference numbers, ticket numbers… this is something that regularly gets requested as part of any SharePoint solution or request-based system. My first thought whenever this is required is “easy, we can just use the built-in SharePoint item ID column”. However, creating a simple calculated column that leverages the in-built ID column is not as easy as it seems.
My first attempt at creating a custom ID column involved creating a new calculated column, appending some text before the ID, and then inserting the ID column into the formula, like this:
The problem with this approach is that when new items are added, the ID appears to “slip”, resulting in the custom ID column having no ID number pulled through from the SharePoint ID column. This is because the item ID is only assigned once the item has been saved, so a calculated column can’t reliably reference it at creation time.
Custom ID column – modern SharePoint
Before you begin, you will naturally need to create either a list or a library in SharePoint, and have the relevant apps available as part of your O365 license.
2. The setup
Create a new column, with the type Number – I called this ‘solIncrementNum‘
Create a new column, with the type Calculated – I called this ‘solReqNum‘, later renamed ‘Request Number’
In the formula field, add the following: ="SOL-00"&[SolIncrementNum]
For the Data Type, select Single line of text
NOTE: for the Request Number formula, if you want to prefix your custom ID with something else, just replace what’s between the “ ” in the formula field above.
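As a rough sketch of what the calculated column formula does: it simply concatenates a fixed text prefix with the value of the number column, so the resulting ID is not fixed-width (SOL-007, SOL-0010, SOL-00100 and so on). The values below are hypothetical examples:

```shell
# The calculated column formula ="SOL-00"&[SolIncrementNum] is plain text
# concatenation; the same logic expressed in shell:
prefix="SOL-00"   # swap this to change your custom ID prefix
num=7             # the value held in the solIncrementNum column
request_number="${prefix}${num}"
echo "$request_number"   # -> SOL-007
```

If you wanted a genuinely fixed-width number, you would need to pad the number itself rather than rely on the prefix.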
3. Build the Flow
Flow action: when a new item is created
Create a new flow from the template “when a new item is created, complete a custom action”
Give your Flow a name, I called mine “Populate Solution Request Number”
In the “when a new item is added” step, make sure the site address and list name are the same as the list you built the custom ID column for earlier
Flow: update item
Press + New step, start typing “update item”, select the update item action from the selection
Select the site in question, then copy and paste the List Name from the previous action
Make sure this action has the following fields set:
NOTE: make sure that when you set these fields, that the values you use are coming from the “when a new item is created” action.
Now when new items are created within the list or library, the flow will fire and create a new request number.
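The end-to-end logic of the Flow plus the calculated column can be sketched like this, using hypothetical values (42 standing in for whatever ID SharePoint assigns):

```shell
# When an item is created, the Flow copies the built-in SharePoint ID into
# the solIncrementNum number column; the calculated column then builds the
# request number from it.
sharepoint_id=42                            # assigned by SharePoint on creation
solIncrementNum=$sharepoint_id              # set by the "update item" Flow action
request_number="SOL-00${solIncrementNum}"   # the calculated column formula
echo "$request_number"                      # -> SOL-0042
```

Because the Flow only runs after the item is saved, the ID is guaranteed to exist by the time it is copied, which is what avoids the “slipping” problem described earlier.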
(this post was written using a SharePoint online environment and SharePoint Designer 2013)
So here’s the scenario, there is a central list where users add items and once submitted a workflow runs that assigns tasks to separate task lists. No big deal right?
Task actions in SharePoint 2013 workflows are a pretty standard thing; the example above just assigns tasks to different lists (think HR, IT, Pensions) for work to be completed. The additional requirement I had was for these tasks to not send any system-generated assignment emails when the tasks are assigned.
This one really had me pulling my hair out, but in a nutshell there is no obvious way to turn on/off the system-generated emails at the point a task is assigned.
When you double-click on a task, an “Assign a Task” window opens. Within this window there is an “Email options” drop down, but this only has the email editor for the task creation email and the ability to turn on/ off the task overdue email(s). It doesn’t have any settings for switching on/ off the initial emails themselves.
The hidden task properties pane
So at this point I began thinking there is no way to do this and the design for this process is now fundamentally flawed…until I right-clicked!
If you right-click on the list within the assign task action (the ICT Task List Members bit underlined in the example below) a menu will appear. Normally this menu contains some simple options like moving an action up or down. However, with task actions there is an additional option called “properties”.
After clicking on the “Properties” button, you’ll find an additional “Assign a Task Properties” window which contains the following, hidden properties:
PreserveIncompleteTasks: set to true if you want incomplete tasks to be kept, rather than deleted, when the task process completes.
WaiveAssignmentEmail: set to true to stop an email being sent to the assignee when a task is created.
WaiveCancelationEmail: set to true to stop an email being sent to the assignee when a task is canceled.
By default, all of these properties are set to “no” or “false”, so the emails described above will be sent. To change this, just click on the drop-down next to each option and update it to “yes” or “true” and the emails will stop sending!
This is a take on a previous post of mine on creating an A–Z page in classic SharePoint, but this time using modern SharePoint pages. If you want to take a look at the classic SharePoint example, please click on the link below.
I’ve been suitably inspired by Andrew Warland’s fantastic two-part series documenting his approach and migration to SharePoint Online, so much so that I thought it would be a fun series to write about my own experiences.
It isn’t my intention to necessarily document Microsoft best practice in this series, rather just to explore some of the challenges, successes and experiences I notice along the way.
The current situation
My organisation has recently made the decision to move to the cloud, with O365 being the naturally preferred destination. SharePoint has been well embedded, and heavily used within the business for several years, with on-premises SharePoint 2010 currently in production.
Finally, in terms of the SharePoint architecture and data volume, there are only three web applications to merge together as part of the migration effort. However, there are several site collections within our main intranet web app, plus many sub-sites nested within them, meaning the huge database sizes behind these site collections could prove difficult come migration time.
A note on the new, flat structure
Our current environment has a well established top-down structure in place that is generally consistent across the environment.
Having already made the investment in ShareGate, this will be the tool of choice for the migration. In the version 11.0 release of ShareGate, a new restructure option allows you to promote sub-sites to top-level sites after the initial migration from the source SharePoint environment.
Considerations for a successful migration plan
One of the biggest issues to be resolved before we can start any sort of migration activity, is the fact that we have several content databases well over the 200GB recommended general use size limit.
Microsoft best practice suggests that any environment that has site collections, sites, content databases, libraries or lists that exceed the software boundaries and limits should be remediated prior to any migration activity. In this case, the main idea is to split each content database that exceeds 200GB into separate content DBs, and where necessary, move or promote sub-sites to site collections and attach new DBs.
Armed with the knowledge of the recent restructure functionality coming to ShareGate, plus my own personal feeling that any remediation activities to our current environment may in and of themselves carry adverse risk to the estate, we proposed a different approach.
With all the reporting capabilities at our disposal via ShareGate, I was able to get a firm grasp of what resides within each site collection in our environment, in terms of:
The size of each sub-site underneath the top-level
Number/ size of libraries and lists
Number of items in each of the above
Any workflows running in any of the above
From this I ran a trial migration of a sub-site from SharePoint 2010 to a newly created team site in SharePoint Online.
Before I kicked off the migration, I ran the source analysis tool within the Migration > Plan section of ShareGate. I noted the following observations:
The source analysis within “migration” in the ShareGate tool, although listed as only being able to analyze up to SharePoint 2013, does in fact work for 2010
The source analysis cannot run at the sub-site level, meaning that you need to run it at the site collection level then just filter down to the sub-site in question through the report itself
Source analysis gives you a report of all checked-out files within a source site. From this, I created a simple view within each of the libraries that contained checked-out files to send to the site owners for action
The trial migration completed successfully as expected, however there were several interesting results I noted:
1. Everyone receives a welcome email
If you migrate the permissions, once the source permission groups migrate, each user will receive a welcome email to the new SharePoint Online site.
2. Publishing sites are the trickiest to migrate
Publishing sites seem to be the trickiest to migrate, especially those with custom master pages or page layouts. When migrating publishing sites, the Pages library is migrated wholesale, meaning the content won’t reside in the SitePages library (where new client-side pages are located).
3. Un-editable modern homepage
After the migration had completed, the new team site homepage threw up an error every time you tried to edit it.
I tried some of the documented resolution steps found here, but none of them worked for me. My solution was to just create a new page to replace the broken homepage, add all the relevant webparts and make this one the new default homepage.
Transforming classic publishing site pages to client-side pages
Publishing site pages will all be migrated as classic SharePoint pages, without the modern look and feel of a client-side page. My understanding is that publishing pages with custom page layouts, additional metadata or custom content types will need to be transformed via PowerShell with a custom mapping file.
(I’m planning on writing a separate blog post walking through an advanced publishing page transformation in the near future)
It’s also worth considering that the release notes for ShareGate 11.0 mention they are researching the ability to transform classic pages to modern, so that could well simplify this process in a future release.
Overall, I was happy with our trial migration and believe it is a viable approach for us to move from on-prem to O365. Some lessons learned for me would be to consider a SharePoint permissions audit prior to migration to remove any unnecessary permissions, and to send site owners an inventory of their content as well as their checked-out files, all in the name of reducing the migration effort.
This will be an ongoing series of posts, in which I’ll focus more on the nitty-gritty of the migration effort than anything else, but as always if there is any feedback or suggestions on how to improve this site, please let me know!
A while ago I did a short series on how to provision and deploy the SharePoint Starter Kit. From this, I thought it would be fun to detail how to take one of the many great spfx web part samples available in the SharePoint GitHub repository and go through the steps involved to deploy it end to end.
As with my last series on the SharePoint Starter Kit, and generally with all my posts, my aim is to simplify and detail every step involved to show that you can get these things working without huge effort or developer expertise. I am not a developer by any stretch of the imagination, so hopefully putting all the pieces together is useful 🙂
spfx client-side web parts
All the samples are available in the sp-dev-fx-webparts repository on GitHub from the link below. For the purpose of this example, I am deploying the Modern Experience Theme Manager web part, as I wanted to test out how easy it makes applying, removing and updating custom themes (it does).
Before you begin, note that it’s much easier to clone the repository from GitHub than to download and extract a zip file. I also couldn’t get the web part to open from the localhost workbench when I manually downloaded the code.
NOTE: I originally wrote this walkthrough a while back; when I came to finish it up I tried to update my cloned repo but it just wouldn’t work. I removed the repo from GitHub Desktop, deleted all the files from the /sp-dev-fx-webparts/ sub-folder and cloned it from scratch, and it worked.
Make sure a developer certificate is installed
If you followed my SharePoint Starter Kit series you can skip this step, as you’ll have already done it, but if not, run the following to install a dev certificate: gulp trust-dev-cert
Step 1: get the webpart working locally
Clone repository/ run web part on localhost
Now we need to clone and build the repository to start using the webpart sample. Clone the repository by following the steps below:
Open a command prompt, then navigate to the samples folder, then into the folder of the web part sample you wish to use, for example:
Run npm install to install the npm packages needed to build and run the client-side project
Run gulp serve to preview your web part in the SharePoint Workbench
Your browser should now open and a localhost version of SharePoint Workbench will allow you to add the web part and access the properties:
NOTE: I tried this using IE at first and it didn’t open at all, so I just switched to another browser and the workbench loaded just fine.
Test the web part in your tenant workbench
Copy the URL of the localhost workbench, open a new browser window and paste the URL, but update it to reflect your SharePoint tenant, for example:
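As a hedged sketch (using the hypothetical tenant name “contoso”): gulp serve opens the local workbench at a localhost URL, while the hosted workbench on your tenant typically lives under /_layouts/workbench.aspx:

```shell
# Local workbench URL opened by gulp serve:
local_workbench="https://localhost:4321/temp/workbench.html"
# Hosted workbench on the tenant (replace "contoso" with your tenant name):
hosted_workbench="https://contoso.sharepoint.com/_layouts/workbench.aspx"
echo "$hosted_workbench"
```

The hosted workbench loads your tenant’s real context, so it’s a useful check before packaging and deploying the web part for real.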