---
title: "Customize address bar shortcuts for Microsoft Edge"
ms.author: davidedwards
author: dawholl
manager: kellis
ms.audience: Admin
ms.topic: article
ms.service: mssearch
localization_priority: Normal
ms.date: 03/15/2022
search.appverid:
- BFB160
- MET150
- MOE150
description: "Add custom Microsoft Edge shortcuts for Microsoft Search in Bing or turn off these shortcuts for your organization"
---

# Customize address bar shortcuts for Microsoft Edge

Help your users stay focused and find work results faster when searching from the Microsoft Edge address bar. Two shortcuts are enabled by default: *work* and your organization's preferred or shortened name. In the Microsoft Edge address bar, users can type a keyword and then press the Tab key. The address bar indicates that they're searching within your organization. When they type their search terms and press the Enter key, they see a search results page with relevant answers and results. You can add two custom shortcut keywords.

:::image type="content" alt-text="Animated GIF of entering work keyword and using Microsoft Edge shortcut to search work." source="mediaedge-shortcutsmicrosoft-edge-address-bar-shortcut.gif" lightbox="mediaedge-shortcutsmicrosoft-edge-address-bar-shortcut.gif":::

> [!NOTE]
> This article applies to Microsoft Edge version 96 or later.

## Manage shortcuts and keywords

1. In the Microsoft 365 admin center, go to **Configurations**.
2. Under **Microsoft Search in Bing shortcut**, select **Change**.
3. In the panel, **Enable the Microsoft Search in Bing shortcut** is selected by default. To disable these shortcuts, clear the check box.
4. In the **Search keywords** field, enter one or two more keywords. You can include spaces and special characters.
5. Select **Save**.

## Frequently asked questions

**Q: The keywords don't work. What's wrong?**

A: In the Microsoft Edge address bar, enter `edge://settings/search` to go to your search settings. Verify that **Show me search and site suggestions using my typed characters** is enabled. You can also use Microsoft Edge group policy to enable search suggestions. To learn more, see SearchSuggestEnabled.

**Q: Do these shortcuts only support English keywords?**

A: No. For localized keywords, you'll need to add the language-specific keyword in the **Search keywords** field.

**Q: How long does it take for new keywords to be recognized as shortcuts?**

A: It takes up to two days for Microsoft Edge to recognize custom keywords as shortcuts.

**Q: Can I add shortcuts for my Google Chrome users?**

A: Not through the Microsoft 365 admin center. Users can create their own shortcuts in Chrome by going to the **Manage search engines** settings and, under **Other search engines**, adding a site name, keyword, and query URL.

**Q: Can I use these same keywords with Windows Search?**

A: No. Only Microsoft Edge supports these keyword shortcuts.
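For admins who prefer scripting over the Group Policy editor, the SearchSuggestEnabled policy maps to a registry value under the Edge policy key. The following is a minimal sketch, assuming an elevated PowerShell session on a managed Windows device; in production, you'd typically deploy this through Group Policy or Intune instead:

```powershell
# Enable Edge search suggestions via the SearchSuggestEnabled policy.
# Assumes an elevated session on a managed Windows device (sketch only).
$path = "HKLM:\SOFTWARE\Policies\Microsoft\Edge"
if (-not (Test-Path $path)) {
    New-Item -Path $path -Force | Out-Null
}
# DWORD 1 = search suggestions enabled
New-ItemProperty -Path $path -Name "SearchSuggestEnabled" -Value 1 -PropertyType DWord -Force
```

After the policy applies, the **Show me search and site suggestions using my typed characters** setting is locked on for users, so keyword shortcuts can resolve.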
---
title: "Monitor Microsoft Graph connectors for Microsoft Search"
ms.author: mecampos
author: monaray97
manager: mnirkhe
audience: Admin
ms.audience: Admin
ms.topic: article
ms.service: mssearch
ms.localizationpriority: medium
search.appverid:
- BFB160
- MET150
- MOE150
description: "Monitor your connection state and index quota utilization."
ms.date: 10/08/2019
---

# Monitor your connections

To access and manage your Microsoft Graph connectors, you must be designated as a search administrator for your organization. Contact your administrator to assign you the search administrator role.

## Connection operations

In the Microsoft 365 admin center, go to the **Connectors** tab. For each connector type, the Microsoft 365 admin center supports the operations shown in the following table.

| Operation | Connectors by Microsoft | Connectors by partners |
| --- | --- | --- |
| Add a connection | :heavy_check_mark: (see Setup overview) | :x: (refer to your partner or custom-built connector admin UX) |
| Delete a connection | :heavy_check_mark: | :heavy_check_mark: |
| Edit a published connection | :heavy_check_mark: Name and description<br>:heavy_check_mark: Connection settings<br>:heavy_check_mark: Property labels<br>:heavy_check_mark: Schema<br>:heavy_check_mark: Refresh schedule | :heavy_check_mark: Name<br>:heavy_check_mark: Description |
| Edit a draft connection | :heavy_check_mark: | :x: |

## Monitor your connection state

After you create a connection, the number of processed items shows on the **Connectors** tab on the Microsoft Search page. After the initial full crawl completes successfully, the progress for periodic incremental crawls is displayed. This page provides information about the connector's day-to-day operations and an overview of the logs and error history. Five states show up in the **State** column for each connection:

- **Syncing.** The connector is crawling the data from the source to index the existing items and make any updates.
- **Ready.** The connection is ready, and there's no active crawl running against it. Last sync time indicates when the last successful crawl happened. The connection is as fresh as the last sync time.
- **Paused.** The crawls have been paused by an admin through the pause option. The next crawl runs only when it's manually resumed. However, the data from this connection continues to be searchable.
- **Failed.** The connection had a critical failure. This error requires manual intervention; the admin needs to take appropriate action based on the error message shown. Data that was indexed before the error occurred remains searchable. The next section describes how to get notified when such failures happen in a connection.
- **Delete Failed.** The deletion of the connection failed. Depending on the failure reason, the data might still be indexed, item quota might still be consumed, and crawls might still run for the connection. We recommend that you try deleting the connection again when it's in this state.

## Notifications for permanent crawl failures in your connections

Connection crawls are scheduled to run at specific times, and they can fail because of issues in the connections. Sometimes these issues are temporary and the crawls resume automatically; sometimes the failures are permanent and admin intervention is needed to restart the crawls. In such cases of permanent failure, the connection is marked as **Failed** and notifications are sent to the Service Health Dashboard under the section **Issues for your organization to act on**.

:::image type="content" alt-text="Screenshot that shows issues in your environment section of Service health." source="mediamanage-connectorshd-notification-home.png" lightbox="mediamanage-connectorshd-notification-home.png":::

The same information also appears as an advisory in the **Service status** section of the Service health page, under the Microsoft 365 suite category.

:::image type="content" alt-text="Screenshot that shows service status section" source="mediamanage-connectornotification-service-status.png" lightbox="mediamanage-connectornotification-bar-mac.png":::

If there are active notifications, admins get alerts in the form of notification bars on the Microsoft admin center home page. Notification bars contain the connection ID of the connection whose crawls have failed. Admins can navigate to see more details of the notifications or remove the notification bars from the page.

:::image type="content" alt-text="Screenshot that shows sample notification bar" source="mediamanage-connectornotification-bar-mac.png" lightbox="mediamanage-connectornotification-bar-mac.png":::

Admins can check the notification details by clicking the notification.

:::image type="content" alt-text="Screenshot that shows sample notification" source="mediamanage-connectorsample-notification.png" lightbox="mediamanage-connectorsample-notification.png":::

Some points to note:

- The notification stays live in the Service Health Dashboard for six days. After that, it's automatically moved to the **Issue history** section, where it's stored for a maximum of 30 days.
- If the connection resumes the crawl, the notification is automatically moved to the **Issue history** section.
- No new notification is sent for the same connection until the crawls on that connection restart. Once the crawls are restarted, if a failure happens again, a new notification is sent.
- If there are crawl failures in multiple connections, each connection has a separate notification bar on the admin center home page and the Service Health Dashboard landing page.

### Subscribe to notifications by email

To get these failure notifications and updates by email, admins can add up to two email addresses:

1. Go to the **Customize** section of the Service health page and open the **Email** tab.
2. Select the check box for **Issues in your environment that require action**.
3. In the **Include these services** section, select **Microsoft 365 suite**. Admins get all notifications for the Microsoft 365 suite, including Graph connector notifications, after they subscribe to the service health notifications.
4. Select **Save**.

:::image type="content" alt-text="Screenshot that shows e-mail subscription for notifications" source="mediamanage-connectornotification-mail.png" lightbox="mediamanage-connectoron-demand-crawl.png":::

## Manage crawls in your connections

During connection creation or the edit connection flow, you can configure the crawl schedule through **Refresh settings**. To learn more about the different types of crawls available, see Setup overview. Apart from the scheduled crawls, you can run on-demand crawls for your connection through the connection pane.

:::image type="content" alt-text="Screenshot that shows on-demand crawl connection pane." source="mediamanage-connectoron-demand-crawl.png" lightbox="mediamanage-connectoron-demand-crawl.png":::

An on-demand crawl lets you start a crawl irrespective of the crawl schedule. You can choose to run a full or incremental crawl using the drop-down, as shown in the image:

:::image type="content" alt-text="Screenshot that shows on-demand crawl drop-down." source="mediamanage-connectoron-demand-dropdown.png" lightbox="mediamanage-connectoron-demand-dropdown.png":::

> [!NOTE]
> - The Graph Connector Agent supports on-demand crawls only from version 2.1.0.0 onwards.
> - Only one category of crawl, scheduled or on-demand, can run on a connection at any time. If a connection is in the **Syncing** state, on-demand crawls are disabled. Scheduled crawls are triggered automatically.
> - If a scheduled or an on-demand crawl continues beyond the scheduled time of the next full or incremental crawl, the ongoing crawl is stopped, and the next scheduled crawl is skipped and queued. After the ongoing crawl completes, the crawl of the opposite type (full or incremental) is picked from the skipped queue and triggered. For example, if the previous crawl was a full crawl, only the incremental crawl, if present in the skipped queue, is triggered, and vice versa.

## Monitor your index quota utilization

The available index quota and its consumption are displayed on the connectors landing page.

:::image type="content" alt-text="Screenshot that shows index quota utilization bar." source="mediamanage-connectorquota-exceeded.png" lightbox="mediamanage-connectorquota-exceeded.png":::

The quota utilization bar indicates various states based on your organization's quota consumption:

| State | Quota utilization level |
| --- | --- |
| Normal | 0–79% |
| High | 80–89% |
| Critical | 90–99% |
| Quota exceeded | >=100% |

The number of items indexed is also displayed with each connection. The number of items indexed by each connection counts against the total quota available for your organization. When the index quota is exceeded for your organization, all active connections are impacted and operate in a limit-exceeded state. In this state, your active connections:

- Can't add new items.
- Can still update or delete existing items.

To fix this, you can take any of the following actions:

- Purchase index quota for your organization. To learn more, see Licensing requirements and pricing.
- Identify connections that indexed items you didn't want to index. To update such a connection, you must delete it and create a new connection with a data source exclusion filter that excludes the items you no longer want to index.
- Permanently delete one or more connections.
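Beyond the admin center, connection health can also be checked programmatically. As a sketch, the Microsoft Graph connections API exposes each connection's `state` (which reports `limitExceeded` when quota is exhausted); `$token` is a placeholder for an access token you've already acquired with the `ExternalConnection.Read.All` permission:

```powershell
# Sketch: list Graph connector connections and their states via Microsoft Graph.
# Assumes $token holds a valid OAuth access token (placeholder).
$headers = @{ Authorization = "Bearer $token" }
$resp = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/external/connections" -Headers $headers
# state is one of: draft, ready, obsolete, limitExceeded
$resp.value | Select-Object id, name, state
```

Polling this endpoint is one way to feed connection state into your own monitoring, alongside the Service Health Dashboard notifications described above.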
---
title: Allow or prevent custom script
ms.reviewer: lucaband
ms.author: ruihu
author: maggierui
manager: jtremper
recommendations: true
ms.date: 05/14/2024
audience: Admin
f1.keywords:
- CSH
ms.topic: article
ms.custom:
- 'O365M_NoScript'
- 'O365E_NoScript'
- 'seo-marvel-apr2020'
- admindeeplinkSPO
ms.service: sharepoint-online
ms.localizationpriority: medium
ms.collection:
- Strat_SP_admin
- M365-collaboration
search.appverid:
- SPO160
- ODB160
- ODB150
- BSA160
- MET150
ms.assetid: 1f2c515f-5d7e-448a-9fd7-835da935584f
description: Learn how global and SharePoint admins can change the custom script setting for SharePoint sites in the organization.
---

# Allow or prevent custom script

As a Global Administrator or SharePoint Administrator in Microsoft 365, you can allow custom script as a way of letting users change the look, feel, and behavior of sites and pages to meet organizational objectives or individual needs. If you allow custom script, all users who have Add and Customize Pages permission to a site or page can add any script they want. (By default, users who create sites are site owners and therefore have this permission.)

> [!NOTE]
> For simple ways to change the look and feel of a site, see Change the look of your SharePoint site.

By default, script is not allowed on most sites that admins create using the SharePoint admin center, or on any sites created using the New-SPOSite PowerShell command. The same applies to OneDrive, sites users create themselves, modern team and communication sites, and the root site for your organization. For more info about the security implications of custom script, see Security considerations of allowing custom script.

> [!IMPORTANT]
> If SharePoint was set up for your organization before 2015, your custom script settings might still be set to **Not Configured**, even though in the SharePoint admin center they appear to be set to prevent users from running custom script. In this case, users won't be able to copy items between SharePoint sites or between OneDrive and SharePoint. On the **Settings** page in the SharePoint admin center, select **OK** to accept the custom script settings as they appear and enable cross-site copying. For more info about copying items between OneDrive and SharePoint, see Copy files and folders between OneDrive and SharePoint sites.

## To allow custom script on OneDrive or user-created sites

> [!NOTE]
> This feature will be removed during H1 calendar year 2024. Once removed, it will no longer be possible to allow custom script on OneDrive sites.

In the SharePoint admin center, you can choose to allow users to run custom script on OneDrive (referred to as personal sites) or on all classic team sites they create. For info about letting users create their own sites, see Manage site creation in SharePoint.

> [!CAUTION]
> Before you allow custom script on sites in your organization, make sure you understand the security implications.

1. Go to **Settings** in the SharePoint admin center, and sign in with an account that has admin permissions for your organization.

   > [!NOTE]
   > If you have Office 365 operated by 21Vianet (China), sign in to the Microsoft 365 admin center, then browse to the SharePoint admin center and open the **Settings** page.

2. At the bottom of the page, select **classic settings page**.
3. Under **Custom Script**, select:

   - **Allow users to run custom script on personal sites**.
   - **Allow users to run custom script on self-service created sites**.

   :::image type="content" alt-text="Screenshot of custom script section of settings page in SharePoint admin center." source="mediaa96d5c23-6389-4343-81cb-7f055617f6e8.png" lightbox="mediaa96d5c23-6389-4343-81cb-7f055617f6e8.png":::

   > [!NOTE]
   > Because self-service site creation points to your organization's root site by default, changing the Custom Script setting allows custom script on your organization's root site. For info about changing where sites are created, see Manage site creation in SharePoint.

4. Select **OK**. It can take up to 24 hours for the change to take effect.

## To allow custom script on other SharePoint sites

> [!CAUTION]
> Before you allow custom script on sites in your organization, make sure you understand the security implications.

To allow custom script on a particular site (previously called a site collection) immediately, follow these steps:

1. Download the latest SharePoint Online Management Shell.

   > [!NOTE]
   > If you installed a previous version of the SharePoint Online Management Shell, go to Add or remove programs and uninstall "SharePoint Online Management Shell."

2. Connect to SharePoint as a Global Administrator or SharePoint Administrator in Microsoft 365. To learn how, see Getting started with SharePoint Online Management Shell.
3. Run the following command:

   ```powershell
   Set-SPOSite <SiteURL> -DenyAddAndCustomizePages 0
   ```

   Or use the PnP.PowerShell cmdlet Set-PnPSite:

   ```powershell
   Set-PnPSite -Identity <SiteURL> -NoScriptSite $false
   ```

If you change this setting for a classic team site, it will be overridden by the Custom Script setting in the admin center within 24 hours.

> [!NOTE]
> You can't allow or prevent custom script on an individual user's OneDrive.

## Manage custom script from the SharePoint admin center

> [!NOTE]
> If you don't see the new options in the SharePoint admin center, the feature isn't enabled in your tenant yet. Every customer will have this new set of capabilities enabled by the end of June 2024.

Tenant administrators have a set of tools available in SharePoint tenant administration to manage custom script within their organization. Specifically, tenant administrators can:

- Verify custom script status
- Change custom script settings
- Persist custom script settings

### Verify custom script status

A new **Custom script** column is now available on the **Active sites** page under **Sites**.

:::image type="content" alt-text="Screenshot of active sites view with custom script column visible." source="media232a2283-7f38-4f77-b32d-e076bbcbbb01.png" lightbox="media232a2283-7f38-4f77-b32d-e076bbcbbb01.png":::

The column can be added to any view. A new **Custom script allowed sites** view is also available to provide easy access to all the sites where custom script is enabled:

:::image type="content" alt-text="Screenshot of the list of default views, which includes the 'custom script allowed sites' view." source="mediae19f29a8-601a-416a-b8fd-2f128461b52c.png":::

### Change custom script settings

On the **Active sites** page, when you select a site, a **Custom scripts** setting is available for administrators under **Settings**:

:::image type="content" alt-text="Screenshot of the 'Custom scripts' setting." source="media7a9c6b79-db8b-4577-9a8c-978f011196a9.png":::

Administrators can control the custom script setting for a specific site, deciding whether to allow or block custom script on that site:

:::image type="content" alt-text="Screenshot of 'Custom scripts' values." source="media05b24a6e-7dec-4b50-80e8-f09fe18e7dd4.png" lightbox="media05b24a6e-7dec-4b50-80e8-f09fe18e7dd4.png":::

By default, any change to the custom script setting for a specific site lasts for a maximum of 24 hours. After that time, the setting resets to its original value for that site.

### Persist custom script settings

To prevent SharePoint from resetting custom script settings to their original values across the whole tenant, follow these steps:

1. Download the latest SharePoint Online Management Shell.

   > [!NOTE]
   > If you installed a previous version of the SharePoint Online Management Shell, go to Add or remove programs and uninstall "SharePoint Online Management Shell."

2. Connect to SharePoint as a Global Administrator or SharePoint Administrator in Microsoft 365. To learn how, see Getting started with SharePoint Online Management Shell.
3. Run the following command:

   ```powershell
   Set-SPOTenant -DelayDenyAddAndCustomizePagesEnforcement $True
   ```

> [!NOTE]
> - This setting affects all sites. There are no options to preserve changes to custom script settings on only some specific sites.
> - This parameter will be available until November 2024. After that date, it will no longer be possible to prevent SharePoint from resetting custom script settings to their original values for all sites.
> - Where Multi-Geo capabilities on OneDrive and SharePoint are configured, running the command affects only the geo from which you ran it. To persist custom script settings across the entire tenant, you must run the command in each geo.

## Features affected when custom script is blocked

When users are prevented from running custom script on OneDrive or on the classic team sites they create, site admins and owners can't create new items such as templates, solutions, themes, and help file collections. If you allowed custom script in the past, items that were already created still work.

The following site settings are unavailable when users are prevented from running custom script:

| Site feature | Behavior | Notes |
| :----- | :----- | :----- |
| Save site as template | No longer available in Site Settings | Users can still build sites from templates created before custom script was blocked. |
| Save document library as template | No longer available in Library Settings | Users can still build document libraries from templates created before custom script was blocked. |
| Save list as template | No longer available in List Settings | Users can still build lists from templates created before custom script was blocked. |
| Solution Gallery | No longer available in Site Settings | Users can still use solutions created before custom script was blocked. |
| Theme Gallery | No longer available in Site Settings | Users can still use themes created before custom script was blocked. |
| Help Settings | No longer available in Site Settings | Users can still access help file collections available before custom script was blocked. |
| Sandbox solutions | Solution Gallery is no longer available in Site Settings | Users can't add, manage, or upgrade sandbox solutions. They can still run sandbox solutions that were deployed before custom script was blocked. |
| SharePoint Designer | Pages that are not HTML can no longer be updated.<br>Handling List: **Create Form** and **Custom Action** no longer work.<br>Subsites: **New Subsite** and **Delete Site** redirect to the Site Settings page in the browser.<br>Data Sources: the **Properties** button is no longer available. | Users can still open some data sources. To open a site that doesn't allow custom script in SharePoint Designer, you must first open a site that does allow custom script. |
| Uploading files that potentially include script | The following file types can no longer be uploaded to a library: .asmx, .ascx, .aspx, .htc, .jar, .master, .swf, .xap, .xsf | Existing files in the library aren't impacted. |
| Uploading documents to content types | Access denied message when attempting to attach a document template to a content type. | We recommend using document library document templates instead. |
| Publishing of SharePoint 2010 workflows | Access denied message when attempting to publish a SharePoint 2010 workflow. | |

The following web parts and features are unavailable to site admins and owners when you prevent them from running custom script.

| Web part category | Web part |
| :----- | :----- |
| Business Data | Business Data Actions<br>Business Data Item<br>Business Data Item Builder<br>Business Data List<br>Business Data Related List<br>Excel Web Access<br>Indicator Details<br>Status List<br>Visio Web Access |
| Community | About This Community<br>Join<br>My Membership<br>Tools<br>What's Happening |
| Content Rollup | Categories<br>Project Summary<br>Relevant Documents<br>RSS Viewer<br>Site Aggregator<br>Sites in Category<br>Term Property<br>Timeline<br>WSRP Viewer<br>XML Viewer |
| Document Sets | Document Set Contents<br>Document Set Properties |
| Advanced | Embed |
| Forms | HTML Form Web Part |
| Media and Content | Content Editor<br>Script Editor<br>Silverlight Web Part |
| Search | Refinement<br>Search Box<br>Search Navigation<br>Search Results |
| Search-Driven Content | Catalog-Item Reuse |
| Social Collaboration | Contact Details<br>Note Board<br>Organization Browser<br>Site Feed<br>Tag Cloud<br>User Tasks |
| Master Page Gallery | Can't create or edit master pages |
| Publishing Sites | Can't create or edit master pages and page layouts |

## Best practice for communicating script setting changes to users

Before you prevent custom script on sites where you previously allowed it, we recommend communicating the change well in advance so users can understand its impact. Otherwise, users who are accustomed to changing themes or adding web parts on their sites will suddenly be unable to do so and will see the following error message.

:::image type="content" alt-text="Screenshot of the Error message that's displayed when scripting is disabled on a site." source="media1c7666a0-9538-484f-a691-6e424c5db71a.png":::

Communicating the change in advance can reduce user frustration and support calls.
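Before changing a site's setting, it's often useful to check its current state. A minimal sketch, using Get-SPOSite (the admin center and site URLs are placeholders for your tenant):

```powershell
# Sketch: check whether custom script is currently blocked on a site.
# URLs are placeholders; replace with your tenant's admin center and site.
Connect-SPOService -Url https://contoso-admin.sharepoint.com
# DenyAddAndCustomizePages 'Enabled' = custom script blocked;
# 'Disabled' = custom script allowed.
Get-SPOSite -Identity https://contoso.sharepoint.com/sites/marketing |
    Select-Object Url, DenyAddAndCustomizePages
```

Reading the property first makes it easy to confirm whether the admin center's 24-hour reset has reverted a site you previously changed.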
ms.date: 05312024 title: Build Learning and Training Experiences for Employees ms.reviewer: ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-online localization_priority: Priority ms.collection: - Strat_SP_modern - M365-collaboration - m365solution-corpcomms - m365solution-scenario - highpri - best-practices search.appverid: - SPO160 - MET150 description: "Learn how to build a learning and training experience for employees using Microsoft 365" Build learning and training experiences for employees Learn how to build a training and learning experience that will keep employees and team members up to date with important skills and proficiencies required for professional success. This article will show you how to create a training solution that includes a durable training site, online learning, virtual training events, promotion, and tracking insights and feedback to improve your experience overtime. 
Create a learning and training experience by: Taking inventory of all organizational needs, learning objectives and goals, and identifying important timelines Choosing the best tools that fit your organization's learning and training goals and objectives Keeping employees up to date about important learning and training due dates and opportunities Launching an accessible and scalable experience by ensuring employees know how to access modules, courses, and resources and asking for feedback to make improvements along the way Overview of creating a training and learning experience for employees PlanBuildLaunch :---::---::---: - Understand your organizational needs - List all learning objectives - Personalize the experience based on role and career stage - Identify timelines - Define goals and outcomes- Choose the best tools for your training and learning experience - Ensure tools and resources can be accessed by everyone - Create a place where employees with similar learning objectives can connect- Set realistic deadlines - Notify employees of upcoming training and learning opportunities and initiatives - Gather insights from each platform used to measure overall effectiveness of training and learning experience Plan a training and learning experience Planning considerations: For some professions and disciplines, federal or state mandates may be in place requiring specific learning and training. Be sure to keep up with these dates to inform your organization's practice. Employees have varying and busy schedules. Be sure to notify employees more than one time about upcoming required learning before the due date to make sure it's top of mind. Depending on the organization, employees may be working across multiple regions and time zones. For live learning events, choose times that accommodate most employees. 
:---::---::---::---: Step 1: Understand your organizational needsStep 2: List and categorize all learning objectivesStep 3: Define goals and outcomesStep 4: Review Microsoft 365 Learning and Training tools Step 1: Understand your organizational needs Take inventory of all organizational departments, teams, and disciplines. Are there any legally required continued learning mandates for any employee groups in your organization? How many? Is there a broad range of disciplines and areas of expertise across your organization? Is there a certain budget that needs to be set and adhered to for learning and training resources? Set aside time to list all factors out to get a full scope of what is needed. Step 2: List and categorize all learning objectives Now that you have listed out your organizational needs, identify the learning objectives for each employee group. Separate the required learning from suggested or recommended learning. Identify timelines for completion based on organizational needs. Establish starting points and finish lines for each module or experience. Step 3: Define goals and outcomes Every organization has its own variation of technical and soft skills that need to be upheld and maintained over time. These skills can range from learning how to use job-related tools and in-house resources to learning about the intricacies of the company's culture and policies. Define what needs to be prioritized as learning goals and outcomes from the employees in your organization, prioritize these goals and begin curating modules and learning materials that best meet these desired goals and outcomes. Step 4: Review Microsoft 365 Learning and Training tools Use Viva Learning to surface all your learning content in one place. 
Bring learning content from Microsoft, your organization's SharePoint, and the learning management systems your organization already uses into Viva Learning so your employees can easily access the learning and training content they need without leaving Microsoft Teams. Curate custom training playlists on SharePoint with Learning Pathways. Bring custom content and Microsoft training content together to create learning and training playlists. Use Learning Pathways web parts with audience targeting to make sure that your employees see the content that's most relevant to them. Create a learning and development landing page using SharePoint. Create a SharePoint communication site using the Learning Central SharePoint Communication site template. Use the site to direct employees to events, news, and information about required learning and extra-curricular resources. Use Microsoft Teams to hold live learning events. Meetings in Teams include audio, video, and screen sharing, and are great for communicating with groups of fewer than 300 individuals. Teams can be used for meetings that happen in person, remotely, or a combination of the two. Additionally, the meeting can be recorded and shared with new employees who were unable to attend the meeting. Provide an opportunity for employees to learn from each other in Viva Engage. Create a Viva Engage channel just for new employees. Choose to create one NEO channel for the entire organization or region or create channels for each new group of new employees. Then, use the Viva Engage web part on the SharePoint NEO site to integrate the conversation with other resources and contacts. Build the learning and training experience Build out the learning and training experience one platform at a time. The tools listed here can be connected to each other, providing a thorough and connected learning and training journey for all employees. 
## Review learning and training tools

### Use Viva Learning to make learning content easily accessible from within Teams

Viva Learning in Microsoft Teams allows users to discover, recommend, and access learning modules from different platforms to help users gain knowledge in any specific focus area. Viva Learning pulls content from LinkedIn Learning, Microsoft Learn, and Microsoft 365 Training. You can also add your organization's own custom content from SharePoint, and integrate Viva Learning with learning management systems and third-party content providers that you already use.

Use Viva Learning to make sure each of your employees has the knowledge they need for organizational needs, team needs, and the tools that they will be working with daily. In Viva Learning, managers of an organization or team can recommend learning and track the learning progress of each module. The Viva Learning home view aggregates a variety of information, including assigned content from learning management systems, recommended learnings, trending content, and learning provider content libraries.

Learn more about Viva Learning.

### Curate and target custom training playlists with Learning Pathways

Learning pathways is a customizable, on-demand learning solution in SharePoint that brings together out-of-the-box Microsoft 365 training playlists and custom playlists created by your organization. Surface your learning pathways playlists on any site in your organization using the Microsoft 365 learning pathways web part, and use audience targeting to make sure your playlists are seen by everyone who needs them.

Get started with learning pathways and easily provision learning pathways to begin using your customizable learning pathways portal.

### Create a learning and development landing page using SharePoint

Create an internal communication site that acts as a home for the learning and training experience. This site should lay out resources, deliverables, and learning objectives in an organized way.
This site should also contain the most up-to-date information to ensure the employee has everything needed to be successful. Try organizing action items on this site in a way that signals priority. List the things that need to be finished first at the top. This will help employees organize their learning and training to the best of their ability. Microsoft offers a new customizable Learning Central SharePoint communication site template that can help get you started. Get started with SharePoint site templates.

### Use Teams to hold live learning events

Many learning experiences require hands-on training. Some experiences are better held in live environments where employees can ask questions and get guidance in real time. With the emergence of virtual work, having face time with other people in the organization is still important in making sure employees can communicate with learning instructors and other peers and ask top-of-mind questions.

Use Microsoft Teams to administer live learning and training experiences. Make this live event fun and engaging by giving employees an opportunity to communicate with each other in the Teams chat box, or allow employees to ask questions on camera or through the moderated Q&A in Teams live events. Record each session so employees can review the learning material later, or so future employees can experience the session as well. Make this recording available through the SharePoint Learning and Training site or through the Viva Learning platform. Learn more about Teams live events.

### Provide an opportunity for employees to learn from each other in communities on Viva Engage

Give employees that have similar learning objectives the opportunity to connect with each other and build a supportive virtual community in Viva Engage. Viva Engage is a platform that connects leaders, communicators, and employees to build communities, share knowledge, and engage across the organization.
Viva Engage allows you to set up a community specifically for employees within the same or similar disciplines. Naturally, employees come across information at different times in their learning and training process. Creating a space where new employees can build a community for themselves gives them the opportunity to share information as they come across it. In this virtual space, new employees can share resources, share ideas, and get to know each other. It also helps them build internal bonds that can last over the course of their career. Learn more about building communities in Viva Engage.

## Make sure all tools and experiences are accessible

Ensure each learning and training experience is accessible by incorporating accessibility best practices across all relevant Microsoft 365 tools:

- Make sure everything in your learning and training experience is visible and labeled properly. Use alternative text for images and graphics. Incorporate transcripts in videos and training material.
- Share this experience with the right people. Make sure employees know how to access all the tools needed in the learning experience. Use all sharing options, including email, Teams messages, Viva Engage posts, SharePoint web parts, Viva Learning, and Viva Connections, to build awareness and increase accessibility.
- Ensure every employee and instructor has access to all tools and resources. Consider the unique needs of everyone involved in making sure the learning and training experience is successful. Stay informed about the specific time zones of every stakeholder to make sure all timelines and deliverables are feasible for everyone involved.

Lastly, clearly mark the starting point and ending point for each training opportunity. Review your experience to ensure employees know exactly where to start and when the experience has been completed successfully.
This can be done within the experience, or by planning to generate an email through Outlook that lets them know when they have finished each learning objective and if anything else is required of them.

## Launch a training and learning experience

After the learning and training experience has been planned and the tools have been selected, it's time to launch your experience.

### Notify employees of upcoming training

Build awareness of upcoming training by sending an email in Outlook or by adding SharePoint news posts to spread the word across various platforms. Let employees know if the training is required or suggested. List important details about the training, like the due date and the learning objectives. Ensure all employees know how to access the learning module or experience.

### Gather insights to determine success

Gather insights from each platform along the way. Use these insights to inform managers, team leads, and other stakeholders. Look for insights such as live event attendance, audience reach, site traffic, module completion, and more.

Maintain your experience by updating things frequently, providing the most up-to-date and relevant information across all platforms. Learn more about maintaining your SharePoint communication sites and keeping SharePoint news posts updated. Ask for feedback on the learning and training experience often using Microsoft Forms. Use this feedback to determine what needs to be improved or altered for the next learning and training experience.

## More resources

- Corporate communications overview
- Overview of Viva Learning
- Use the SharePoint Learning Central site template
Build learning and training experiences for employees
---
ms.date: 06/10/2022
title: Plan compliance requirements for SharePoint and OneDrive
ms.reviewer:
ms.author: ruihu
author: maggierui
manager: jtremper
recommendations: true
audience: Admin
f1.keywords: NOCSH
ms.topic: concept-article
ms.service: sharepoint-online
ms.localizationpriority: medium
ms.collection: essentials-compliance
ms.custom: intro-get-started
search.appverid: MET150
description: Learn what features are available in Microsoft 365 to help you plan your compliance requirements for SharePoint and OneDrive.
---

# Plan compliance requirements for SharePoint and OneDrive

Most organizations have business or legal requirements that govern how data is used, shared, and retained. Some organizations also have data residency requirements or regulatory requirements that restrict communication between certain users and groups. Microsoft 365 has a wide range of governance and compliance features to address these needs. This article provides an overview of features you may want to consider as part of your OneDrive and SharePoint rollout.

## Data lifecycle management

Use data lifecycle management capabilities in Microsoft Purview to govern your OneDrive and SharePoint content for compliance or regulatory requirements. The following table describes the capabilities that help you keep the content you need and delete what you don't need.

| Capability | What problems does it solve? | Get started |
|:-----------|:-----------------------------|:------------|
| Retention policies and retention labels | Retain or delete content with policy management for SharePoint and OneDrive documents. See Learn about retention for SharePoint and OneDrive. | Create and configure retention policies<br>Create retention labels for exceptions to your retention policies |

## Deleted users' data

When a user leaves your organization and you've deleted that user's account, what happens to the user's data? When considering data retention compliance, determine what needs to happen with the deleted user's data.
For some organizations, retaining deleted user data could be important for continuity and for preventing critical data loss. If a user's Microsoft 365 account is deleted, their OneDrive files are preserved for 30 days. To change this setting, see Set the OneDrive retention for deleted users. By default, when a user is deleted, the user's manager is automatically given access to the user's OneDrive. To change this, see OneDrive retention and deletion.

## Information protection

Microsoft Purview Information Protection capabilities help you discover, classify, and protect sensitive information in OneDrive and SharePoint. The following table describes these capabilities. Consider whether you want to implement any of these capabilities as part of your OneDrive and SharePoint rollout.

| Capability | What problems does it solve? | Get started |
|:-----------|:-----------------------------|:------------|
| Sensitive information types | Identifies sensitive data by using built-in or custom regular expressions or a function. Corroborative evidence includes keywords, confidence levels, and proximity. | Customize a built-in sensitive information type |
| Trainable classifiers | Identifies sensitive data by using examples of the data you're interested in rather than identifying elements in the item (pattern matching). You can use built-in classifiers or train a classifier with your own content. | Get started with trainable classifiers |
| Sensitivity labels | A single solution across apps, services, and devices to label and protect your data as it travels inside and outside your organization. Sensitivity labels can be used to protect files themselves or individual SharePoint sites and teams. | Enable sensitivity labels for Office files in SharePoint and OneDrive<br>Use sensitivity labels to protect content in Microsoft Teams, Microsoft 365 Groups, and SharePoint sites |
| Data loss prevention | Helps prevent unintentional sharing of sensitive items. | Get started with the default DLP policy |

## File sync

The OneDrive sync app has policies that you can use to help you maintain a compliant environment. Consider configuring these policies before you roll out SharePoint and OneDrive.

| Policy | Windows GPO | Mac |
|:-------|:------------|:----|
| Allow syncing OneDrive accounts for only specific organizations | AllowTenantList | AllowTenantList |
| Block syncing OneDrive accounts for specific organizations | BlockTenantList | BlockTenantList |
| Prevent users from syncing libraries and folders shared from other organizations | BlockExternalSync | BlockExternalSync |
| Prevent users from syncing personal OneDrive accounts | DisablePersonalSync | DisablePersonalSync |
| Exclude specific kinds of files from being uploaded | EnableODIgnoreListFromGPO | EnableODIgnore |

## Data residency

Multi-Geo is a Microsoft 365 feature that allows organizations to span their storage over multiple geo locations and specify where to store users' data. For multinational customers with data residency requirements, you can use this feature to ensure that each user's data is stored in the geo location necessary for compliance. For more info about this feature, see Multi-Geo Capabilities in OneDrive and SharePoint.

Features such as file sync and mobile device management work normally in a multi-geo environment. There's no special configuration or management needed. The multi-geo experience for your users has minimal difference from a single-geo configuration. For details, see User experience in a multi-geo environment. For more information about Microsoft 365 Multi-Geo, see Microsoft 365 Multi-Geo.

## Information barriers

Microsoft Purview Information Barriers is a compliance solution that allows you to restrict two-way communication and collaboration between groups and users in Microsoft Teams, SharePoint, and OneDrive. Often used in highly regulated industries, information barriers can help to avoid conflicts of interest and safeguard internal information between users and organizational areas.
When information barrier policies are in place, users who shouldn't communicate or share files with other specific users won't be able to find, select, chat, or call those users. Information barrier policies automatically put checks in place to detect and prevent unauthorized communication and collaboration among defined groups and users. If your business requires information barriers, see Learn about information barriers and Use information barriers with SharePoint to get started.

## Next steps

> [!div class="nextstepaction"]
> Plan sharing and collaboration options

## Related topics

- Plan for SharePoint and OneDrive in Microsoft 365
- B2B Sync
- Implement compliance in Microsoft 365
- Protect your enterprise data using Windows Information Protection (WIP)
- Control OneDrive and SharePoint access based on network authentication or app
---
ms.date: 04/24/2024
title: "Create a hub site in SharePoint"
ms.reviewer: metorres
ms.author: ruihu
author: maggierui
manager: jtremper
recommendations: true
audience: Admin
f1.keywords:
- CSH
ms.topic: article
ms.service: sharepoint-online
ms.localizationpriority: medium
ms.collection:
- Strat_SP_admin
- M365-collaboration
- m365initiative-spsitemanagement
ms.custom:
- seo-marvel-apr2020
- admindeeplinkSPO
search.appverid:
- SPO160
- MET150
ms.assetid: 92bea781-15d8-4bda-805c-e441e2191ff3
description: "In this article, you'll learn how to register a site as a hub site in the SharePoint admin center."
---

# Create a hub site in SharePoint

If you're a Global Administrator or SharePoint Administrator in Microsoft 365, you can convert any existing site to a hub site.

> [!NOTE]
> We recommend selecting a communication site, or a team site that uses the new template. If you use a classic team site, the hub navigation will appear only on modern pages, and hub site settings will only appear on modern pages. Sites that are already associated with another hub can't be converted to a hub site.

You can create up to 2,000 hub sites for an organization. This applies to hub-to-hub associations as well. Any site labeled as a hub site will count against this limit. There is no limit on the number of sites that can be associated with a hub site. When users associate their sites with a hub, it doesn't impact the permissions of either the hub site or the associated sites. It's important to make sure all users you allow to associate sites to the hub have permission to the hub.

## Create a hub site in the new SharePoint admin center

1. Go to Active sites in the SharePoint admin center, and sign in with an account that has admin permissions for your organization.

   > [!NOTE]
   > If you have Office 365 operated by 21Vianet (China), sign in to the Microsoft 365 admin center, then browse to the SharePoint admin center and open the Active sites page.
2. Select the site, select Hub on the command bar, and then select Register as hub site.

   > [!TIP]
   > Using the Hub site menu, you can also associate a site with the hub site, change a site's association to a different hub site, or disassociate a site from a hub site.

3. Enter a display name for the hub site, and specify the individual users or security groups you want to allow to associate sites with the hub.

   > [!IMPORTANT]
   > If you leave the People who can associate sites with this hub box empty, any user can associate their site with the hub. If you later want to change the hub site display name or the list of people who can associate sites with the hub, you need to use PowerShell or go to hub site settings on the hub site.

4. Select Save.

## Related topics

- For info about using a site design that gets applied when sites join the hub, see Set up a site design for your hub site.
- For more info about site designs and site scripts, see SharePoint site design and site script overview.
- To learn how to use Microsoft PowerShell to create and manage hub sites, see Manage SharePoint hub sites.
- For info about how site owners can customize hub sites, see Set up your SharePoint hub site.
- For info about removing a hub site, see Remove a hub site.
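The admin-center steps in this article can also be scripted with the SharePoint Online Management Shell (see Manage SharePoint hub sites, linked above, for full details). The following is a minimal sketch, assuming the Microsoft.Online.SharePoint.PowerShell module is installed and using placeholder tenant and site URLs; it requires a live tenant and admin rights, so treat it as illustrative rather than copy-paste ready:

```powershell
# Connect to the SharePoint admin center (placeholder tenant URL)
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Register an existing communication site as a hub site
Register-SPOHubSite "https://contoso.sharepoint.com/sites/marketing" -Principals $null

# Set the hub's display name, then restrict who can associate sites with it
Set-SPOHubSite "https://contoso.sharepoint.com/sites/marketing" -Title "Marketing hub"
Grant-SPOHubSiteRights "https://contoso.sharepoint.com/sites/marketing" `
    -Principals "megan@contoso.com" -Rights Join
```

Passing `-Principals $null` to `Register-SPOHubSite` leaves association open to everyone, matching the empty "People who can associate sites with this hub" box in the admin center; `Grant-SPOHubSiteRights` then narrows it.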
---
ms.date: 08/17/2018
title: "Difference between classic & modern search experiences - SharePoint"
ms.reviewer:
ms.author: ruihu
author: maggierui
manager: jtremper
recommendations: true
audience: Admin
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-online
ms.collection: M365-collaboration
ms.localizationpriority: medium
search.appverid:
- SPO160
- MET150
ms.custom:
- seo-marvel-apr2020
- admindeeplinkSPO
description: "This article provides an overview of the difference between the classic and modern search experiences in Microsoft SharePoint."
---

# Differences between the classic and modern search experiences in SharePoint

SharePoint in Microsoft 365 has both a classic and a modern search experience. Microsoft Search in SharePoint is the modern search experience. Both search experiences use the same search index to find results. As a search admin, you can't enable or disable either search experience; both are enabled by default. Users get the classic search experience on publishing sites, classic team sites, and in the Search Center. Users get the Microsoft Search experience on the SharePoint start page, hub sites, communication sites, and modern team sites. Learn about classic and modern sites.

The most visible difference is that the Microsoft Search box is placed at the top of SharePoint, in the header bar. Another difference is that Microsoft Search is personal. The results you see are different from what other people see, even when you search for the same words. You'll see results before you start typing in the search box, based on your previous activity and trending content in Microsoft 365, and the results update as you type. Learn more about the Microsoft Search experience for users.

Search admins can customize the classic search experience, but not the Microsoft Search experience. As a search admin, you can tailor Microsoft Search to your organization so it's easy for your users to find often-needed content in your organization.
For example, if your organization has Microsoft Search fully deployed, custom result sources at site collection or tenant level won't affect the search results. The search admin can use Microsoft Search verticals instead. To learn more, see Manage search verticals.

You use the SharePoint admin center to manage classic search and the Microsoft 365 admin center to manage Microsoft Search. Certain aspects of the classic search settings also impact the modern search experience:

- The search schema determines how content is collected in and retrieved from the search index. Because both search experiences use the same search index to find search results, any changes you make to the search schema apply to both experiences.
- The Microsoft Search experience doesn't support changing the sort order of results or building refiners based on metadata. Therefore, the following search schema settings don't affect the Microsoft Search experience:
  - Sortable
  - Refinable
  - Company name extraction (deprecated since November 15, 2019)
- In environments where vertical configuration is available, the modern search experience only shows results from the standard result source (Local SharePoint Results). To learn more, see Manage search verticals. In environments where vertical configuration is not available, the modern search experience only shows results from the default result source. If you change the default result source, both modern and classic search experiences are impacted.
- Depending on the search scenario, some Microsoft Search features might not work if the classic global Search Center URL is not set to point to the URL of the default classic Search Center. Depending on your tenant, this URL is "yourcompanyname.sharepoint.com/search" or "yourcompanyname.sharepoint.com/search/pages". Furthermore, ensure that the Search Center site collection exists and that all users have read access to it.
- If you temporarily remove a search result, the result is removed in both search experiences.
The classic search experience lets admins define promoted results to help users find important content, while the Microsoft Search experience uses bookmarks to achieve the same. When you create a promoted result at the organization level, users might also see it on the All tab on the Microsoft Search results page if they searched across the whole organization. For example, when users search from the search box on a hub site, they're only searching in the sites associated with the hub and therefore they don't see any promoted results even if they are on the All tab. But when users search from the SharePoint start page, they might see promoted results on the All tab. If you have defined both a promoted result and a bookmark for the same content (same URL), only the bookmark will appear on the All tab.
---
ms.date: 02/25/2024
title: "Find your Microsoft 365 tenant ID"
ms.reviewer:
ms.author: mactra
author: MachelleTranMSFT
manager: jtremper
audience: Admin
f1.keywords:
- CSH
ms.topic: article
ms.service: one-drive
ms.localizationpriority: medium
ms.custom:
- Adm_O365
- onedrive-toc
- has-azure-ad-ps-ref
- azure-ad-ref-level-one-done
search.appverid:
- MET150
- BCS160
ms.collection:
- Strat_OD_admin
- M365-collaboration
ms.assetid: 6891b561-a52d-4ade-9f39-b492285e2c9b
description: "Learn how to find the Microsoft 365 tenant ID using the Microsoft Entra admin center."
---

# Find your Microsoft 365 tenant ID

Your Microsoft 365 tenant ID is a globally unique identifier (GUID) that is different from your organization name or domain. You can use this identifier when you configure OneDrive policies.

## Find your Microsoft 365 tenant ID in the Microsoft Entra admin center

Your tenant ID can be found in the Tenant ID box on the Overview page.

> [!NOTE]
> For info about finding your tenant ID by using PowerShell instead, first read Microsoft Graph PowerShell and then use Get-MgOrganization.
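Expanding on the PowerShell route mentioned in the note, here is a minimal sketch using `Get-MgOrganization`, assuming the Microsoft Graph PowerShell SDK is installed and you can consent to the `Organization.Read.All` scope (it needs a live tenant to run):

```powershell
# Sign in interactively with a scope that can read organization details
Connect-MgGraph -Scopes "Organization.Read.All"

# The organization object's Id property is the tenant ID (a GUID)
$tenantId = (Get-MgOrganization).Id

# Sanity-check the format before using the value in OneDrive policy settings
$parsed = [guid]::Empty
[guid]::TryParse($tenantId, [ref]$parsed)   # $true for a well-formed GUID
```

The format check is a useful habit because OneDrive sync policies such as AllowTenantList expect the tenant ID as a GUID, not your domain name.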
---
author: omondiatieno
ms.author: jomondi
ms.date: 04/04/2024
ms.topic: include
ms.service: microsoft-graph-powershell
---

> [!NOTE]
> The Azure AD and MSOnline PowerShell modules are deprecated as of March 30, 2024. To learn more, read the deprecation update. After this date, support for these modules is limited to migration assistance to the Microsoft Graph PowerShell SDK and security fixes. The deprecated modules will continue to function through March 30, 2025.
>
> We recommend migrating to Microsoft Graph PowerShell to interact with Microsoft Entra ID (formerly Azure AD). For common migration questions, refer to the Migration FAQ. Be aware that versions 1.0.x of MSOnline may experience disruption after June 30, 2024.
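As an illustration of the recommended migration, here is how one common MSOnline lookup maps to its Microsoft Graph PowerShell equivalent. This is a sketch: the user principal name is a placeholder, and it assumes the Microsoft.Graph module is installed and you can sign in to a tenant:

```powershell
# Old (deprecated MSOnline):
#   Connect-MsolService
#   Get-MsolUser -UserPrincipalName "alice@contoso.com"

# New (Microsoft Graph PowerShell):
Connect-MgGraph -Scopes "User.Read.All"
Get-MgUser -UserId "alice@contoso.com"   # -UserId accepts an object ID or a UPN
```

Note that Graph cmdlets require explicitly requesting permission scopes at sign-in, which MSOnline did not; plan for admin consent when migrating automation.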
---
ms.date: 06/29/2021
title: "Use Microsoft 365 to connect leaders and teams"
ms.reviewer:
ms.author: ruihu
author: maggierui
manager: jtremper
recommendations: true
audience: Admin
ms.topic: article
ms.service: sharepoint-online
localization_priority: Normal
search.appverid: MET150
ms.collection:
- m365solution-corpcomms
- m365solution-scenario
- highpri
description: "Use Microsoft 365 to connect leaders and teams"
---

# Leadership connection: Use Microsoft 365 to connect leaders and teams

Microsoft 365 can help your organization's leadership teams connect with employees to build community around a common purpose and goal. Learn how to create a culture and internal place to help engage and connect leadership teams with the rest of your organization.

Help unite your organization by:

- Creating a sustainable, two-way dialogue between leadership and the rest of the organization by removing communication barriers
- Hosting organizational or department-wide meetings with live Q&A sessions so everyone is up to speed on leadership initiatives
- Sharing relevant news, ideas, and updates to encourage employees to engage and network with leadership teams
- Using analytics to gain insights on engagement and understand the impact of activities
- Bringing elements of the experience together using Viva Connections

| Stage | Tasks |
|-------|-------|
| Plan | Define audience profile and scope audience size<br>Align with stakeholders<br>Develop a support team that ensures content is relevant, plans future events, and publishes news<br>Consider multi-geo and multi-lingual options<br>Find corporate sponsorship and champions<br>Determine success metrics |
| Build | Create a SharePoint leadership site<br>Start a new community in Viva Engage<br>Set up news that can be shared in SharePoint, Teams, and Outlook<br>Use Microsoft 365 groups to provide access to large audiences<br>Use audience targeting to make sure your audience can find the right content<br>Get user feedback before launching your communications<br>Plan a kickoff event and identify promotional channels |
| Launch | Share the leadership SharePoint site, Viva Engage community, and event invites with their intended audiences<br>Send email invites to the kickoff event<br>Use Microsoft Teams and Viva Engage to post messages about the kickoff event<br>Equip champions and corporate sponsors with promotional materials<br>Produce and host a live event with Q&A<br>Create feedback channels<br>Post the event recording on the leadership connection site |
| Maintain | Ensure content stays relevant<br>Audit Microsoft 365 group membership, access, and settings periodically<br>Develop a news publishing cycle<br>Schedule regular events and share them in advance |

## Plan your leadership connection strategy

Start by making sure you understand your audience's needs and preferences. Review how to profile and scope your audience. This scenario will have the most successful outcome when multiple communication methods are combined. Start with a SharePoint communication site and a Viva Engage channel that connects the organization with leadership. Then, plan and host a live event that is recorded and shared with others. Finally, bring all these communication elements together into an engaging mobile experience using Viva Connections.

1. **SharePoint communication site** - Create a SharePoint communication site that houses all things leadership – everything from news and announcements to events, people profiles, and networking opportunities. Use the Leadership connection site template.
2. **Viva Engage community** - Use Viva Engage to instantly connect, ask questions, and share ideas with the leadership team. Then, use the Viva Engage web part to embed the conversation on the leadership connection SharePoint site.
3. **Viva Engage live events** - Use the Viva Engage community for live events and questions and answers.
4. **Viva Connections** - Bring the SharePoint site and Viva Engage conversation into one place where users can catch up and connect with leaders in the Microsoft Teams mobile app.
Learn more about Viva Connections.

## Planning considerations

Multiple communication solutions can be combined to provide opportunities across the organization to connect with leadership regardless of worksite location or time zone.

- Create opportunities outside of a live event to allow everyone within the organization to connect, network, and learn from leadership teams.
- Develop communication channels that collect feedback and questions for the leadership team, and save responses for the next live event.
- Create a practice of recording live events that can be shared and viewed later.
- Build durable communication methods that do not require organized events to generate content.
- Identify other opportunities across the organization to spread awareness about the leadership connection initiative, such as new employee orientation, regular communications to networking groups, and promotion in employee resource groups.

## Build your communication strategy

Start by creating or using an existing SharePoint communication site. Then connect a new or existing Viva Engage community to the SharePoint site using the Viva Engage conversations web part. Then, plan and host live events, share the recordings on the leadership site, and keep the conversation going in Viva Engage channels.

### Step 1: Create a SharePoint leadership site

SharePoint communication sites are a great tool to create a landing place to share leadership news, initiatives, and opportunities to network and connect. This is an ideal opportunity to use the Viva Engage web part on the home page to connect to an existing Viva Engage community that connects leadership team members with the rest of the organization. Use the Events web part to post and target leadership events to specific audiences.
(Images: the Leadership connection site template, sharing recordings from leadership events, and showcasing leadership profiles.)

#### Get started creating your SharePoint leadership site

There are several resources that can help you quickly create a leadership site:

- Use a SharePoint site template provided by Microsoft. Apply the Leadership connection template, then customize the site for the needs of your organization.
- Create your own leadership site following step-by-step instructions in this guided walkthrough.
- Use the Leadership connection template in the SharePoint look book. The SharePoint look book Leadership connection template requires the use of SharePoint PowerShell and administrative credentials.

#### Publish news from the leadership site

Once your leadership site has been created and shared with the right audiences, you can publish news from this site. Then, use the News web part on the home page to display leadership news and target leadership news to specific audiences. Other sites across your organization can customize the settings in the News web part to pull news from an organizational news site. Consider cross-promoting leadership news on the home site, on a new employee onboarding site, and where other leadership-oriented news can be found.

#### Monitor usage analytics for your SharePoint site

As a SharePoint site owner, you can view information about how users interact with your site. For example, you can view the number of people who have visited the site, how many times people have visited the site, and a list of files that have received the most views. Learn more about how to access usage data for your leadership site.

### Step 2: Create a leadership community in Viva Engage

Communities in Viva Engage help employees share knowledge, engage with others in the employee experience, and provide a social platform for company-wide communications.
A leadership community can also drive engagement by providing a central place for your conversations, documents, events, and updates. Viva Engage can help you host a live event with moderated Q&A for up to 10,000 attendees. Use Viva Engage to help engage your organization by:

- Engaging employees in the goals and vision of the company
- Informing employees about strategic initiatives and important updates
- Showing employees that their feedback is being heard by senior leadership

If your organization doesn't already have a Viva Engage community that includes members of the leadership team, start by creating a new Viva Engage community. Then, use the Viva Engage web part to embed conversations or highlights from existing Viva Engage communities on the SharePoint site. New to Viva Engage? Learn more about managing a community in Viva Engage and administrative tips and tricks. Help onboard your organization to Viva Engage and help others understand how to use Viva Engage. Monitor conversations and engagement insights Viva Engage community insights help you measure your community's reach and engagement. You can find out more about the people, conversations, and questions and answers that make up your community. Learn more about managing communities in Viva Engage. Step 3: Host a live leadership event in Viva Engage with Q&A Create and produce live events for people in the leadership Viva Engage network, with built-in discussions for use before, during, and after the event. Up to 10,000 people can attend at once from anywhere on their device or computer (higher limits for event attendees are temporarily available through the Microsoft 365 Live Events Assistance program). Make the video available after the event on the leadership site, so that people who can't make it at the scheduled time can still participate. Image of live questions and answers during an event There are two ways live events in Viva Engage can be produced.
The requirements depend on which video production methods you intend to use in your organization. Learn more about which method of live event you should use. For live events that only require visual and audio support, consider hosting a live event using Viva Engage in Teams. Once you've determined the right method for your live event, get started organizing and scheduling the event. Learn more about how to organize the live event in Viva Engage. Review the Viva Engage live event playbook and Viva Engage live event FAQs to understand all the different roles and responsibilities, how to ensure the event goes smoothly, and how to drive engagement. Consider using guidance from How to host a town hall for your organization to plan your live event. Insights and engagement metrics for live events Before the event starts, you will have access to a dashboard that will help you understand event reach, engagement during the event, top conversations, and more. Insights are only available to event organizers and producers. Learn more about how to view live event data and use insights to host powerful live events. Step 4: Expand your reach and engagement using Viva Connections Bring the SharePoint leadership site, news, and conversations from the Viva Engage community into one central place in Microsoft Teams using Viva Connections. Viva Connections creates an opportunity to push specific content and display popular resources by combining the power of your SharePoint intranet with Microsoft Teams and other Microsoft 365 apps like Viva Engage and Stream. 
Use Viva Connections to:

- Meet your employees in the apps and devices they know and love with a personalized view of news, conversations, and communities
- Promote events, news, and conversations in the Viva Connections dashboard to specific audiences using audience targeting
- Boost important news and announcements to the top of employees' news feeds

Viva Connections offers added functionality through three primary components - the Dashboard, the Feed, and Resources.

- Dashboard: The Viva Connections Dashboard is your employee’s digital toolset. It brings together the tools your employees need, enabling quick and easy access whether they are in the office or in the field.
- Feed: The Feed delivers updates to the right people at the right time and is tightly integrated with Viva Engage, SharePoint news, and Stream to display a personalized feed, based on post-level targeting of the groups that employees belong to.
- Resources: The Resources experience enables wayfinding across platforms. It uses navigation elements from the SharePoint app bar, which can be audience targeted.

Launch and manage your communication strategy Help others in your organization discover the leadership connection resource. Consider hosting a kickoff event in Viva Engage to announce the new SharePoint site and Viva Engage community. Set the tone for what to expect in terms of ongoing engagement, mentorship and networking opportunities, and event and news publishing schedules.
Launch checklist:

- Assign site owners and content authors who will be responsible for making sure the leadership site and news are always up to date
- Make sure your audience has access to the SharePoint site
- Make sure your audience has been added to the Viva Engage community
- Assign a Viva Engage community moderator and review Viva Engage community best practices
- Use audience targeting to highlight leadership events, news, and links to the leadership SharePoint site across your intranet
- Check SharePoint site usage and analytics and Viva Engage community insights during and after the launch to measure engagement

More resources Overview of corporate communications Use the SharePoint Leadership connection site template Viva Connections for leaders
Leadership connection: Use Microsoft 365 to connect leaders and teams
ms.date: 07112018 title: Manage Business Connectivity Service Applications ms.reviewer: abloesch ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: - CSH ms.topic: article ms.service: sharepoint-online ms.localizationpriority: medium ms.collection: - M365-collaboration ms.custom: admindeeplinkSPO search.appverid: - SPO160 - MET150 ms.assetid: 2ced10aa-db9a-4828-a7a5-e47a57c3a823 description: Learn how to create BCS connections to data sources, such as SQL Azure databases or Windows Communication Foundation (WCF) web services, that are outside the SharePoint site. Manage Business Connectivity Service Applications In SharePoint in Microsoft 365, you can create Business Connectivity Services (BCS) connections to data sources, such as SQL Azure databases or Windows Communication Foundation (WCF) web services, that are outside the SharePoint site. Once you've created these connections, you can manage or edit BCS information in the SharePoint admin center. Microsoft SharePoint uses BCS together with Secure Store Services to access and retrieve data such as BDC Models from external data systems. See also Deploy a Business Connectivity Services hybrid solution in SharePoint. [!NOTE] Business Connectivity Services (BCS) in Microsoft 365 is a deprecated feature. On January 8, 2024, it will be disabled in new tenants as well as existing tenants that haven't used the feature since October 30, 2023. It will be retired and removed from all tenants on September 30, 2024. Customers are encouraged to explore using Microsoft Power Apps to replace their Business Connectivity Services solutions in Microsoft 365. For more information, see Business Connectivity Services (BCS) Retirement in Microsoft 365. Manage BCS permissions After setup is complete, user permissions control access to the data that the connection provides. 
BCS has two types of permissions: object permissions and Metadata Store permissions. Object permissions Object permissions apply only to a specific External System, BDC Model, or External Content Type (ECT). Each ECT is a securable object. For example, if you have an ECT called WCFBookSales, object permissions apply only to the WCFBookSales object, and not to any other ECT that might be defined. To set object permissions for an object, follow these steps. Go to More features in the SharePoint admin center, and sign in with an account that has admin permissions for your organization. [!NOTE] If you have Office 365 operated by 21Vianet (China), sign in to the Microsoft 365 admin center, then browse to the SharePoint admin center and open the More features page. Under BCS, select Open. In the business data catalog section, select Manage BDC Models and External Content Types. Select the check box next to the name of the ECT or external system that you want to manage. On the ribbon, select Set Object Permissions. Enter a user account or group name in the text box, and then select Add. You can also select Browse to look for the name that you want. Select the name of the account or group for which you want to set access to the ECT or external system. You can set permissions for only one account at a time. If you have multiple accounts or groups, you have to set levels of access for each account separately, by selecting them one at a time. The following table describes the permissions and their associated access levels.

| Permission | Notes |
|:-----|:-----|
| Edit | Allows the user or group to create External Systems and BDC Models, to import BDC Models, and to export BDC Models. This setting should be reserved for highly privileged users. |
| Execute | Allows the user or group to execute operations (create, read, update, delete, or query) on ECTs. |
| Selectable in clients | Allows the user or group to create external lists for any ECTs, and to view the ECTs in the external item picker. |
| Set permissions | Allows the user, group, or claim to set permissions on the Metadata Store. At least one user or group must have this permission on every BCS connection so that permissions management can occur. With this permission, a user can grant Edit permissions to the Metadata Store. This setting should be reserved for highly privileged users. |

Metadata Store permissions Metadata Store permissions apply globally to the whole BCS store. That is, they apply to all BDC Models, external systems, ECTs, methods, and method instances that are defined for that external data system. You can set permissions on the Metadata Store to determine who can edit items and set permissions for the store. Metadata Store permissions apply to many objects, such as BDC Models, ECTs, and external systems. Because Metadata Store permissions can replace object permissions, they must be managed carefully. When applied with forethought, Metadata Store permissions can grant access quickly and completely. To set Metadata Store permissions, follow these steps. In the left pane of the new SharePoint admin center, select More features. Under BCS, select Open. Select Manage BDC Models and External Content Types. On the ribbon, select Set Metadata Store Permissions. Enter a user account or group into the text box, and then select Add. You can also select Browse to look for the account that you want. The account or group will appear in the second text box. If you have multiple accounts or groups, you must select them one at a time to set the level of access. The available permissions are the same as for object permissions:

| Permission | Notes |
|:-----|:-----|
| Edit | Allows the user or group to create External Systems and BDC Models, to import BDC Models, and to export BDC Models. This setting should be reserved for highly privileged users. |
| Execute | Allows the user or group to execute operations (create, read, update, delete, or query) on ECTs. |
| Selectable in clients | Allows the user or group to create external lists for any ECTs, and to view the ECTs in the external item picker. |
| Set permissions | Allows the user, group, or claim to set permissions on the Metadata Store. At least one user or group must have this permission on every BCS connection so that permissions management can occur. With this permission, a user can grant Edit permissions to the Metadata Store. This setting should be reserved for highly privileged users. |

To propagate permissions to all items in the Metadata Store, select Propagate permissions to all BDC Models, External Systems and External content types in the BDC Metadata Store. If you select this option, you'll replace all existing permissions (including object permissions) that you may have set anywhere else in your selected BCS Application. Import or export a Business Data Connectivity (BDC) Model The BDC Model view allows a user to import and export the underlying framework for the business data connection. This is very useful if you have to re-create the connection in a new environment. A BDC Model file can be imported to create an ECT connection to an external system. You can import or export two types of model files: Model: Exports the XML metadata for a selected system. Resource: Exports the localized names, properties, and permissions for a selected system. [!NOTE] You can create a BDC Model using XML code. If you do so, it's important to know that you cannot use the authentication modes RevertToSelf and PassThrough with SharePoint. Although you might be able to import a BDC Model that was written in XML, the connection will not be usable. Import a BDC Model When you import a BDC Model, you also import its specified permissions. Before you import a BDC Model, it's a good idea to understand how imported permissions interact with existing permissions. Imported permissions for a BDC Model are added to the store of existing permissions in the BDC service.
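This add-and-overwrite merge can be sketched in Python. The object and account names below are made-up illustrations, not real BDC objects, and the dictionaries are a simplified stand-in for the permission store:

```python
# Sketch of how imported BDC Model permissions merge into the existing
# permission store: entries for new objects are added, and an entry for
# an object that already exists is overwritten by the imported value.
existing_acl = {
    "WCFBookSales": {"contoso\\alice": "Edit"},
    "OrderSystem": {"contoso\\bob": "Execute"},
}
imported_acl = {
    "WCFBookSales": {"contoso\\carol": "Execute"},  # already exists: replaced
    "InventoryFeed": {"contoso\\dana": "Edit"},     # new object: added
}

# Imported entries win per object; untouched objects keep their entries.
merged_acl = {**existing_acl, **imported_acl}
```

After the merge, the entry for WCFBookSales holds only the imported permissions, while OrderSystem is unchanged, mirroring the behavior described here.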
If an entry for an object already exists in the access control list, the existing value is overwritten with the permissions information from the imported file. To import a BDC Model, follow these steps: In the left pane of the new SharePoint admin center, select More features. Under BCS, select Open. In the business connectivity services section, select Manage BDC Models and External Content Types. On the ribbon, select Import. In the BDC Model section, enter the name of the BDC Model File. The Model name must not include any special characters, such as ~ " % & : \ ? \ { } or the character 0x7f. You can also select Browse to locate the .bdcm file for a BDC Model. In the File Type section, select Model or Resource as the file type that you want to import. In the Advanced Settings section, select one or more of the following resources to import: Localized names to import localized names for the ECTs in a particular locale. Imported localized names are merged with the existing localized names by Business Data Connectivity. Properties to import properties for ECTs. Imported properties are merged with the existing property descriptions by Business Data Connectivity. Permissions to import permissions for ECTs and other securable objects in the model. (Optional) To save the resource settings in a file for later use, type a name for the resource file in the Use Custom Environment Settings text box. Select Import. Export a BDC Model You can export a BDC Model and then read its contents to determine differences between connections. This can be useful if you are troubleshooting. You can also import an exported BDC Model file into another environment for testing or reuse. To export a BDC Model or Resource file, follow these steps: In the left pane of the new SharePoint admin center, select More features. Under BCS, select Open. Select Manage BDC Models and External Content Types. Select the dropdown, and in the View group, select BDC Model. 
Select the name of the BDC Model that you want to export. On the ribbon, select Export. On the Business Data Connectivity Models page, select the model or resource file to export. On the Export page, in the File Type section, to specify the type of file that you want to export, select Model or Resource. In the Advanced Settings section, to further refine the data export, select one or more of the following: To export localized names for the ECTs in a particular locale, select Localized names. To export properties for ECTs, select Properties. To export permissions for ECTs, select Permissions. To export an implementation-specific proxy that is used to connect to the external system, select Proxies. If you saved a file of resource settings for later use, enter the name of the file to export in the Use Custom Environment Settings field. Select Export to start a dialog that enables you to save a .bdcm file to your local drive. You can open the .bdcm file in a text editor. Add actions to external content types By adding actions to ECTs, administrators associate a uniform resource locator (URL) with an ECT. This automatically starts a specified program or opens a specified web page. Actions can specify parameters that are based on one or more fields in an ECT. For example, you can create an action for an ECT that specifies a Search page URL. The parameter for this action might be the ID of an item in the external data source. This would allow you to specify a custom action for the ECT that automates a search for this item. [!NOTE] When you add a new action to an ECT, that action is not added to existing external lists for that ECT. The action is only available in new external lists for the ECT. To add an action to an ECT, follow these steps. In the left pane of the new SharePoint admin center, select More features. Under BCS, select Open. Select Manage BDC Models and External Content Types.
Point to the name of the ECT to which you want to add an action, and then select the arrow that appears. From the menu, to open the Add Action page, select Add Action. In the Name field, give the action a meaningful name. In the URL field, enter the URL for the action you want to open. [!NOTE] Under the control, you can find an example URL. The example shows how to add one (or more) parameter place-holders such as {0} or {1} (http://www.adventure-works.com/sample.aspx?p0={0}&p1={1}). If you want web parts on the site to be able to use this new action, select one of the following options:

| Command | Action |
|:-----|:-----|
| Yes | Starts the action in a new browser window (preserves the page context). |
| No | Starts the action in the same browser window. |

In the URL Parameters field, specify any parameters that are required by the URL. These are numbered in the interface starting at 0. Decide if you want to use an icon or not. This field also allows you to use standard icons. If you want the action to be the default action, select the Default Action check box. [!IMPORTANT] Parameters can contain personally identifying information such as names and Social Security numbers. When you design an action, be careful not to use fields that display personally identifying information. View external data and external data settings You use the View section of the ribbon to choose different views of BCS connections. The three views display information about the BCS connection in different ways, and give you access to different actions. It is important to become familiar with these views because some tasks are available only in specific views. The three view options are BDC Models, External Systems, and External Content Types. For more information about how you can use these views to help manage BCS, see the sections that follow. External Content Types view By default, the BCS connection uses the External Content Types view.
This view shows Service Application Information, and lists the following information:

- ECT name
- ECT display name
- ECT type namespace
- Namespace version
- External system name

For most processes in BCS, this view is sufficient. However, if there are many ECTs, this view can be difficult to navigate. External Systems view The External Systems view shows a BCS connection in terms of its system of origin. This view is useful if you want to know the BCS connection information after you create the BCS. In this view, you can see the property settings for a named External System. In addition, you can configure some of the property settings. View property settings The name of the External System appears on this page as a selectable link (a navigable URL). You can select the URL to open a window that shows the original property settings for that store. In addition, if you are connected to SQL Azure, you can see the database server name and database in this view. Depending on the type of BCS connection, the property settings can include any combination of the following items:

- Access Provider (such as WCF Service)
- Authentication Mode (such as User's Identity)
- Database Server
- Impersonation Level (such as None, Anonymous, Identification, Impersonation, Delegation)
- Initial Database name
- Integrated Security (such as SSPI)
- Secure Store Implementation
- Secure Store Target Application ID (as the ID entered in Secure Store)
- Service EndPoint Address (such as the URL pointing to SomeWCFService.svc)
- Connection Pooling (Active/Inactive)
- Secondary Secure Store Target Application ID
- Secure Store Implementation

Configure property settings If you point to an External System Name, you can open a shortcut menu that includes a Settings command. This is useful for SharePoint connections that use Windows Communication Foundation (WCF) Web Services.
By selecting the Settings option from the menu, you can configure any of the following settings:

- Metadata Exchange URL
- Metadata Exchange Discovery Mode
- Web Services Description Language (WSDL) Authentication Mode
- WSDL Secure Store Target Application Id
- Secure Store Implementation

BDC Model view The BDC Model view offers ribbon commands that enable you to import or export BDC Models. In addition, the BDC Model view can make it easier to move around in a very large collection of ECTs. Because the BDC Model view shows hyperlinks for each distinct connection, rather than showing all ECTs for each connection, it can make a more manageable list. If you want to see all the ECTs for a BDC Model, select the name of the Model. If you select the name of an ECT, a table opens that shows the fields that are defined for the ECT. It resembles the following table.

| Name | Type | Display by Default |
|:-----|:-----|:-----|
| Order Id | System.String | No |
| Employee Id | System.String | No |
| Freight | System.Nullable`1[[System.Decimal, .... | No |

This display can closely mirror the layout of the data source connected via an ECT, and give better insight into the structure of the underlying data. Also, at the bottom of the page, any Associations, Actions, or Filters for this ECT appear.
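Returning to the actions described earlier: the numbered place-holders ({0}, {1}) in an action URL are filled in by position from the ECT fields mapped as URL parameters. A minimal Python sketch of that substitution, using a made-up URL template and made-up parameter values:

```python
from urllib.parse import quote

# Hypothetical action URL template with two numbered place-holders, in the
# same shape as the example URL shown on the Add Action page.
url_template = "http://www.adventure-works.com/sample.aspx?p0={0}&p1={1}"

def build_action_url(template: str, *params: str) -> str:
    """Substitute parameter values by position, URL-encoding each one."""
    return template.format(*(quote(str(p), safe="") for p in params))

action_url = build_action_url(url_template, "1042", "en US")
# The space in "en US" is percent-encoded, so the query string stays valid:
# http://www.adventure-works.com/sample.aspx?p0=1042&p1=en%20US
```

Encoding each value before substitution matters when a field value (such as a name) contains spaces or reserved URL characters.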
ms.date: 03212024 title: "Manage sites in the SharePoint admin center" ms.reviewer: daminasy ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: - CSH ms.topic: how-to ms.service: sharepoint-online ms.collection: - Strat_SP_admin - M365-collaboration - m365initiative-spsitemanagement - essentials-manage ms.custom: - seo-marvel-apr2020 - admindeeplinkSPO search.appverid: - SPO160 - MET150 - BSA160 ms.assetid: d8c63491-0410-405c-880a-8cef7fa4480a description: "In this article, you learn about tasks you can perform on the Active sites page of the SharePoint admin center, such as view site details, view and change site membership, and change a site's hub association." Manage sites in the SharePoint admin center The Active sites page of the SharePoint admin center lets you view the SharePoint sites in your organization, including communication sites, channel sites, and sites that belong to Microsoft 365 groups. It also lets you sort and filter sites, search for a site, and create new sites. The Active sites page lists the root website for each site collection. 
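The next section details the URLs and site templates that the Active sites page filters out. As a rough Python sketch of that rule (the site records and function name here are simplified, hypothetical stand-ins, not an actual SharePoint API):

```python
# Simplified model of the Active sites exclusion rule: a site is hidden when
# its template ID, or a reserved fragment of its URL, is on the exclusion
# lists given in the tables that follow.
EXCLUDED_TEMPLATE_IDS = {6000, 10043, 65, 66, 67, 3500, 30003, 3, 54, 21, 16, 301, 70}
EXCLUDED_URL_FRAGMENTS = (
    "/sites/contentTypeHub",
    "/sites/CompliancePolicyCenter",
    "/portals/hub",
    "/search",
    "/personal",
)

def shown_on_active_sites(url: str, template_id: int) -> bool:
    """Return False for sites the Active sites page filters out."""
    if template_id in EXCLUDED_TEMPLATE_IDS:
        return False
    return not any(fragment in url for fragment in EXCLUDED_URL_FRAGMENTS)
```

For example, an ordinary team site at /sites/Marketing passes the check, while the search site or a redirect site (template 301) does not.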
[!NOTE] Subsites aren't included on the Active sites page. Sites with these URLs also aren't included:

| URL | Description |
|:-----|:-----|
| /sites/contentTypeHub | Content hub host |
| /sites/CompliancePolicyCenter | Policy center |
| /portals/hub | PointPublishing hub |
| /search | Search site |
| /personal | OneDrive sites |

Sites with these templates aren't included on the Active sites page:

| ID | Name | Description |
|:-----|:-----|:-----|
| 6000 | REVIEWCTR | Review center |
| 10043 | FunSite | SharePoint tenant fundamental site |
| 65 | POINTPUBLISHINGHUB | PointPublishing hub |
| 66 | POINTPUBLISHINGPERSONAL#0 | Personal blog |
| 67 | POINTPUBLISHINGTOPIC#0 | PointPublishing topic |
| 3500 | POLICYCTR | Compliance policy center |
| 30003 | TestSite | Test site |
| 3 | CENTRALADMIN | Central admin site |
| 54 | SPSMSITEHOST | My Site host |
| 21 | SPSPERS | SharePoint Portal Server personal space |
| 16 | TENANTADMIN | Tenant admin site |
| 301 | REDIRECTSITE | Redirect site |
| 70 | CSPCONTAINER | CSP container |

Note that you may see differences between the sites in the active sites list and those listed in the SharePoint site usage report in the Microsoft 365 admin center, because the site templates and URLs listed above are included in the SharePoint site usage report. For more info about tasks on the Active sites page, see: Create a site Register a site as a hub site and Unregister a site as a hub site Change sharing settings for a site Delete a site Manage site storage limits Add or remove site admins and group owners For all site types except channel sites, you can add or remove site admins and change the primary admin. For group-connected team sites, you can also add and remove group owners. Note that if you remove a person as a primary admin, they will still be listed as an additional admin. For info about each role, see About site permissions. In the SharePoint admin center, select Sites > Active sites or browse to the Active sites page. In the left column, select a site. Select Membership on the command bar to open the details panel to update the permissions of the members.
:::image type="content" source="media/add-remove-site-members.png" alt-text="Screenshot of membership tab in details panel"::: Add or remove people or change their role, and then select Save. Change a site's hub association In the SharePoint admin center, select Sites > Active sites or browse to the Active sites page. In the left column, select a site. Select Hub on the command bar. The options that appear depend on whether the site you selected is registered as a hub site, or associated with a hub. The Hub menu lets you register a site as a hub site, associate it with a hub, change its hub association, and unregister it as a hub site. For more information, see More info about hubs. View site details To see more info about a site, select the site name, or click anywhere on the site row except the URL column, to open the details panel. For channel sites, select the link in the Channel sites column, and then select the site name. To view site activity, including the number of files stored and storage usage, select the Activity tab. Activity information is not available for US Government GCC High and DoD customers. To view site admins, owners, members, and visitors, select the Membership tab. :::image type="content" source="media/view-site-members.png" alt-text="Screenshot of Membership tab selection on details panel"::: For info about the roles in this panel, see About site permissions. Related topics Manage site storage limits
ms.date: 09232021 title: Onboard new employees into your organization ms.reviewer: ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-online localization_priority: Priority ms.collection: - Strat_SP_modern - M365-collaboration - m365solution-corpcomms - m365solution-scenario - highpri search.appverid: - SPO160 - MET150 description: "Learn how to onboard new employees into your organization using Microsoft 365" Onboard new employees into your organization Make new employee onboarding (sometimes referred to as NEO) a great experience by fostering an all-in-one hybrid work environment where new employees can find important resources, meet people in their organization, and prepare to be successful in their new role. When entering a new organization, knowing where resources are located, who to go to for help, and how to find training materials in an organized and personalized environment will help new employees navigate your organization efficiently. This article will guide you on how to create a welcoming onboarding environment for new employees using Microsoft’s modern work tools. 
Create a virtual onboarding experience by:

- Planning and implementing an end-to-end experience based on your organization’s tools, resources, and initiatives
- Hosting a virtual welcome event to share resources and introduce onboarding buddies
- Using a SharePoint site template to provide a home base for new team members
- Creating a personalized onboarding checklist using the Onboarding checklist list template
- Curating a training experience for new team members using Viva Learning

Overview of how to create an onboarding experience

| Phase | Tasks |
|:-----|:-----|
| Plan | Scope your audience. Map out the employee onboarding experience. Review Microsoft 365 onboarding tools. Consider creating a new employee support community in Viva Engage. |
| Build | Consider using a live event to welcome and orient new team members. Include partners like hiring managers, business owners, and human resources in the planning process. Have a plan to maintain NEO content over time. Create an onboarding site using the New employee onboarding template. Customize an Onboarding checklist template in Microsoft 365 Lists. Prepare a survey to ask for feedback at the end of onboarding. |
| Launch | Send out invites, instructions, and resources using Outlook Mail and Calendar. Gather insights from each platform used to measure overall performance of the employee onboarding process. Use feedback to inform the next onboarding session. |

Plan the onboarding experience

| Step 1: Scope your audience and goals | Step 2: Map the onboarding experience | Step 3: Review Microsoft 365 tools |
|:-----|:-----|:-----|
| Determine the needs and the size of your audience. Define deliverables and desired outcomes of the onboarding experience. Establish tone and messaging. | Establish an onboarding timeline. Define which onboarding tasks need to be completed. Prioritize onboarding tasks. Curate learning materials. Ask for feedback frequently. | Choose onboarding tools that will best suit your needs. |

Planning considerations: Even before a new employee’s first day on the job, they will need some information about NEO. Consider a pre-onboarding SharePoint site that can help new employees gather the right documents and prepare for the NEO event as soon as they sign their offer letter. Some new employees cannot attend the NEO event in person. Consider if it’s better to pre-record an event that can be shown to people who cannot attend in person, or if you can live-stream the in-person event so that new employees who are remote can join online. Regardless of whether NEO event attendance is in person or remote, the NEO audience will need access to the same resources on desktop and mobile devices. Step 1: Start by determining your audience profile and size New employees will have diverse backgrounds, roles, and career experience. Some new employees will be able to attend an event in person, while others can only attend remotely. Some tools and resources will need to be accessible from a mobile device. The same resources will need to be available regardless of the new employee orientation method. The size of your onboarding group will determine the platforms and resources you use. Learn more about defining your audience profile and scoping audience size.
|Groups of 1-19|Groups of 20 or more|
|:---------------|:----------------|
|Office (PowerPoint, Word) <br> SharePoint <br> Teams <br> Viva Learning <br> Viva Connections <br> SharePoint Spaces|Office (PowerPoint, Word) <br> SharePoint <br> Teams <br> Viva Learning <br> Viva Connections <br> SharePoint Spaces <br> Viva Engage|

Define the desired outcome and priority content

Spend time defining what success looks like when orientation is complete. In many organizations, onboarding happens for more than one part of the business. Consider offering an onboarding structure that allows the new employee to become familiar with both the organization and the team they will be working with. Consider setting up modules that guide employees through onboarding and let them know about their progress.

Consider the tone and messaging for your onboarding process

Make new employees feel welcome and excited to be joining your organization. Use PowerPoint to prepare a well-crafted message that lists out the various benefits and resources your new employees can take advantage of as members of your organization. Introduce them to the culture of your company by laying out company history, fun facts, and more.

Step 2: Map the ideal employee onboarding experience

From inception to orientation to day one on the job, the new employee journey has a huge impact on the success of that employee. Ensure the new employee is provided the best tools to start out with by mapping out the new employee onboarding experience.

A. Establish an onboarding timeline

Determine how long the onboarding process should take based on the role and level of career experience. Think about how often new team members should be taking training courses and attending events that connect them with leadership and the rest of the organization.

B. Scope and define the most important content and tasks

Create deliverables that let you know that each employee has achieved full competency in a specific subject that is important to the job.
This can be learning about the company itself, specific tools, organizational expectations, and more. C. Collect, prioritize, and organize resources and content When entering a new environment, understanding the order in which deliverables should be completed may be a blind spot for new employees. Help them prioritize onboarding deliverables by laying them out in a way that signals priority. This could be as simple as writing them out for them or integrating Microsoft Lists. Learn how to use Microsoft Lists to help onboard employees. D. Create a personalized training experience Which learning materials are needed to help make the new employee successful? Curate learning modules for each employee that helps orient them on organizational needs, team needs, and the tools that they will be working with daily. Learn more about using Viva Learning. E. Get feedback along the way The goal of every onboarding process is to make sure each new employee feels fully equipped to be successful in their new job. Receiving honest feedback once employees are finished with the onboarding process is the best way to fill in the gaps and improve onboarding for incoming employees. Find out what worked and what could use improvement by creating simple and thoughtful surveys. Learn more about getting feedback using Microsoft Forms. Step 3: Review Microsoft 365 onboarding tools A great option is to use Teams live meetings, live events, and chat functionality to welcome new employees and provide an engaging experience for those that can attend in-person as well as remote. Direct users to information and contacts on a SharePoint communications site. Consider using a NEO Viva Engage channel where new people can ask questions and get answers. You can even consider using SharePoint spaces for virtual training or facility tours. Then, bring it all together using Viva Connections, which integrates SharePoint content into Microsoft Teams. 
Microsoft Teams meeting - Meetings in Teams include audio, video, and screen sharing, and are great for communicating with groups of fewer than 300 individuals. Teams can be used for meetings that happen in person, remotely, or a combination of the two. Additionally, the meeting can be recorded and shared with new employees who were unable to attend the meeting.

SharePoint communication site - Create a SharePoint communication site using the New employee onboarding site template. Use the site to direct new employees to relevant resources, contacts, and conversations in new employee Viva Engage communities.

Viva Engage channel – Create a Viva Engage channel just for new employees. Choose to create one NEO channel for the entire organization or region, or create channels for each new group of new employees. Then, use the Viva Engage web part on the SharePoint NEO site to integrate the conversation with other resources and contacts.

Microsoft Viva – Viva modules are personalized employee experiences that display as apps in Microsoft Teams. Use Viva Connections to bring onboarding resources and tools into one place. Content in the Viva Connections experience can be targeted to specific audiences. Use Viva Learning to curate learning modules for each employee that help orient them on organizational needs, team needs, and the tools that they will be working with daily.

SharePoint spaces - Use SharePoint Spaces to create an immersive platform that allows new employees to take a virtual tour of the campus.

Build the new employee onboarding experience

Build out the employee onboarding experience one platform at a time. The tools listed here can be connected to each other, providing a thorough and connected onboarding journey for the new employee.

Create an onboarding site using SharePoint

Create an internal communication site that acts as a home for new employees. This site should lay out resources, deliverables, and company information in an organized way.
This site should also contain the most up-to-date information to ensure the new employee has everything needed to be successful. Try organizing action items on this site in a way that signals priority. List the things that need to be finished first at the top. This will help employees organize their onboarding work to the best of their ability and gives each new employee the same starting point. Microsoft offers a new customizable employee onboarding SharePoint communication site template that can help get you started. Get started with SharePoint site templates.

Use live meetings to administer a virtual employee orientation

With the emergence of virtual work, having face time with other people in the organization is still important in making sure employees can communicate with leaders and peers and ask top-of-mind questions. Use Microsoft Teams to administer virtual employee orientation. Make this live event fun and engaging by giving employees an opportunity to communicate with each other in the Teams chat box, or allow employees to ask questions on camera or through the moderated Q&A in Teams live events. Learn more about Teams live events.

Give employees a tour of the campus using SharePoint Spaces

SharePoint Spaces is a tool that provides immersive online experiences using 2D and 3D web parts. Guide new employees through a virtual 360-degree tour of the campus. Incorporate helpful links and web parts that provide important information and help the employee complete onboarding tasks. Learn more about SharePoint Spaces.

Help the new employee prioritize deliverables with Microsoft Lists

When entering a new environment, understanding the order in which onboarding tasks should be completed is important. Help new employees prioritize onboarding deliverables by laying them out in a way that signals priority. Consider using Microsoft Lists and List templates and add them as tabs in the New employee Teams channel for easy access.
You can also integrate Lists into the new employee onboarding SharePoint site using the List web part.

Use Viva Connections and integrate with other Microsoft Viva experiences

Help orient new employees by providing them with the most important company information, resources, and updates regularly. Viva Connections is a digital platform built on the existing capabilities of Microsoft 365 that allows organizations to customize employee experiences and gain access to the most important information, the most used tools, and resources on both desktop and mobile devices. Learn more about creating onboarding experiences using Microsoft Viva.

Integrate multiple Viva modules to create a rich onboarding experience

Viva Connections can be used to display cards that integrate with other Viva experiences. Create a card that links to Viva Insights to help new team members understand how to spend time productively and help managers provide better guidance on time management. Learn more about Viva Insights and how it helps protect and organize work-life balance.

A separate card integrates with Topics, which helps to address a key business issue in many companies: providing information to users when they need it. For example, new employees need to learn a lot of new information quickly, and encounter terms they know nothing about when reading through company information. Learn more about how Topics can help with knowledge management.

Viva Learning is an application for Microsoft Teams that allows users to discover, recommend, and access learning modules from different platforms to help users gain knowledge in any specific focus area. Viva Learning pulls content from LinkedIn Learning, Microsoft Learn, Microsoft 365 Training, and other partners. Use Viva Learning to curate learning modules for each employee that help orient them on organizational needs, team needs, and the tools that they will be working with daily.
In Viva Learning, managers of an organization or team can assign learning and track the learning progress of each module. The Viva Learning home view aggregates a variety of information, including assigned content from learning management systems, recommended learnings, trending content, and learning provider content libraries. Learn more about Viva Learning.

Learn more about Viva modules:

|Viva Learning|Topics|Viva Insights|Viva Connections|
|:---------------|:---------------|:---------------|:---------------|
|Target specific training for new employees.|New team members can learn about popular terms, acronyms, projects, and more.|Help new team members spend time productively.|Provide easy access to benefits, training materials, and resources.|

Create virtual communities for new employees in Viva Engage

Give new employees the opportunity to connect with each other and build a supportive virtual community in Viva Engage. Viva Engage is a platform that connects leaders, communicators, and employees to build communities, share knowledge, and engage across the organization. Viva Engage allows you to set up a community specifically for new employees within the organizational platform. Naturally, employees come across information at different times in their onboarding process. Creating a space where new employees can build a community for themselves gives them the opportunity to share information as they come across it. In this virtual space, new employees can share resources, share ideas, and get to know each other. It also helps them build internal bonds that can last over the course of their career. Learn more about building communities in Viva Engage.

Ask for feedback using Microsoft Forms

The best way to improve the new employee onboarding experience is by asking for feedback. Understand what worked best and what could use improvement by asking for feedback after all deliverables have been completed by new employees.
Use Microsoft Forms to build short surveys that answer the questions that could inform the onboarding experience moving forward. Learn more about Microsoft Forms.

Launch the new employee onboarding experience

Use Outlook to invite all new employees to orientation. In this email, lay out all the resources, starting with the Viva Connections homepage that contains links to the New Employee SharePoint site, the New Employee orientation live meeting event post, learning objectives, and more. Be sure to include any necessary information and important direct contacts.

Gather insights from each platform along the way. Use these insights to inform managers, team leads, and other stakeholders. Look for insights such as live event attendance, audience reach, site traffic, and more. Use the feedback gathered from each onboarding cycle to inform the next onboarding cycle.

More Resources

Review Microsoft 365 communication tools and methods
Corporate communications overview
Use the SharePoint New employee onboarding site template
ms.date: 06102022 title: Plan file sync for SharePoint and OneDrive in Microsoft 365 ms.reviewer: ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: NOCSH ms.topic: concept-article ms.service: sharepoint-online ms.localizationpriority: medium ms.collection: - essentials-get-started ms.custom: intro-get-started search.appverid: MET150 description: Learn how to plan file sync for SharePoint and OneDrive in your organization

Plan file sync for SharePoint and OneDrive in Microsoft 365

Even though users can upload, download, and interact with SharePoint and OneDrive files from a web browser, the ideal experience comes with the OneDrive sync app for Windows and Mac, and the iOS and Android mobile apps. The OneDrive sync app has a variety of configuration options for compliance, performance, user experience, and disk space management. While these can be configured at any time, it's important to consider some of them as part of your rollout plan.

Key decisions for sync:

- How do you want to deploy the sync app?
- How do you want to manage sync on Windows computers?
- Which update ring do you want to use?
- Do you want to limit network utilization for sync?
- Do you want to sync commonly used folders with OneDrive?
- Do you want to limit which organizations users can sync with?
- Do you want to allow users to sync their personal OneDrive?
- Do you want to block certain file types from being uploaded?
- Do you need to sync files in a hybrid environment with SharePoint Server?
- Do you want to limit sync to computers joined to a specific domain?

For information about the recommended configuration options for the sync app, see Recommended sync app configuration.

How do you want to deploy the sync app?
You have several different options for deploying the OneDrive sync app: manually, using scripting, using Windows Autopilot (for the sync app on Windows), using a mobile device management solution such as Intune, or using Microsoft Endpoint Configuration Manager. The OneDrive sync app is included as part of Windows 10, Windows 11, and Office 2016 or higher. You do not need to deploy the sync app to devices running these, though you may need to update the sync app to the latest version. To deploy the OneDrive sync app to Windows using Microsoft Endpoint Configuration Manager, see Deploy OneDrive apps by using Microsoft Endpoint Configuration Manager. If you need to install the sync app on a single computer, see Install the sync app per machine. For a full list of OneDrive sync app requirements, see OneDrive sync app system requirements. How do you want to manage sync on Windows computers? You can manage OneDrive sync app settings on Windows computers using Windows Group Policy or by using administrative templates in Intune. Using group policy requires that Windows computers be joined to an Active Directory domain. Using Intune requires that the device be managed by Microsoft Endpoint Manager. For information, see: Use OneDrive policies to control sync settings Use administrative templates in Intune Mac settings are configured using .plist files. For information, see Deploy and configure the OneDrive sync app for Mac. Which update ring do you want to use? You can select how soon your users receive updates we release for the sync app. Insiders ring - In this ring, users get the first changes that are released to the public. We recommend selecting several people in your IT department to join this ring. Production ring – In this ring, users get fixes and new features in a timely fashion. We recommend leaving everyone else in the organization in this ring. 
Enterprise ring – In this ring, you have more control over the deployment of updates, but users have to wait longer to receive fixes and new features.

Configure the following policy to set the sync app update ring.

|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Set the sync app update ring|GPOSetUpdateRing|Tier|

For details about the update process for the OneDrive sync app, see The OneDrive sync app update process.

Do you want to limit network utilization for sync?

Depending on your network capacity, you may want to consider limiting how much network bandwidth the sync app can use. This can be useful during a migration phase when large amounts of content are being synced. Use the following policies to limit the network bandwidth used by the sync app.

|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Limit the sync app upload rate to a percentage of throughput|AutomaticUploadBandwidthPercentage|AutomaticUploadBandwidthPercentage|
|Enable automatic upload bandwidth management for OneDrive|EnableAutomaticUploadBandwidthManagement|NA|

Do you want to sync commonly used folders with OneDrive?

Users often save files to their documents folder or desktop. They may not realize that they should save these files in OneDrive. You can automatically sync these commonly used folders to OneDrive, prompt users to do so, or prevent them from doing so. Use the following policies to configure how users' commonly used folders are synced with OneDrive.
|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Silently move commonly used folders to OneDrive|KFMSilentOptIn|KFMSilentOptIn|
|Prompt users to move their commonly used folders to OneDrive|KFMOptInWithWizard|KFMOptInWithWizard|
|Prevent users from stopping sync of their commonly used folders to OneDrive|KFMBlockOptOut|KFMBlockOptOut|
|Prevent users from moving their commonly used folders to OneDrive|KFMBlockOptIn|KFMBlockOptIn|

For more information about syncing commonly used folders with OneDrive, see Redirect and move Windows known folders to OneDrive and Redirect and move macOS Desktop and Documents folders to OneDrive.

Do you want to limit which organizations users can sync with?

By default, users can sync shared libraries from other organizations. You can limit this to specific organizations or disable it altogether. Use the following policies to configure which organizations users can sync with.

|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Allow syncing OneDrive accounts for only specific organizations|AllowTenantList|AllowTenantList|
|Block syncing OneDrive accounts for specific organizations|BlockTenantList|BlockTenantList|
|Prevent users from syncing libraries and folders shared from other organizations|BlockExternalSync|BlockExternalSync|

For more information about syncing with other organizations, see B2B Sync.

Do you want to allow users to sync their personal OneDrive?

Depending on your governance practices, you can prevent users from syncing their personal OneDrive accounts to devices managed by your organization. Use the following policies to specify if users can sync personal OneDrive accounts.

|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Prevent users from syncing personal OneDrive accounts|DisablePersonalSync|DisablePersonalSync|

Do you want to block certain file types from being uploaded?

You can specify if you don't want users to be able to upload certain types of files using the sync app. Use the following policy to configure this.
|Policy|Windows GPO|Mac|
|:-----|:----------|:--|
|Exclude specific kinds of files from being uploaded|EnableODIgnoreListFromGPO|EnableODIgnore|

This can also be configured in the SharePoint admin center. For more information, see Block syncing of specific file types.

Do you need to sync files in a hybrid environment with SharePoint Server?

If your organization uses SharePoint Server 2019 or SharePoint Server Subscription Edition, you can sync files using the OneDrive sync app. For information, see Configure syncing with the new OneDrive sync app. If you are using the previous OneDrive sync app (Groove.exe), see Transition from the previous OneDrive for Business sync app for information on how to move to the new OneDrive sync app.

Do you want to limit sync to computers joined to a specific domain?

To make sure that users sync OneDrive files only on managed computers, you can configure OneDrive to sync only on PCs that are joined to specific domains. For more information, see Allow syncing only on computers joined to specific domains.

Next steps

[!div class="nextstepaction"]
Plan for content migration

Related topics

Plan for SharePoint and OneDrive in Microsoft 365
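Many of the Windows policies above are ADMX-backed and can also be set directly in the registry on a managed PC. The sketch below is illustrative only and rests on a few assumptions: the standard OneDrive policy registry path, a placeholder tenant ID you must replace with your own, and (for the commented tenant-level restriction at the end) the SharePoint Online Management Shell — verify value and cmdlet names against the policy documentation for your sync app version before using.

```powershell
# Sketch only: run in an elevated PowerShell session on a managed Windows PC.
# Policy value names come from the tables above; the tenant ID is a placeholder.
$odPolicy = "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive"
New-Item -Path $odPolicy -Force | Out-Null

# Silently move commonly used (known) folders to OneDrive for this tenant
Set-ItemProperty -Path $odPolicy -Name "KFMSilentOptIn" `
    -Value "00000000-0000-0000-0000-000000000000"   # replace with your tenant ID

# Prevent users from syncing personal OneDrive accounts on this device
Set-ItemProperty -Path $odPolicy -Name "DisablePersonalSync" -Value 1 -Type DWord

# Tenant-wide alternative for blocking file types (SharePoint Online Management Shell):
# Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
# Set-SPOTenantSyncClientRestriction -ExcludedFileExtensions "pst;mp3"
```

In production, prefer Group Policy or Intune administrative templates over hand-edited registry keys so the settings are enforced and reported centrally, as described earlier in this article.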
ms.date: 07112018 title: "Manage query client types" ms.reviewer: ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-online ms.collection: M365-collaboration ms.custom: admindeeplinkSPO ms.localizationpriority: medium search.appverid: - SPS150 - SPO160 - MET150 ms.assetid: 0d335bc4-e7a0-46bc-ba40-da34e414174f description: "Learn how query client types decide in which order queries are performed." Manage query client types [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Learn how query client types decide in which order queries are performed. A query client type is how a client performing a query tells the system what type of client it is. For example, a client might tell us it is UI, or an automated query. Query throttling monitors the use of resources and protects the search system. Administrators can use client-type information for throttling, to make sure lower-priority clients like automated queries don't squeeze out higher-priority clients like UI. Query client types are also used for things like logging, reports, and determining relevance. The client sets the client type as a label in the query. The administrator configures the valid client types (though some are default and mandatory), and the client chooses one for each query. [!NOTE] You can't turn query throttling on or off. Add a query client type [!NOTE] You can change the name of a client type that has been created for your tenant only. Go to More features in the SharePoint admin center, and sign in with an account that has admin permissions for your organization. [!NOTE] If you have Office 365 operated by 21Vianet (China), sign in to the Microsoft 365 admin center, then browse to the SharePoint admin center and open the More features page. Under Search, select Open. On the search administration page, select Manage Query Client Types. To add a client type, select New Client Type. 
On the Edit a client type page, in the Query Client Type field, for the client type, enter a name. From the Throttling Tier list, select either Top, Middle, or Bottom. [!NOTE] Lower priority queries are throttled first. The search system processes queries from top tier to bottom tier. Select OK. Prioritize a client query type You can use throttling tiers to prioritize query processing. When the resource limit is reached, query throttling kicks in, and the search system processes queries, starting from the top tier, right through to the bottom tier. Go to More features in the SharePoint admin center, and sign in with an account that has admin permissions for your organization. [!NOTE] If you have Office 365 operated by 21Vianet (China), sign in to the Microsoft 365 admin center, then browse to the SharePoint admin center and open the More features page. Under Search, select Open. On the search administration page, select Manage Query Client Types. Go to the Client Type section, and select the System Type that you want to change. From the Throttling Tier list, select either Top, Middle, or Bottom. [!NOTE] Lower priority queries are throttled first. The search system processes queries from top tier to bottom tier. Select OK.
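The tier behavior described above is enforced server-side by the search system, but as a purely conceptual illustration (not an actual API), the top-to-bottom processing order can be sketched like this:

```powershell
# Conceptual sketch only: shows the order in which throttling tiers are served.
# The real ordering is enforced inside the SharePoint search system, not by admins.
$tierRank = @{ "Top" = 0; "Middle" = 1; "Bottom" = 2 }

$queries = @(
    [pscustomobject]@{ ClientType = "Automated"; Tier = "Bottom" },
    [pscustomobject]@{ ClientType = "UI";        Tier = "Top" },
    [pscustomobject]@{ ClientType = "Custom";    Tier = "Middle" }
)

# Top-tier queries are processed first; bottom-tier queries are throttled first
$queries | Sort-Object { $tierRank[$_.Tier] } | ForEach-Object {
    "{0} ({1} tier)" -f $_.ClientType, $_.Tier
}
```

This is why assigning automated or low-priority clients to the Bottom tier protects interactive UI queries when resource limits are reached.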
ms.date: 04122024 title: "Curate the allowed list for Restricted SharePoint Search" ms.reviewer: ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: administrator ms.topic: how-to ms.service: sharepoint-online ms.localizationpriority: medium ms.collection: - Ent_O365_Hybrid - M365-collaboration - m365copilot - magic-ai-copilot description: "Learn how to use SharePoint Admin Center active sites report and SharePoint Advanced Management Data Access Governance report to find the most active and shared sites for the Restricted SharePoint Search allowed list. " Curate the allowed list for Restricted SharePoint Search [!IMPORTANT] Restricted SharePoint Search is designed for customers of Copilot for Microsoft 365. Visit here and the overview of Restricted SharePoint Search for more information. As a Global and SharePoint admin, you can set up an allowed list of Restricted SharePoint Search with a maximum of 100 selected SharePoint sites. For Copilot and organization-wide search, besides the contents that they already have access to, either by direct sharing, visiting, or owning, your organization’s users will only be able to reach the sites on the allowed list, honoring these sites’ current permissions. Setting up the allowed list in Restricted SharePoint Search gives you time to review and audit site permissions. But which sites should be included in the allowed list? This article introduces strategies and techniques for curating the allowed list. Steps to create the curated allowed list To create a curated allowed list for Restricted SharePoint Search, we recommend you start by creating an initial list of sites. Then you work with your site admins and stakeholders to assess permissions and review the sites. Finally, you can apply the list with PowerShell scripts as a Global admin. 
Step 1: Get an initial list of sites

Every organization might have different criteria for which sites should be searchable across the organization and discoverable by Copilot. You can use SharePoint Admin Center (SPAC) features to identify sites that can be part of the allowed list based on your own criteria. To keep the list manageable, we recommend starting with the following two types of sites for the allowed list:

The “Known” sites: You and your SharePoint site admins already know a set of sites from your organization that are safe to participate in organization-wide search and the Copilot experience. These sites can be included in the allowed list.

The top active and shared sites: The allowed list affects what users can see in their organization-wide search results and their Copilot experience. To optimize users’ search and Copilot experience, the hypothesis is that the top active and shared sites need to be included in the allowed list. Depending on your license, you can use either the SharePoint Admin Center (SPAC) or the SharePoint Admin Center Data access governance (SPAC DAG) Activity (sharing) report to identify the most active and shared sites.

Step 2: Review site permissions

Once you have the list of sites (up to 100), make sure the site permissions and content controls are implemented well enough to make the sites visible for search and the Copilot experience. You can work with your site admins and stakeholders to assess permissions and review the sites.

[!NOTE] The limit of up to 100 SharePoint sites includes Hub sites, but not their sub-sites. When you enable Hub sites, the sub-sites under a Hub site are included in the allowed list but do not count towards the 100-site limit. So if you are picking Hub sites, make sure all the child sites have proper permissions.
Step 3: Use PowerShell scripts to apply the allowed list

After you review permissions in your curated sites, you can use PowerShell scripts to turn on Restricted SharePoint Search and to add and remove sites. You can also use PowerShell scripts to get the list of all sites in your allowed list.

Find the most active and shared sites

You can find the most active sites using the SharePoint Admin Center (SPAC). If you have either Microsoft 365 E5 licensing or Microsoft Syntex - SharePoint Advanced Management, you can also use SPAC DAG to find the most shared sites.

Using SharePoint Admin Center (SPAC): If your organization has SharePoint, you have access to SPAC. You can use the SPAC Active sites feature to get the list of sites that had the most page views and files in the last 30 days. This can be an indicator for identifying the sites of interest to the broadest possible audience in the organization.

Using the SPAC DAG Activity (sharing) report: Identify the sites that have been shared most in the last 28 days.

Use SharePoint Admin Center (SPAC) to find the most active sites

Admins can use SharePoint Admin Center (SPAC) features to identify sites that can be part of the allowed list based on their criteria. The Active sites page in the SharePoint admin center lets you view the SharePoint sites in your organization. Based on your organization’s needs, you can sort and filter sites based on columns such as Last activity, Page views, and Page visits. You can search for sites, and customize the columns and views.

Step 1: In the SPAC left pane, select “Active sites.”

Step 2: Using the sorting and filtering functionality of the Active sites page, you can curate the top 100 sites based on your organization's needs and create a custom view.

Step 3: First, move the columns by scrolling to the right and selecting “Customize columns”. Use the up and down arrows next to each column's name to move Page views and Files next to the URL, so it's easy for you to see the important columns together for analysis.
Step 4: To create a custom view of the top 100 sites sorted by “page views” for the last 30 days, do the following:

- Select the arrow next to the column header of Page views and select Large to Small.
- Select the arrow next to the column header of Last activity and select Filter by last activity > Last 30 days.
- You can use the other columns to sort or filter based on your needs.

Step 5: Once you're done with your sorting and filtering, you can create a custom view based on your current settings and save it for future use. To create a custom view, select All sites > Save view as. Enter a name for your custom view. Your saved custom view is now available for you to choose from the drop-down menu next time.

Step 6: Export the sites and manage the list in a CSV file that you can use to add to the allowed list by selecting the Export tab. Your exported CSV file looks similar to the following CSV file:

Use the SPAC DAG Activity (sharing) report to find the most shared sites

The SPAC DAG report (Data access governance reports for SharePoint sites) is part of the SharePoint admin center. Admins with Microsoft 365 E5 licensing or Microsoft Syntex - SharePoint Advanced Management can access this report. This report helps you identify potential sites that are active and shared in the last 28 days.

Step 1: In the left pane, select Reports > Data access governance. Select the Sharing links report.

Step 2: In the right pane Sharing links page, select the Anyone links report.

Step 3: The "Anyone" links report gives you a list of sites in which the highest number of Anyone links were created. These links let anyone access files and folders without signing in. These sites might be great candidates to allow in organization-wide search.

Resources

Microsoft Copilot for Microsoft 365 - best practices with SharePoint
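The curation steps above can be tied together in PowerShell: read the exported Active sites CSV, take the top sites by page views, and apply them as the allowed list. This is a sketch under stated assumptions: it assumes the SharePoint Online Management Shell with the Restricted SharePoint Search cmdlets available, an exported file named ActiveSites.csv with "URL" and "Page views" columns, and a contoso admin URL — check the cmdlet names and your export's actual column headers against current documentation before running.

```powershell
# Sketch only: requires the SharePoint Online Management Shell and a
# Global/SharePoint admin account. File name, column names, and admin URL
# below are assumptions; adjust them to match your environment and export.
Import-Module Microsoft.Online.SharePoint.PowerShell
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Take the top 100 sites by page views from the exported Active sites report
$topSites = Import-Csv .\ActiveSites.csv |
    Sort-Object { [int]($_.'Page views' -replace ',') } -Descending |
    Select-Object -First 100

# Turn on Restricted SharePoint Search, then add the curated sites
Set-SPOTenantRestrictedSearchMode -Mode Enabled
Add-SPOTenantRestrictedSearchAllowedList -SitesList $topSites.URL

# Verify what is currently on the allowed list
Get-SPOTenantRestrictedSearchAllowedList
```

Remember to review permissions on every site in `$topSites` (Step 2 above) before applying the list; enabling the mode without review defeats the purpose of the audit window.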
OfficeDocs-SharePoint/SharePoint/SharePointOnline/restricted-sharepoint-search-allowed-list.md/0
Curate the allowed list for Restricted SharePoint Search
OfficeDocs-SharePoint/SharePoint/SharePointOnline/restricted-sharepoint-search-allowed-list.md
OfficeDocs-SharePoint
ms.date: 07112018 title: "Set up a new term set" ms.reviewer: shrganguly ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin f1.keywords: NOCSH ms.topic: article ms.service: sharepoint-online ms.collection: M365-collaboration ms.custom: admindeeplinkSPO ms.localizationpriority: medium search.appverid: - SPO160 - OSU150 - MET150 ms.assetid: 8255dbdf-1c0a-4987-88d8-8f4a0a980953 description: "Learn how to create a new term set for managed metadata in SharePoint" Set up a new term set To add a term set, you must be a contributor, a group manager, or a term store admin. To set up a new term set In the SharePoint admin center, under Content services, select Term store. In the tree-view navigation pane, expand the groups to select the group to which you want to add a term set. Select Add term set. Type a name for the term set and press ENTER. General tab On the General tab, for Owner, select Edit. The Edit Properties panel appears. Specify the following information about who owns and maintains this term set: Term set owner: If you want the owner of the term set to be someone other than you, enter the person, group, or email address of whoever will maintain this term set. Stakeholders: Add the names of users, groups, or email addresses that should be notified before major changes are made to the term set. Contact: If you want site users to be able to provide feedback on the term set, enter an email address. Select Save. Usage settings tab To configure the term submission policy On the Usage settings tab, for Submission policy, select Edit. The Edit submission policy panel appears. Specify whether you want the term set to be Closed or Open. If you select Closed, only people with contribute permissions can add terms to this term set. If you select Open, users can add terms from a tagging application. Select Save. To configure the tagging policy Under the Usage settings tab, for Available for tagging, select Edit.
The Available for tagging panel appears. Select the Enable check box to make the terms in the term set available for tagging. If you clear the Enable check box, this term set won't be visible to most users. If the term set is still in development, or is not otherwise ready for use, you might want to clear the Enable check box. Select Save. Navigation tab Enabling site navigation means you can use the terms in this term set for site navigation links with friendly URLs and dynamic content. Enabling faceted navigation means users can use refiners based on managed metadata from the search index to quickly browse to specific content. Under the Navigation tab, for Use term set for site navigation, select Edit. The Edit Properties panel appears. Select the Enable check boxes to use this term set for site or faceted navigation. Select Save. Enabling the term set for either site or faceted navigation enables options to set a custom target page and a custom catalog item page. You can choose a custom target page if you want to display a specific page. Custom target pages that you set for individual terms will override this setting. To set a custom target page For Custom target page, select Edit. The Edit term set target page panel appears. Move the toggle switch to enable Use a custom target page. Click Select, and then select Save. The target page appears when users navigate to a friendly URL in this term set. If terms in this term set are used as catalog categories, you can select the page used to render catalog data for items under those categories. To set a custom catalog item page For Custom catalog item page, select Edit. The Edit term set catalog item page panel appears. Move the toggle switch to enable Use a custom catalog item page. Click Select, and then select Save. Advanced tab You can use machine translation to translate your terms, or you can export and import XLIFF files. You must repeat the translation each time you update the term set.
To configure translations Under the Advanced tab, for Translation, select Manage. The Translation panel appears. To use machine translation to translate this term set into the working languages for the term store, select Start. The Machine translation panel appears. For the terms you want to translate, select either All terms or Only the terms updated since the last translation. From both the Translate from and Translate to dropdowns, select a language. Select Translate. You can use custom properties to store additional data about a term set. To edit custom properties For Custom properties, select Edit. The Edit Custom properties panel appears. Enter a Property name and Value, and then select Add. Select Save. To learn how to add a term to the new term set, see Create and manage terms in a term set.
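If you prefer scripting over the admin center steps above, the community-maintained PnP.PowerShell module exposes term store cmdlets. This is a hedged sketch, not part of the official procedure — the admin URL, group name, term set name, and term name are illustrative, and parameter names can vary by PnP.PowerShell version:

```powershell
# Connect to the tenant admin site (URL is a placeholder).
Connect-PnPOnline -Url "https://contoso-admin.sharepoint.com" -Interactive

# Create a term set inside an existing term group.
New-PnPTermSet -TermGroup "Departments" -Name "Job Titles" `
    -Description "Managed metadata for job titles"

# Add a first term to the new term set.
New-PnPTerm -TermGroup "Departments" -TermSet "Job Titles" -Name "Engineer"
```

The account you connect with still needs the contributor, group manager, or term store admin role described above.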
OfficeDocs-SharePoint/SharePoint/SharePointOnline/set-up-new-term-set.md/0
Set up a new term set
OfficeDocs-SharePoint/SharePoint/SharePointOnline/set-up-new-term-set.md
OfficeDocs-SharePoint
ms.date: 04232024 title: "Admin center site permissions reference" ms.reviewer: srice ms.author: ruihu author: maggierui manager: jtremper recommendations: true audience: Admin ROBOTS: NOINDEX f1.keywords: - CSH ms.topic: article ms.service: sharepoint-online ms.localizationpriority: medium ms.custom: admindeeplinkSPO layout: ContentPage ms.collection: - Strat_OD_share - M365-collaboration search.appverid: - SPO160 - MET150 description: "Learn about site permissions that you can configure in the SharePoint admin center." Admin center site permissions reference On the Membership tab, you can manage permissions for the site and also for any associated Microsoft 365 group or Microsoft Teams team. These roles are specific to the selected site or group and don't give users access to the SharePoint admin center. Owners Microsoft 365 group owners can manage group membership, privacy, and classification, as well as the associated SharePoint site. If the Microsoft 365 group is associated with a team, then the group owners are also team owners. Members Microsoft 365 group members can participate in the group and have access to the associated SharePoint site. If the Microsoft 365 group is associated with a team, then the group members are also team members and can send messages and participate in channels if allowed by the team owner. Site admins Site admins (formerly called site collection administrators) have the highest level of SharePoint permissions. They have the same Full Control permissions as a site owner, plus they can do more things, such as managing search, the recycle bin, and site collection features. They also have access to any items in the site, including in subsites, even if permissions inheritance has been broken. If there's a Microsoft 365 group or team connected to the site, then group or team owners are automatically included as site admins and group or team members are automatically included as site members.
Managing site permissions through group or team membership is recommended over giving people permissions directly to the site. This method allows for easier administration and consistent access for users across group and team resources. Non-primary admins Additional admins beyond the primary admin are site admins only and can only manage the SharePoint site. They have no access to the associated Microsoft 365 group or team unless they have also been directly added to the group or team. Site owners Site owners have full control of the SharePoint site. If the site has an associated Microsoft 365 group or team, then group or team owners are automatically included as site owners. However, people added directly to the site owners group don't have access to the Microsoft 365 group or team unless they are added there directly. Site members Site members have edit permissions to the SharePoint site and can add and remove files, lists, and libraries. If the site has an associated Microsoft 365 group or team, then group or team members are automatically included as site members. However, people added directly to the site members group don't have access to the Microsoft 365 group or team unless they are added there directly. Site visitors Site visitors have view-only permissions to the SharePoint site. This permission level is only used by SharePoint and isn't related to permissions in an associated Microsoft 365 group or team. [!NOTE] For information on how to manage the Site owners, Site members, and Site visitors permission groups, see Sharing and permissions in the SharePoint modern experience. Additional permissions There are additional permission levels in SharePoint beyond those shown on this panel. Users may have access to the site or its contents through those roles. Users may also have access to files or folders in the site through sharing links. See also External sharing overview Overview of Microsoft 365 Groups for administrators
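Admins sometimes need to grant these roles from a script rather than the admin center panel. A hedged sketch using the SharePoint Online Management Shell (the admin URL, site URL, and account are placeholders) that adds a non-primary site admin:

```powershell
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Promote a user to site admin on one site. This grants site-level
# rights only; it does not add the user to any associated Microsoft 365
# group or team, matching the behavior described above.
Set-SPOUser -Site "https://contoso.sharepoint.com/sites/HR" `
    -LoginName "gherrera@contoso.com" -IsSiteCollectionAdmin $true
```

Pass `$false` to the same parameter to demote the account again.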
OfficeDocs-SharePoint/SharePoint/SharePointOnline/site-permissions.md/0
Admin center site permissions reference
OfficeDocs-SharePoint/SharePoint/SharePointOnline/site-permissions.md
OfficeDocs-SharePoint
ms.date: 11142023 title: "Transition from the previous OneDrive for Business sync app" ms.reviewer: ms.author: mactra author: MachelleTranMSFT manager: jtremper audience: Admin f1.keywords: - NOCSH ms.topic: article ms.service: one-drive ms.localizationpriority: medium ms.collection: - Strat_OD_admin - M365-collaboration ms.custom: - seo-marvel-apr2020 - onedrive-toc search.appverid: - ODB160 - ODB150 - GOB150 - GOB160 - MET150 ms.assetid: 4100df3a-0c96-464f-b0a8-c20de34da6fa description: "Learn how to upgrade users from the previous OneDrive for Business sync app to the new OneDrive sync app (OneDrive.exe)." Transition from the previous OneDrive for Business sync app [!IMPORTANT] Support for the previous OneDrive sync app (Groove.exe) ended on January 11, 2021. As of February 1, 2021, users can no longer sync OneDrive or SharePoint files in Microsoft 365 by using Groove.exe. Groove.exe will continue to work only for files in SharePoint Server. This article is for global and SharePoint admins who want to transition their users off of the previous OneDrive sync app (Groove.exe) so that they sync with only the new OneDrive sync app (OneDrive.exe). If you're not an IT admin, to learn how to begin syncing files using the new OneDrive sync app, see Sync files with the new OneDrive sync app in Windows. [!NOTE] If your organization never used the previous OneDrive sync app, or had fewer than 250 licensed Office 365 users in June 2016, your users are already using the new OneDrive sync app to sync files in OneDrive and SharePoint. Syncing files with OneDrive sync app to OneDrive sync app When users who are syncing files with the previous OneDrive sync app (Groove.exe) sign in to the new OneDrive sync app (OneDrive.exe), the following things happen: If the new OneDrive sync app can take over syncing a library, the previous sync app stops syncing it, and the new OneDrive sync app takes over syncing it without redownloading the content.
If the new OneDrive sync app can't sync the library, the previous sync app continues to sync it. If a library requires checkout or has required columns or metadata, it's synced read-only. The previous sync app stops running and removes itself from automatic startup, unless it's still syncing libraries that the new OneDrive sync app can't sync. When SharePoint libraries begin syncing with the new OneDrive sync app, the folder hierarchy that appears in File Explorer may be simplified. Limits The following library types aren't yet supported by the new OneDrive sync app, and won't transition from the previous sync app: On-premises locations in SharePoint Server 2016 or earlier. Learn about using the OneDrive sync app with SharePoint Server 2019 SharePoint libraries shared by people from other organizations, which your users are syncing with the previous sync app. For more info about sync restrictions and limitations, see Invalid file names and file types in OneDrive and SharePoint Requirements To transition users off of the previous sync app, first make sure users have: Windows 10, Windows 8.1, Windows 8, or Windows 7. A current version of the new OneDrive sync app installed. For info about deploying the new OneDrive sync app, see Deploy OneDrive apps using Microsoft Endpoint Configuration Manager. OneDrive.exe must be deployed and configured before you try the takeover command. Download the latest version of the new OneDrive sync app that's fully released to production. To learn about the versions that are rolling out to different rings, see New OneDrive sync app release notes. One of the following versions of Office or higher installed. For info about deploying Office, see Choose how to deploy Microsoft 365 Apps for enterprise. Make sure you don't install the previous OneDrive sync app. For info, see Changes to OneDrive sync app deployment in Office Click-to-Run.
| Office version | Minimum version |
|:-------|:-------|
| Microsoft 365 Apps for enterprise | 16.0.7167.2 |
| Office 2016 MSI | 16.0.4432.1 |
| Office 2013 (MSI/C2R) | 15.0.4859.1 |

[!NOTE] If any users have Office 2010 installed, we strongly recommend removing the SharePoint Workspace component. If users previously set up SharePoint Workspace (even if they're no longer using it), it will cause problems syncing team sites. Before starting OneDrive Setup, either Uninstall Office from a PC or modify the installation. To do this by running Setup, first create the following XML file: xml Then run Setup:

```console
Setup.exe /modify ProPlus /config RemoveSharepointDesigner.xml
```

For more info, see Setup command-line options for Office 2010 and Config.xml file in Office 2010. The latest Rights Management Service (RMS) client if you want users to be able to sync IRM-protected SharePoint document libraries and OneDrive locations. Configure takeover When the required software is installed on your users' computers, you can configure automatic takeover of syncing silently (review the prerequisites and steps), and then use this policy. After you install and configure OneDrive.exe, Groove.exe should no longer be able to sync. If the takeover didn't succeed, or your users are stuck in a hybrid state (some content syncing with OneDrive.exe and some with Groove.exe), try running: `%localappdata%\Microsoft\OneDrive\OneDrive.exe /takeover`. [!TIP] Make sure to run the command in a user context, rather than as admin, or the error "OneDrive.exe cannot be run with Admin privileges" appears. To affect all users on the computer, configure the command to run on every user account so it will run for any user who signs in. If the takeover didn't succeed, the previous OneDrive sync app (Groove.exe) may be an older version that can't successfully transition to the new client. To patch the previous sync app, update groove-x in Office 2016 or Office 2013, and then try again.
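To avoid running the takeover unnecessarily, a wrapper script can run it only when the previous client is still active. This is a hedged sketch (the process name and install path follow the defaults mentioned above; adjust for your deployment), and it must run in the signed-in user's context, not elevated:

```powershell
# Run the takeover only if the previous Groove.exe client is still running.
$groove = Get-Process -Name "GROOVE" -ErrorAction SilentlyContinue
$oneDrive = "$env:LOCALAPPDATA\Microsoft\OneDrive\OneDrive.exe"

if ($groove -and (Test-Path $oneDrive)) {
    # /takeover moves synced libraries to OneDrive.exe without redownloading.
    Start-Process -FilePath $oneDrive -ArgumentList "/takeover"
}
```

Deploying this as a per-user logon script covers every account on a shared computer, per the tip above.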
See also To help your users get started with the OneDrive sync app, you can refer them to the following articles: Sync files with the new OneDrive sync app in Windows Get started with the new OneDrive sync app for Mac Sync SharePoint files with the new OneDrive sync app
OfficeDocs-SharePoint/SharePoint/SharePointOnline/transition-from-previous-sync-client.md/0
Transition from the previous OneDrive for Business sync app
OfficeDocs-SharePoint/SharePoint/SharePointOnline/transition-from-previous-sync-client.md
OfficeDocs-SharePoint
title: "About user profile synchronization" ms.reviewer: amysim ms.author: ruihu author: maggierui manager: jtremper recommendations: true ms.date: 5212020 audience: Admin f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-online ms.collection: M365-collaboration ms.localizationpriority: medium search.appverid: - SPO160 - MET150 ms.assetid: description: "This article describes the user profile sync process for SharePoint in Microsoft 365, and the properties that are synced into user profiles." User profile synchronization Microsoft SharePoint uses the Active Directory synchronization job to import user and group attribute information into the User Profile Application (UPA). When a new user is added to Microsoft Entra ID, the user account information is sent to the SharePoint directory store and the UPA sync process creates a profile in the User Profile Application based on a predetermined set of attributes. Once the profile has been created, any modifications to these attributes will be synced as part of the regularly scheduled sync process. [!NOTE] The profile properties that are synced by the UPA sync process are not configurable. Synchronization times will vary based on workloads. Sync process There are four steps in the sync process.

| Step | Description |
|:-------|:-------|
| 1. Active Directory to Microsoft Entra ID | Microsoft Entra Connect syncs data from on-premises Active Directory to Microsoft Entra ID. For more info, see What is hybrid identity with Microsoft Entra ID? and Attributes synchronized. |
| 2. Microsoft Entra ID to SharePoint | Microsoft Entra ID syncs data from Microsoft Entra ID to the SharePoint directory store. |
| 3. SharePoint to UPA | The UPA sync process syncs user account information in the SharePoint directory store to the User Profile Application (UPA). |
| 4. UPA to sites | User account information from the UPA is synced to SharePoint sites (previously called "site collections"). |
Typically, user profiles are created automatically for all accounts that are created in Microsoft 365. For organizations that have a Microsoft 365 Education subscription, user profiles are not created for new accounts by default. The user must access SharePoint once, at which time a basic stub profile will be created for the user account. The stub profile will be updated with all remaining data as part of the sync process. If block sign-in is set on the user account in Microsoft Entra ID, or disabled accounts are synced from Active Directory on-premises, those user accounts will not be processed as part of the UPA sync process. The user must be enabled and licensed for changes to be processed. Properties that are synced into SharePoint user profiles The following Microsoft Entra user attributes are synced to the UPA.

| Microsoft Entra attribute | User profile property display names | Notes | Sync to sites |
|:-------|:-------|:-------|:-------|
| UserPrincipalName | Account Name, User Name, User Principal Name | Example: i:0#.f\|membership\|gherrera@contoso.com, gherrera@contoso.com | Yes |
| DisplayName | Name | | Yes |
| GivenName | FirstName | | Yes |
| sn | LastName | | Yes |
| telephoneNumber | Work phone | Example: (123) 456-7890 | Yes |
| proxyAddresses | Work Email, SIP Address | Work Email is set to the value prefixed with SMTP (SMTP:gherrera@contoso.com). Example: gherrera@contoso.com | Yes |
| PhysicalDeliveryOfficeName | Office | | Yes |
| Title | Title, Job Title | Job Title contains the same value as Title and is connected to a term set. | Yes |
| Department | Department | Department is connected to a term set. | Yes |
| WWWHomePage | Public site redirect | | No |
| PreferredLanguage | Language Preferences | Used by SharePoint to determine the language for the user when the multilingual user interface (MUI) feature is enabled. | Yes |
| msExchHideFromAddressList | SPS-HideFromAddressLists | | No |
| Manager | Manager | User's manager for the organization hierarchy | Yes |

[!NOTE] To update additional or custom properties, see Bulk update custom user profile properties.
Some property names could differ between Azure AD Graph and Microsoft Graph; see Property differences between Azure AD Graph and Microsoft Graph. Frequently asked questions (FAQs) How often are changes synced into the User Profile Application? User account attribute changes are collected in batches and processed for UPA synchronization. Times will vary based on the number of changes requested in a single batch. The UPA synchronization is scheduled to run at regular intervals. Will UPA synchronization overwrite existing properties in SharePoint user profiles? For the default properties that are synced by UPA synchronization, values will be overwritten to align with Microsoft Entra ID. Does UPA synchronization update only properties that have changed? UPA synchronization is driven primarily by changes that are made in Microsoft Entra ID, including adding new users. A full import can occur under certain maintenance events. Why isn't it possible to map additional properties for UPA synchronization to sync from Microsoft Entra ID to the User Profile Application? UPA synchronization is limited to a preconfigured set of properties to guarantee consistent performance across the service.
OfficeDocs-SharePoint/SharePoint/SharePointOnline/user-profile-sync.md/0
User profile synchronization
OfficeDocs-SharePoint/SharePoint/SharePointOnline/user-profile-sync.md
OfficeDocs-SharePoint
title: "Configuration failure during removal" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 312018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ROBOTS: NOINDEX ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 4451ffa5-3119-4402-9c67-168e58c5154d description: "Summary: Learn how to remove SharePoint Server during a configuration failure." Configuration failure during removal Summary: Learn how to remove SharePoint Server during a configuration failure. When you choose Uninstall from Uninstall or change a program, the Setup Wizard starts and attempts to uninstall the product. If an error is encountered, the uninstall process won't complete and the error will be noted in the setup log file. [!NOTE] The setup log file is stored in the temp directory for the user account that is running setup (%USERTEMP% or %WINDIR%\Users\user account\AppData\Local\Temp) and is named "SharePoint Server Setup (YYYYMMDDHHMMSSrandomnumber).log", where YYYYMMDD is the date, HHMMSS is the time (hours in 24-hour clock format, minutes, seconds, and milliseconds), and the random number is used to differentiate between possible simultaneous attempts to run the setup program. You can review the log file for error messages. After you understand why the error occurred, you can either stop the uninstall process, address the problem, and then run Uninstall again, or you can continue the uninstall process. If you exit Setup when an error is encountered, the binary files won't be removed. However, tasks that were successfully completed won't be undone. This approach will enable you to restore the server to working condition by running the configuration wizard in Repair mode. If you choose to continue with the uninstall process, the binary files will be removed.
The resulting state of the computer will depend on when the configuration wizard failed. For example, the computer might still: Be joined to the server farm. Be registered in the configuration database and the connection string on the local computer could exist. Include services that are running. With the binary files removed, you won't be able to use the configuration wizard to clean up the configuration settings on the local computer or in the configuration database.
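Locating the newest setup log described above can be scripted. A minimal sketch (the filename pattern follows the naming convention described in the note; the `-Tail` count is arbitrary) that prints the end of the most recent log, where failures are usually recorded:

```powershell
# Find the newest SharePoint setup log in the current user's temp folder
# and show its last 50 lines, where the failing task is typically logged.
Get-ChildItem -Path $env:TEMP -Filter "SharePoint*Setup*.log" |
    Sort-Object LastWriteTime |
    Select-Object -Last 1 |
    ForEach-Object { Get-Content -Path $_.FullName -Tail 50 }
```

Run this as the same user account that ran Setup, since the log lives in that account's temp directory.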
OfficeDocs-SharePoint/SharePoint/SharePointServer/PSConfigHelp/configuration-failure-during-removal.md/0
Configuration failure during removal
OfficeDocs-SharePoint/SharePoint/SharePointServer/PSConfigHelp/configuration-failure-during-removal.md
OfficeDocs-SharePoint
title: "Specify farm security settings" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 312018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ROBOTS: NOINDEX ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 01d6f28f-675c-4418-a9ad-fcc8bbfdc58c description: "Summary: Learn how to use a passphrase in SharePoint Server." Specify farm security settings Summary: Learn how to use a passphrase in SharePoint Server. Type a passphrase to help secure farm configuration data, and then select Next. Although a passphrase is similar to a password, it's usually longer to enhance security. It's used to encrypt credentials of accounts that are registered in SharePoint products, for example, the system account that you provide when you run the SharePoint 2016 Products Configuration Wizard. Ensure that you remember the passphrase, because you must use it each time you add a server to the farm. Ensure that the passphrase meets the following criteria: Has at least eight characters Contains at least three of the following four character groups: English uppercase characters (A through Z) English lowercase characters (a through z) Numerals (0 through 9) Nonalphabetic characters (such as !, $, #, %) You can change the passphrase after the farm has been configured by running the Set-SPPassphrase cmdlet in Microsoft PowerShell. By default, the new passphrase will be deployed across all servers in the farm. However, if there's a failure in the deployment of the new passphrase, you must manually update the passphrase on the individual server on which the deployment failed. Run the Set-SPPassphrase -LocalServerOnly cmdlet in Microsoft PowerShell to manually update the passphrase. [!NOTE] When you perform an upgrade, the Specify Farm Security Settings page does not appear.
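Changing the passphrase with the cmdlet mentioned above looks roughly like this. A sketch only — run it from the SharePoint Management Shell as a farm administrator, and treat the passphrase value as a placeholder that meets the criteria listed above:

```powershell
# Set-SPPassphrase requires a SecureString, not plain text.
$passphrase = ConvertTo-SecureString -String "Pass@Phrase1234" -AsPlainText -Force

# By default, the new passphrase is deployed to every server in the farm.
Set-SPPassphrase -PassPhrase $passphrase -Confirm

# If deployment failed on one server, update that server locally instead:
# Set-SPPassphrase -PassPhrase $passphrase -LocalServerOnly -Confirm
```

Avoid hard-coding a real passphrase in saved scripts; prompt for it with Read-Host -AsSecureString instead.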
OfficeDocs-SharePoint/SharePoint/SharePointServer/PSConfigHelp/specify-farm-security-settings.md/0
Specify farm security settings
OfficeDocs-SharePoint/SharePoint/SharePointServer/PSConfigHelp/specify-farm-security-settings.md
OfficeDocs-SharePoint
title: "Assign a category page and a catalog item page to a term in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 7142017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: IT_Sharepoint_Server_Top ms.assetid: a2c1b8a0-68a2-4399-931f-cf58cfc3875d description: "Learn how to assign a category page and a catalog item page to a term in term store management." Assign a category page and a catalog item page to a term in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Category pages and catalog item pages are page layouts that you can use to show structured content consistently across a site. They are often used when displaying catalog content on a site that uses managed navigation. This saves you from having to create many individual pages for content that you want to show in the same manner across your site. You can assign a category page or a catalog item page to all terms in a term set, or to specific terms in a term set. For more information, see "Catalog pages and catalog item pages" in Overview of cross-site publishing in SharePoint Server. Before you begin [!NOTE] Because SharePoint Server runs as websites in Internet Information Services (IIS), administrators and users depend on the accessibility features that browsers provide. SharePoint Server supports the accessibility features of supported browsers. For more information, see the following resources: > Plan browser support> Accessibility guidelines in SharePoint> Accessibility in SharePoint> Keyboard shortcuts> Touch. Before you assign a category page and a catalog item page in term store management, verify the following: The publishing site is using managed navigation. By default, site collections that are created by using the Publishing Portal Site Collection template use managed navigation. 
You have created a navigation term set in term store management as described in Create and manage terms in a term set. You have created a category page and a catalog item page. [!IMPORTANT] If you have connected a publishing site to a catalog and selected to integrate the catalog into the publishing site as described in Connect a publishing site to a catalog in SharePoint Server, the category page and catalog item page configurations that you specified are displayed in term store management. Assign a category page and a catalog item page to a term By default, when you assign a category page to a term, the page that you specify will also be assigned to the children of the term unless you specify a different page to be used on all the children of a term. [!NOTE] You should only assign catalog item pages to a term if the term set is used as a tagging term set for catalog content. To assign a category page and a catalog item page to a term Verify that the user account that performs this procedure is a member of the Owners SharePoint group on the site. On the site, on the Settings menu, click Site Settings. On the Site Settings page, in the Site Administration section, click Term store management. On the Term Store Management Tool page, in the TAXONOMY TERM STORE section, click the term to which you want to assign a category page and a catalog item page. Click the TERM-DRIVEN PAGES tab. To assign a category page to a term, in the Target Page Settings section, select the check box Change target page for this term, and then type the URL of the category page that you want to assign to the term. Or you can click the Browse button, and then go to the category page that you want to assign to the term. To assign a category page to the children of a term, select the check box Change target page for the children of this term, and then type the URL of the category page that you want to assign to the children of the term.
Or, you can click the Browse button, and then go to the category page that you want to assign to the children of the term. To assign a catalog item page for catalog items that are tagged with the current term, select Change Catalog Item Page for this category, and then type the URL of the catalog item page that you want to assign to catalog items that are tagged with the term. Or you can click the Browse button, and then go to the catalog item page that you want to assign to catalog items that are tagged with the term. To assign a catalog item page for catalog items that are tagged with a child of the current term, select Change Catalog Item Page, and then type the URL of the catalog item page that you want to assign to catalog items that are tagged with children of the term. Or you can click the Browse button, and then go to the catalog item page that you want to assign to catalog items tagged with children of the term. See also Other Resources Blog post: Assign a category page and a catalog item page to a term
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/assign-a-category-page-and-a-catalog-item-page-to-a-term.md/0
Assign a category page and a catalog item page to a term in SharePoint Server
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/assign-a-category-page-and-a-catalog-item-page-to-a-term.md
OfficeDocs-SharePoint
ms.date: 03132018 title: "Back up site collections in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 45acdd33-b322-4f36-97f1-0701159e15f0 description: "Learn how to back up a single site collection in SharePoint Server." Back up site collections in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] You can back up a site collection in SharePoint Server by using the SharePoint Central Administration website or Microsoft PowerShell. Before you begin We recommend that you regularly back up the complete farm. However, IT practices might require you to also back up a site collection. For more information about what to back up, see Plan for backup and recovery in SharePoint Server. Before you begin this operation, review the following information: You must first create a folder on the local computer or the network in which to store the backups. For better performance, we recommend that you back up to the local computer and then move the backup files to a network folder. For more information about how to create a backup folder, see Prepare to back up and restore farms in SharePoint Server. If the site collection's Lock status is set to Not locked or Adding content prevented, SharePoint Server temporarily sets the site to Read-Only while the backup operation is occurring. SharePoint Server does this to reduce the possibility of users changing the site collection while it is being backed up. After the backup is complete, the setting is changed back to its normal status. Performing a site collection backup might require resources and might slightly affect farm performance when the backup is running. You can help avoid performance issues by backing up the farm during hours when farm use is lowest, such as outside office hours.
## Use PowerShell to back up a site collection in SharePoint Server

You can use PowerShell to back up a site collection manually or as part of a script that can be run at scheduled intervals.

### To back up a site collection by using PowerShell

1. Verify that you have the following memberships:

   - securityadmin fixed server role on the SQL Server instance.
   - db_owner fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

2. Start the SharePoint Management Shell.

3. At the PowerShell command prompt, type the following command:

   ```powershell
   Backup-SPSite -Identity <SiteCollectionGUIDorURL> -Path <BackupFile> [-Force] [-NoSiteLock] [-UseSqlSnapshot] [-Verbose]
   ```

   Where:

   - `<SiteCollectionGUIDorURL>` is the ID or URL for the site collection you want to back up.
   - `<BackupFile>` is the path to where the backup file is located.

   If you want to overwrite a previously used backup file, use the Force parameter. You can use the NoSiteLock parameter to keep the read-only lock from being set on the site collection while it is being backed up. However, using this parameter can enable users to change the site collection while it is being backed up, which could lead to data corruption during backup.

   To display the site collection GUID or URL at the PowerShell command prompt, type the following command:

   ```powershell
   Get-SPSite | Format-List -Property Id, Url
   ```

   If the database server is running an Enterprise Edition of SQL Server, we recommend that you also use the `UseSqlSnapshot` parameter for more consistent backups. You can also export sites or lists from these snapshots.
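To run Backup-SPSite at scheduled intervals, the manual command above can be wrapped in a short script. The following is a sketch only: the web application URL and backup folder are assumptions for illustration, and the script must run in the SharePoint Management Shell on a farm server.

```powershell
# Sketch only: back up every site collection in one web application to
# date-stamped files. The URL and folder below are assumptions; replace
# them with values from your own farm.
$webAppUrl  = "http://sharepoint.contoso.com"
$backupPath = "E:\Backups\SiteCollections"
$stamp      = Get-Date -Format "yyyyMMdd"

foreach ($site in Get-SPSite -WebApplication $webAppUrl -Limit All) {
    # Use the site collection GUID in the file name to keep names unique.
    $file = Join-Path $backupPath ("{0}_{1}.bak" -f $site.Id, $stamp)
    Backup-SPSite -Identity $site.Url -Path $file -Force
}
```

A script like this can then be registered with Windows Task Scheduler so the backups run outside office hours, as recommended earlier.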
> [!NOTE]
> If the RBS provider that you are using does not support snapshots, you cannot use snapshots for content deployment or backup. For example, the SQL FILESTREAM provider does not support snapshots.

For more information about how to use SQL snapshots, see [Back up databases to snapshots in SharePoint Server](back-up-databases-to-snapshots.md).

For more information, see Backup-SPSite.

> [!NOTE]
> We recommend that you use Microsoft PowerShell when performing command-line administrative tasks. The Stsadm command-line tool has been deprecated, but is included to support compatibility with previous product versions.

## Use Central Administration to back up a site collection in SharePoint Server

You can use Central Administration to back up a site collection.

### To back up a site collection by using Central Administration

1. Verify that the user account performing this procedure is a member of the Farm Administrators group. Additionally, verify that the Windows SharePoint Services Timer V4 service has Full Control permissions on the backup folder.
2. Start Central Administration.
3. In Central Administration, on the home page, in the Backup and Restore section, click Perform a site collection backup.
4. On the Site collection backup page, select the site collection from the Site Collection list.
5. Type the local path of the backup file in the Filename box.

   > [!NOTE]
   > If you want to reuse a file, select the Overwrite existing file check box.

6. Click Start Backup.
7. You can view the general status of all backup jobs at the top of the Granular Backup Job Status page in the Readiness section. You can view the status for the current backup job in the lower part of the page in the Site Collection Backup section. The status page updates automatically every 30 seconds. You can manually update the status details by clicking Refresh. Backup and recovery are Timer service jobs. Therefore, it may take several seconds for the backup to start.
If you receive any errors, you can review them in the Failure Message column of the Granular Backup Job Status page.

## See also

**Concepts**

Plan for backup and recovery in SharePoint Server

Restore site collections in SharePoint Server
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/back-up-site-collections.md/0
Back up site collections in SharePoint Server
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/back-up-site-collections.md
OfficeDocs-SharePoint
1,367
23
---
title: "Change the Content Search Web Part display template and use Windows PowerShell to start Usage analytics in SharePoint Server"
ms.reviewer: 
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 3/8/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: df979ec9-bdf7-4d96-b3a6-37213c45e5da
description: "Learn how to change the Content Search Web Part display template and use Microsoft PowerShell to start Usage analytics in SharePoint Server."
---

# Change the Content Search Web Part display template and use Windows PowerShell to start Usage analytics in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

> [!NOTE]
> The examples in this series are based on an on-premises SharePoint Server deployment.

## Change the mapping of the UsageAnalyticsID managed property

In our Contoso website, we want to recommend one product per product group, that is, we want Usage analytics to ignore the product color. This means that our recommendations must be calculated on Group Number. We can do this because Group Number is part of the friendly URL (FURL) on our item detail page (see Stage 10: Configure the query in a Content Search Web Part on a catalog item page in SharePoint Server).

In the previous blog post, we explained that the managed property that's used to specify how recommendations between individual catalog items should be calculated is UsageAnalyticsID (see About the UsageAnalyticsID managed property). For Usage analytics to do its calculation on Group Number, we must change the mapping of the UsageAnalyticsID property. Here's how you do that:

> [!IMPORTANT]
> You have to change the property mapping on the authoring site.

1. On your authoring site, go to Site settings --> Search Schema.
2. On the Managed Properties page, in the Managed property field, type UsageAnalyticsID, and then select the arrow button.
3. From the Property Name field, select Edit/Map Property.
4. On the Edit Managed Property page, select Add a Mapping. Notice that by default, this property is mapped to the crawled property ows_ProductCatalogItemNumber.
5. In the Crawled property selection dialog, use the Search for crawled property name field to search for the crawled property that you want to map to this managed property. In our Contoso scenario, we want to map the site column called Group Number. Crawled properties don't contain spaces. Therefore, exclude the space, enter GroupNumber, and select Find. Two crawled properties are found.
6. Select the crawled property with the ows_ prefix, and select OK.

   If you are confused because two crawled properties that look about the same are found, you're not alone. This is somewhat tricky. The article From site column to managed property - What's up with that? explains the naming convention for crawled and managed properties. Here's the abbreviated version: when mapping a crawled property to the UsageAnalyticsID managed property, you should select the crawled property with the ows_ prefix!

7. On the Edit Managed Property page, select the ows_ProductCatalogItemNumber crawled property, and then select Remove Mapping.
8. Select OK to save the new mapping.

> [!IMPORTANT]
> Map only one crawled property to the UsageAnalyticsID managed property. If you map more than one crawled property, the Usage analytics calculation won't work correctly.

After you have changed the mapping of the UsageAnalyticsID managed property, perform a full crawl of your catalog, as explained in Stage 4: Set up search and enable the crawling of your catalog content in SharePoint Server.
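The remapping steps above can also be scripted with the search schema cmdlets instead of the Search Schema UI. This is a sketch, not the article's procedure: it assumes the crawled property for the Group Number site column is named ows_GroupNumber, and it must run in the SharePoint Management Shell on a farm server.

```powershell
# Sketch only: remap UsageAnalyticsID to ows_GroupNumber.
# The crawled property name is an assumption; verify it in your search schema.
$ssa = Get-SPEnterpriseSearchServiceApplication
$mp  = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Identity "UsageAnalyticsID"
$cp  = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "ows_GroupNumber"

# Remove the existing mapping (ows_ProductCatalogItemNumber by default),
# then map the new crawled property. Only one mapping may remain.
Get-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp |
    Remove-SPEnterpriseSearchMetadataMapping -Confirm:$false
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp
```

As with the UI procedure, follow a scripted change with a full crawl of the catalog so the new mapping takes effect.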
## Change a Content Search Web Part display template so the usage events are logged correctly

On our Contoso site, we use a Content Search Web Part (CSWP) to display items on the catalog item page, as explained in Stage 10: Configure the query in a Content Search Web Part on a catalog item page in SharePoint Server. By default, the CSWP does not log usage events. To enable our CSWP to log usage events, we must change the display template the CSWP is using. Here's what you have to do:

1. In your mapped network drive (see How to map your network drive), open the display template that you have applied to your CSWP.
2. In the ManagedPropertyMapping element, add the following two properties:

   ```html
   'Original Path'{Original Path}:'OriginalPath',
   'SiteID'{SiteID}:'SiteID',
   ```

3. Add the following JavaScript just above the HTML part of your display template:

   ```javascript
   // Log a Views usage event on the URL of the catalog item
   window.LogViewsToEventStore = function(url, site) {
       SP.SOD.executeFunc("sp.js", "SP.ClientContext", function() {
           var spClientContext = SP.ClientContext.get_current();
           if (!$isNull(spClientContext)) {
               var spWeb = spClientContext.get_web();
               var spUser = spWeb.get_currentUser();
               var spScope = "{00000000-0000-0000-0000-000000000000}";
               SP.Analytics.AnalyticsUsageEntry.logAnalyticsEvent2(spClientContext, 1, url, spScope, site, spUser);
               spClientContext.executeQueryAsync(null, null);
           }
       });
   };
   var originalPath = $getItemValue(ctx, "Original Path");
   var originalSite = $getItemValue(ctx, "SiteID");
   LogViewsToEventStore(originalPath.value, originalSite.value);
   ```

   In View the usage event definitions, we explained the EventTypeIDs for the usage events. The value 1 in this script represents the EventTypeID of the Views usage event. To log a different usage event, substitute this value with the EventTypeID of the usage event that you want to log.

4. Save the file.
## Why you should simulate the generation of Views usage events

Now that our CSWP can correctly log usage events, the next step is to actually generate usage events. In our case, we changed the CSWP to log Views. If the Contoso site was in production, visitors would create a Views usage event every time that they viewed an item on the website. But the Contoso site is only a demo site. Therefore, it doesn't have any visitors. When you set up your site, you'll most likely want to test the Usage analytics feature before you put it into production. To be able to test the Usage analytics feature, you'll have to generate usage events. To generate recommendations based on usage events, a minimum of three users have to click the same items.

There is no single correct way of simulating the generation of Views usage events. To generate Views usage events for the Contoso site, you may want to invite coworkers to a "click party." To make sure that recommendations are generated, give each user a list of items to click. That way, you can make sure that at least three users click the same items. Here's an example of the instructions that you can give your coworkers:

When Usage analytics is run, SV Keyboard E10 will generate a recommendation for WWI Desktop PC2.30 M2300 (people who viewed WWI Desktop PC2.30 M2300 also viewed SV Keyboard E10), and WWI Desktop PC2.30 M2300 will generate a recommendation for SV Keyboard E10 (people who viewed SV Keyboard E10 also viewed WWI Desktop PC2.30 M2300).

## Run Microsoft PowerShell scripts to start search analytics and push usage events to the Event store

After you have generated Views usage events, you have two options on how to continue. The Usage analytics timer job runs once every 24 hours. If you want results faster, you can use some Microsoft PowerShell scripts to speed up the process. Here's what you have to do:

1. Verify that you meet the minimum permission requirements.
2. On the server where SharePoint Server is installed, open the SharePoint 2013 Management Shell as an Administrator.
3. At the Microsoft PowerShell command prompt, type the following commands to start Search analytics. The output from Search analytics is used by Usage analytics to map usage events against the actual items in the search index.

   ```powershell
   $job = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.AnalyticsJobDefinition
   $sa = $job.GetAnalysis("Microsoft.Office.Server.Search.Analytics.SearchAnalyticsJob")
   $sa.StartAnalysis()
   ```

4. Wait for the search analytics job to finish. To check the status of the search analytics job, type the following command:

   ```powershell
   $sa.GetAnalysisInfo()
   ```

   As long as the search analytics job is running, State is Running. The search analytics job is finished when State is Stopped and Status is 100.

5. The usage events are added to the Event store in 10-minute intervals. To push the usage events to the Event store, type the following commands:

   ```powershell
   $job = Get-SPTimerJob -Identity ("job-usage-log-file-import")
   $job.RunNow()
   ```

## View usage events in the Event store

After you have pushed the usage events into the Event store, you should verify that the usage events are recorded correctly. To do this, on the machine where SharePoint Server is installed, go to the Event store. In most cases, you can find the Event store in the following folder:

C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Analytics_\\EventStore

In the Event store, the usage events of each day are stored in a separate folder. In our scenario, we can see that a folder was added. In this folder, you'll see some text files. These files contain our usage events. Notice that all file names start with 1_. This number is the EventTypeID of the usage event that is logged in the file. Remember, 1 is the EventTypeID of the Views usage event (see View the usage event definitions). At this point, the only usage event we are logging is the Views event.
So this is a good sign that we are doing things right.

Open one of the files in a text editor. This file contains lots of information, but you should really only be looking for two things:

- Verify that the usage events are logged correctly.
- Verify that different users have generated the usage event.

In About Usage analytics in a cross-site publishing scenario, we explained that for Usage analytics to work, the usage event must be recorded on the URL of the item. In the Event store file, you'll see many URLs. Look for URLs that end in dispform.aspx?id= followed by a number. In our Contoso version of this file, we see there are many entries with such URLs. This means that the usage events are being recorded correctly. To verify that the URLs actually belong to one of our catalog items, copy one of the URLs from this file, and paste it into your browser.

To verify that different users have generated the usage event, look in the third column of the file. In our scenario, we can see that we have at least three user IDs.

Now that we have verified that the usage events are correctly logged, you might be tempted to think that we are ready to run the Usage analytics job. But remember, by using Microsoft PowerShell to start Usage analytics, we are actually starting timer jobs. When the Usage analytics timer job starts, it'll take the usage events from yesterday and process them. Since we want to process the files from today, we'll use a simple trick so that the correct files can be processed by Usage analytics.

## Prepare usage events files before you start Usage analytics with Windows PowerShell

1. In your EventStore folder, create a folder named myevents.
2. Copy the usage event files that you want Usage analytics to process into your myevents folder. In our Contoso scenario, copy all files from the View usage events in the Event store folder into myevents.
3. Right-click your myevents folder and select Properties.
4. In the Attributes section, select Read-only (Only applies to files in folder), and then click OK.
5. In the Confirm Attribute Changes dialog, select Apply changes to this folder, subfolders and files, and then click OK.

Now you are ready to start the Usage analytics job.

## Start the Usage analytics job with Microsoft PowerShell

1. At the Microsoft PowerShell command prompt, type the following commands:

   ```powershell
   $job = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.UsageAnalyticsJobDefinition
   $job.DisableTimerJobSchedule()
   $job.StartAnalysis("\\<host name>\Analytics_<guid>\EventStore\myevents")
   $job.EnableTimerJobSchedule()
   ```

   Notice that one command contains two placeholders: host name and guid. The host name is the name of the server where SharePoint Server is installed. You can see the GUID in the file path of your EventStore.

2. Check the status of the Usage analytics job by entering the following command:

   ```powershell
   $job.GetAnalysisInfo()
   ```

   The Usage analytics job is finished when State is Stopped and Status is 100.

Now that Usage analytics has processed the usage events, the next step is to display the results of the analysis on our Publishing site. To do that, we'll add and configure two Web Parts.

## Next article in this series

Add and configure the Recommended Items and Popular Items Web Part in SharePoint Server
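Rather than calling GetAnalysisInfo repeatedly by hand, the status check can be wrapped in a small polling loop. This is a sketch only: it assumes the SharePoint cmdlets are loaded in the Management Shell, that GetAnalysisInfo returns State and Status properties as described above, and the 10-second interval is an arbitrary choice.

```powershell
# Sketch only: poll the Usage analytics job until it reports finished.
# Assumes a SharePoint Management Shell session on a farm server.
$job = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.UsageAnalyticsJobDefinition
do {
    Start-Sleep -Seconds 10
    $info = $job.GetAnalysisInfo()
    Write-Host ("State: {0}  Status: {1}" -f $info.State, $info.Status)
} until ($info.State -eq "Stopped" -and $info.Status -eq 100)
```

The same loop works for the search analytics job earlier in this article; substitute the `$sa` analysis object for `$job`.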
---
title: "Configure digest authentication for a claims-based web application in SharePoint Server"
ms.reviewer: 
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 8/21/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: de49a030-60bc-49aa-979e-8b76678b63f0
description: "Learn how to configure digest authentication for a web application that uses claims-based authentication in SharePoint Server."
---

# Configure digest authentication for a claims-based web application in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

You can configure digest authentication for one or more zones in a SharePoint Server claims-based web application. A web application is an Internet Information Services (IIS) web site that SharePoint Server creates and uses. Zones represent different logical paths for gaining access to the same web application. Within each web application, you can create up to five zones. A different web site in IIS represents each zone. Use zones to enforce different access and policy conditions for large groups of users.

To configure digest authentication for one or more zones in a SharePoint Server web application, use the IIS Manager console instead of SharePoint Server Central Administration.

Unlike basic authentication, digest authentication encrypts user credentials to increase security. User credentials are sent as an MD5 message digest in which the original user name and password cannot be determined. Digest authentication uses a challenge/response protocol that requires the authentication requestor to present valid credentials in response to a challenge from the server. To authenticate against the server, the client has to supply an MD5 message digest in a response that contains a shared secret password string. The MD5 Message-Digest Algorithm is described in RFC 1321.
For access to RFC 1321, see The Internet Engineering Task Force (https://go.microsoft.com/fwlink/p/?LinkId=159913).

## Before you begin

Before you perform this procedure, confirm the following:

- Your system is running SharePoint Server.
- The user and IIS server must be members of, or trusted by, the same domain.
- Users must have a valid Windows user account stored in Active Directory Domain Services (AD DS) on the domain controller.
- The domain must use a Windows Server 2008 or Windows Server 2008 R2 domain controller.

  > [!NOTE]
  > For SharePoint Server 2016, the domain must use a Windows Server 2012 R2 or Windows Server 2016 domain controller.

- You understand digest authentication for web traffic. For more information, see What is Digest Authentication? (previous-versions/windows/it-pro/windows-server-2003/cc778868(v=ws.10)).

## Configure IIS to enable digest authentication

Use the IIS Manager console to configure IIS to enable digest authentication for one or more of the following zones for a claims-based web application:

- Default
- Intranet
- Extranet

The Default zone is the zone that is first created when a web application is created. The other zones are created by extending a web application. For more information, see Extend claims-based web applications in SharePoint.

### To configure IIS to enable digest authentication

1. Verify that you are a member of the Administrators group on the server on which you are configuring IIS.
2. Click Start, point to Administrative Tools, and then click Internet Information Services (IIS) Manager to start the IIS Manager console.
3. Expand Sites in the console tree, and then click the IIS web site that corresponds to the web application zone on which you want to configure digest authentication.
4. In Features View, in IIS, double-click Authentication.
5. In Features View, in Authentication, right-click Digest Authentication, and then click Enable.
6. Right-click Digest Authentication, and then click Edit.
In the Edit Digest Authentication Settings dialog, in the Realm text box, type the appropriate realm, and then click OK.

The realm is a DNS domain name or an IP address that will use the credentials that have been authenticated against your internal Windows domain. You must configure a realm name for digest authentication.

The web site is now configured to use digest authentication.

## See also

**Concepts**

Configure Basic authentication for a claims-based Web application

**Other Resources**

Plan for user authentication methods in SharePoint Server

Extend claims-based web applications in SharePoint
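The IIS Manager steps in this article can also be scripted with the WebAdministration module, which may be useful when several zones need the same settings. This is a sketch only: the site name "SharePoint - 80" and the realm value are assumptions for illustration; substitute the IIS web site that corresponds to your web application zone.

```powershell
# Sketch only: enable digest authentication and set the realm on one
# IIS web site. Site name and realm are assumptions; adjust for your farm.
Import-Module WebAdministration

Set-WebConfigurationProperty -Filter "/system.webServer/security/authentication/digestAuthentication" `
    -Name enabled -Value $true -PSPath "IIS:\" -Location "SharePoint - 80"

Set-WebConfigurationProperty -Filter "/system.webServer/security/authentication/digestAuthentication" `
    -Name realm -Value "corp.contoso.com" -PSPath "IIS:\" -Location "SharePoint - 80"
```

Scripting the change keeps the setting identical across extended zones, whereas clicking through IIS Manager on each site risks inconsistent realm values.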
---
ms.date: 03/13/2018
title: "Configure backup and restore permissions in SharePoint Server"
ms.reviewer: 
ms.author: serdars
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: 3a25437a-e994-42c7-b4df-ac9fa29f38f5
description: "Learn how to configure permissions for backup and restore operations in SharePoint Server."
---

# Configure backup and restore permissions in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

You can configure backup and restore permissions for SharePoint Server by using the SharePoint Central Administration website or Microsoft PowerShell. The backup tool that you use depends on the kind of environment that you have deployed, your backup schedule requirements, and service level agreements that you have made with your organization.

## Before you begin

Before you back up or restore SharePoint Server, you must make sure that the timer service account, SQL Server service account, and users who run the backup or restore operations have the correct permissions or are members of the correct Windows security groups or SharePoint groups. You must configure these permissions and group memberships when you first deploy SharePoint Server. You have to update permissions and group memberships when you add new farm components to the environment and if you want to add users who will perform backup and restore operations.

## Permissions for the SharePoint Timer service and SQL Server account in SharePoint Server

The SharePoint Timer service and the SQL Server service account in SharePoint Server perform backup and restore operations on behalf of users. These service accounts require Full Control permissions on any backup folders.
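Granting Full Control to the service accounts can be done through the folder's Security tab, or scripted as below. This is a sketch only: the folder path and the two account names are assumptions; replace them with your backup folder and your farm's actual Timer and SQL Server service accounts.

```powershell
# Sketch only: grant Full Control on a backup folder to the Timer and
# SQL Server service accounts. Path and account names are assumptions.
$folder = "E:\Backups"
foreach ($account in "CONTOSO\spTimer", "CONTOSO\sqlService") {
    $acl  = Get-Acl -Path $folder
    # Inherit the rule to subfolders and files so new backup files are covered.
    $rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
        $account, "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow")
    $acl.AddAccessRule($rule)
    Set-Acl -Path $folder -AclObject $acl
}
```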
## Group memberships required to run backup and restore operations in Central Administration

You must make sure all user accounts that use Central Administration to back up or restore your farm and farm components have the group memberships that are described in the following table.

| Farm component | Member of Administrators group on the local computer | Member of Farm Administrators SharePoint group |
|:-----|:-----|:-----|
| Farm | Yes | No |
| Service Application | Yes | No |
| Content Database | Yes | No |
| Site Collection | No | Yes |
| Site, list, document library | No | Yes |

## Setting permissions to run SharePoint backup and restore operations by using PowerShell

You must make sure that all user accounts that use PowerShell to back up or restore your farm and farm components are added to the SharePoint_Shell_Access role for a specified database and have the permissions described in the table later in this section. You can run the Add-SPShellAdmin cmdlet to add a user account to this role. You must run the command for each user account. Moreover, you must run the command for all databases to which you want to grant access.

> [!NOTE]
> You only have to grant a user account access to back up and restore a specific farm component one time. You will have to perform this task again only when you add new farm components to your environment or when you want to add users to perform backup and restore operations.

> [!IMPORTANT]
> The Add-SPShellAdmin cmdlet grants the SPDataAccess role, but this is not enough to complete the restore operation. This is because the Restore-SPSite cmdlet uses direct insert statements to add content rather than stored procedures, which accommodate other interactions. The Add-SPShellAdmin cmdlet worked fine in SharePoint 2010 because, as part of the SPDataAccess schema, it added dbo permissions. For SharePoint Servers 2019, 2016, and 2013, the db_owner fixed database role permissions are required to complete restore operations from the SharePoint Management Shell.
### To add a user to or remove a user from the SharePoint_Shell_Access role by using PowerShell

1. Verify that you have the following memberships:

   - securityadmin fixed server role on the SQL Server instance.
   - db_owner fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

2. Start the SharePoint Management Shell.

3. At the PowerShell command prompt, type the following command:

   ```powershell
   Add-SPShellAdmin -UserName <User account> -Database <Database>
   ```

   Where:

   - `<Database>` is the GUID assigned to the database.

   To add a user account to all the databases in the farm, type the following command:

   ```powershell
   ForEach ($db in Get-SPDatabase) {Add-SPShellAdmin -UserName <User account> -Database $db}
   ```

   Where:

   - `<User account>` is the user whose account you want to add.

   To remove a user account from all the databases in the farm, type the following command:

   ```powershell
   ForEach ($db in Get-SPDatabase) {Remove-SPShellAdmin -UserName <User account> -Database $db}
   ```

   Where:

   - `<User account>` is the user whose account you want to remove.

   To view the user accounts currently added to the databases in the farm, type the following command:

   ```powershell
   ForEach ($db in Get-SPDatabase) {Get-SPShellAdmin -Database $db}
   ```

   For more information, see Add-SPShellAdmin.

> [!NOTE]
> We recommend that you use Microsoft PowerShell when performing command-line administrative tasks. The Stsadm command-line tool has been deprecated, but is included to support compatibility with previous product versions.

You might also have to grant additional permissions to the users who run the backup or restore operation by using PowerShell. The following table shows the permissions that are required.
| Farm component | Member of Administrators group on the local computer | Member of Farm Administrators SharePoint group | Full control on backup folder |
|:-----|:-----|:-----|:-----|
| Farm | Yes | No | Yes |
| Service Application | Yes | No | Yes |
| Content Database | Yes | No | Yes |
| Site Collection | No | Yes | Yes |
| Site, list, document library | Yes | No | Yes |

## See also

**Concepts**

Plan for backup and recovery in SharePoint Server

Prepare to back up and restore farms in SharePoint Server

Overview of backup and recovery in SharePoint Server
---
title: "Configure the unattended service account for PerformancePoint Services"
ms.reviewer: 
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 7/6/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.assetid: 411e0fa7-2a27-4883-93ac-a2fd228e40d8
description: "Learn how to configure the unattended service account for PerformancePoint Services."
---

# Configure the unattended service account for PerformancePoint Services

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

The unattended service account is an Active Directory account that is used for accessing PerformancePoint Services data sources. This account is used by PerformancePoint Services on behalf of authorized users to provide access to external data sources for the purposes of creating and using dashboards and other PerformancePoint Services content. To configure the unattended service account, see Configure the unattended service account for PerformancePoint Services in this article.

> [!NOTE]
> PerformancePoint Services has been removed from SharePoint Server Subscription Edition. We recommend exploring Microsoft Power BI as an alternative to PerformancePoint Services.

> [!NOTE]
> The unattended service account is a universal account that provides equal data access to all authorized users. If you need more fine-grained data access, see Configure Secure Store for use with PerformancePoint Services.

PerformancePoint Services uses the Secure Store Service to store the unattended service account password. Before using the unattended service account, make sure that Secure Store has been configured.

## Configure the unattended service account for PerformancePoint Services

Use the following procedure to configure the unattended service account for PerformancePoint Services.
### To configure the unattended service account for PerformancePoint Services

1. On the SharePoint Central Administration Web site, in the Application Management section, click Manage Service Applications, and then click the PerformancePoint Services service application.
2. On the Manage PerformancePoint Services page, click PerformancePoint Service Application Settings.
3. In the Secure Store and Unattended Service Account section, enter the user name and password for the account that you want to use as the unattended service account.
4. Click OK. You will see the Secure Store Service name and the user name that represents the unattended service account.

Once the unattended service account has been configured, you must grant that account access to your data sources:

- For SQL Server data, the account must have a SQL logon with db_datareader permissions on each database that you want to access.
- For SQL Server Analysis Services data, the account must have read access to the cube or an appropriate portion of the cube, depending on your needs.
- For Excel Services data, the account must have access to the Excel workbook in a SharePoint document library.
- For data in a SharePoint list, the account must have read access to the list.
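The Central Administration steps above can also be performed from the SharePoint Management Shell with the Set-SPPerformancePointSecureDataValues cmdlet. This is a sketch only: the service application name below is an assumption; use the name shown on your Manage Service Applications page.

```powershell
# Sketch only: set the unattended service account from PowerShell.
# The service application name is an assumption; adjust for your farm.
$creds = Get-Credential   # prompts for the unattended account's user name and password
Set-SPPerformancePointSecureDataValues -ServiceApplication "PerformancePoint Service Application" `
    -DataSourceUnattendedServiceAccount $creds
```

As with the UI route, you must still grant the account read access to each data source after setting it here.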
title: "Create query rules for web content management in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 7/21/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: IT_Sharepoint_Server_Top ms.assetid: 838d2f49-554b-4918-b3db-ba376be9d236 description: "Learn how to improve search results by creating and managing query rules in SharePoint Server." Create query rules for web content management in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Without using any custom code, Search service application administrators, site collection administrators, and site owners can help searches respond to the intent of users by creating query rules. In a query rule, you specify conditions and correlated actions. When a query meets the conditions in a query rule, the search system performs the actions specified in the rule to improve the relevance of the search results, such as by narrowing results or changing the order in which results are displayed. For example, a query rule condition could be that a term in a query matches a particular term in a SharePoint Server term set, or that a query is frequently performed on a particular result source in a search system, such as videos. When the query rule condition is satisfied, a correlated action could be to show a specific item at the top of the search results. Say you have an intranet site where all company events are maintained in a library that you have shared as a catalog, and you want to promote a first-aid seminar. To do this, you create a query rule that boosts the first-aid seminar to the top of the search results when someone searches for "seminar" or "event." You can configure query rules for one or more result sources, and you can specify the time period during which the query rule is active.
For more information about query rules, see Plan result sources and query rules in Plan search for cross-site publishing sites in SharePoint Server 2016. Before you begin [!NOTE] Because SharePoint Server runs as websites in Internet Information Services (IIS), administrators and users depend on the accessibility features that browsers provide. SharePoint Server supports the accessibility features of supported browsers. For more information, see the following resources: Plan browser support, Accessibility guidelines in SharePoint, Accessibility in SharePoint, Keyboard shortcuts, and Touch. We recommend that you set up your site, integrate a catalog, and set up your category pages and catalog item pages before you begin to create query rules. This is because you can then more easily test and verify how the different query rules apply to the different Search Web Parts you have on your site. Creating query rules at different levels in a SharePoint Server farm You can create a query rule for a Search service application, a site collection, or a site. The following table shows the permissions that are required to create a query rule in each case, and where the query rule can be used. Levels and permissions for query rules

|When you create a query rule at this level|You must have this permission|The query rule can be used in|
|:-----|:-----|:-----|
|Search service application|Search service application administrator|All site collections in web applications that consume the Search service application|
|Site collection|Site collection administrator|All sites in the site collection|
|Site|Site owner|The site|

To add or edit a query rule, you must go to the Manage query rules page. Depending on the level at which you are creating the query rule, use one of the following procedures to go to the Manage query rules page.
To go to the Manage query rules page for a Search service application Verify that the user account that performs this procedure is an administrator for the Search service application. In Central Administration, in the Application Management section, click Manage service applications. Click the Search service application to which you want to add query rules. On the Search Administration page for the Search service application, in the Quick Launch, in the Queries and Results section, click Query Rules. To go to the Manage query rules page for a publishing site collection Verify that the user account that performs this procedure is a site collection administrator for the publishing site collection. On the publishing site collection, on the Settings menu, click Site Settings. On the Site Settings page, in the Site Collection Administration section, click Search Query Rules. To go to the Manage query rules page for a site Verify that the user account that performs this procedure is a member of the Owners group for the site. On the Settings menu for the site, click Site Settings. On the Site Settings page, in the Site Administration section, click Query Rules. Create a query rule To create a query rule On the Manage Query Rules page, in the Select a Result Source menu, select a result source for the new query rule. Click New Query Rule. On the Add Query Rule page, in the General Information section, in the Rule name field, type the name for the query rule. Click to expand the Context section. In the Context section, select one of the following: To apply the query rule to all result sources, select All sources. To apply the query rule to one or more specific result sources, select One of these sources. By default, the result source that you specified in step 1 is selected. To add a result source for the query rule, do the following: Click Add Source. In the Add Source dialog, select a result source, and then click Save. 
To restrict the query rule to categories (for example, so that a query rule fires only when a term from your managed navigation term set is included in the query), click Show more conditions, and then specify the following: To restrict the query rule to a category, click Add category. In the Import from Taxonomy dialog, select a term that when you include it in a query will cause the query rule to fire, and then click Save. To restrict the query rule to a user segment, do the following: Click Add User Segment. In the Add User Segment dialog, in the Title field, type the title for this rule. In the Import from taxonomy dialog, select a term that represents a user segment that will cause the query rule to fire when it appears in a query. Click Save. In the Query Conditions section, select one of the conditions listed in the following table. [!NOTE] When you create query rules for catalog pages that have Web Parts that use search technology (described in this article as Search Web Parts), and you want the query that is configured in the Web Parts to act as the query condition, click Remove Condition, and then go to step 8. You should also do this if you want a query rule to fire every time that a user types anything in a search box.

|Query condition|Description|Configuration|Example|
|:-----|:-----|:-----|:-----|
|Query Matches Keyword Exactly|Select this option if you want the query rule to fire when a query exactly matches a word or phrase that you specify.|In the Query exactly matches one of these phrases text box, type one or more phrases separated by semicolons.|You type "picture; pic" in the Query contains one of these phrases box. The query rule will fire when a user types "picture" or "pic" in a search box. The rule will not fire if a user types "pictures" or "sunny picture."|
|Query Contains Action Term|Select this option if you want the query rule to fire when a query contains a term that indicates something that the user wants to do. The term must be at the beginning or end of the query.|Specify the action term that will cause the query rule to fire by doing one of the following: Select Action term is one of these phrases, and type one or more phrases. Select Action term is an entry in this dictionary, and then click Import from taxonomy. In the Import from taxonomy dialog, select a term from a term set, and then click Save.|You type the word "download" in the Action term is one of these phrases text box. When a user types "download Contoso Electronics datasheet" in a search box, the user is probably not searching for a document that contains the words "download," "Contoso," "Electronics," and "datasheet." Instead, the user is probably trying to download a Contoso Electronics datasheet. When a user types "download Contoso Electronics datasheet" in a search box, the query rule fires, and only the words "Contoso," "Electronics," and "datasheet" are passed to the search index.|
|Query Matches Dictionary Exactly|Select this option if you want the query rule to fire when the query exactly matches a dictionary entry.|From the Query contains an entry in this dictionary menu, select a dictionary. To specify a different dictionary, click Import from taxonomy, and then from the Import from taxonomy dialog, select a term from a term set, and then click Save.|In an Internet business scenario, you have a term set named Brands that contains all brand names within your catalog. The query rule will fire when a user types a name that matches a term from the Brands term set.|
|Query More Common in Source|Select this option if you want the query rule to fire if the query was frequently issued by users on a different result source that you specify.|In the Query is more likely to be used in this source menu, select a result source.|In the Query is more likely to be used in this source menu, you select Local Video Results. The query rule will fire if a user types the word "training" in a search box and that word was frequently typed in a search box in the Videos vertical.|
|Result Type Commonly Clicked|Select this option if you want the query rule to fire if other users frequently clicked a particular result type after they typed the same query.|In the Commonly clicked results match result type menu, select a result type.|In an Internet business scenario, you have a catalog of electronic products. Each product has a PDF datasheet. So when users query for a specific product, the search results will return two result types: one that links to the page that has the product details, and one that links to the PDF datasheet. You can create a query rule that will fire if the system over time recognizes that users frequently click the search result for the PDF datasheet. When you know the type of content the user is looking for, you can specify an action for this query rule.|
|Advanced Query Text Match|Select this option if you want to use a regular expression, a phrase, or a dictionary entry that will cause the query rule to fire.|To match all phone numbers that are in a certain format, you specify a regular expression in the Query matches this regular expression box.|To match all phone numbers that are in the format nnn-nnn-nnnn, you specify the regular expression (\d{3})-(\d{3})-(\d{4}).|

To add conditions, click Add Alternate Conditions. [!NOTE] The rule will fire when any condition is true. In the Actions section, specify the action to take when the query rule fires. Specify one of the following: To promote individual results so that they appear towards the top of search results, click Add Promoted Result (in SharePoint 2010 Products this was called Best Bets). In the Add Promoted Result dialog, in the Title field, type the name that you want to give this promoted result. In the URL field, type the URL of the result that should be promoted.
To render the URL as a banner instead of as a hyperlink, select the Render the URL as a banner instead of as a hyperlink check box. Click Save. You can add several individual promoted results. When there is more than one promoted result, you can specify the relative ranking. To promote a group of search results, click Add Result Block. For more information, see Create and display a result block later in this article. To change ranked search results, click Change ranked results by changing the query. For more information, see Change ranked search results later in this article. To make the query rule active during a particular time period, click Publishing, and then specify the period. Create and display a result block A result block is several search results that are displayed as a group. In the same manner as you can promote a specific result, you can promote a result block when a specified query condition applies. For example, you can create a result block named Yellow items for all catalog items that have the color yellow. In an Internet business scenario where you have a catalog of electronic products and you want to promote yellow mp3 players, you can create a query rule that fires for all items that are tagged with the term mp3 players, where the action is to display the result block Yellow items. Result blocks are automatically displayed in the Search Results Web Part. To display results from a result block in a Content Search Web Part, you have to configure it to display the result block. When you configure the query condition for a result block, you can use query variables. Query variables are placeholders for values that you don't know when you specify the query. However, when the query is run, this information is known and can be used when the system sends the query to the index. Examples are {User.Name}, which represents the display name of the user who typed the query, or {searchBoxQuery}, which represents the query that a user typed in a search box.
When you use Query Builder to configure the query, a list of query variables is shown. (See step 3 in the following procedure.) To create a result block In step 8 of the previous procedure, on the Add Query Rule page, in the Actions section, click Add Result Block. In the Block Title section, in the Title field, type a name for the result block. In the Query section, to specify the query, click Launch Query Builder. In Query Builder, specify the following: On the BASIC tab, select options from the following lists to define the query for the result block:

|Query option|Description|
|:-----|:-----|
|Select a query|Select a result source to specify which content should be searched.|
|Keyword filter|You can use keyword filters to add query variables to your query. See Query variables in SharePoint Server for a list of available query variables. You can select pre-defined query variables from the drop-down list, and then add them to the query by clicking Add keyword filter.|
|Property filter|You can use property filters to query the content of managed properties that are set to queryable in the search schema. You can select managed properties from the Property filter drop-down list. Click Add property filter to add the filter to the query.|

On the SORTING tab, you can specify how search results within your result block should be sorted. In the Sort by drop-down list: To sort by managed properties that are set as sortable in the search schema, select a managed property from the list, and then select Descending or Ascending. To add more sorting levels, click Add sort level. [!NOTE] Sorting of search results is case sensitive. To sort by relevance rank, select Rank, and then do the following: In the Ranking Model list, select which ranking model to use for sorting search results (this selection is optional).
In the Dynamic ordering section, to specify additional ranking by adding rules that will change the order of search results when certain conditions apply, click Add dynamic ordering rule, and then specify conditional rules. On the TEST tab, you can preview the query that is sent.

|Value|Description|
|:-----|:-----|
|Query text|Shows the final query that will be run by the Content Search Web Part. It is based on the original query template where dynamic variables are substituted with current values. Other changes to the query may have to be made as part of query rules.|

Click Show more to display additional information.

|Value|Description|
|:-----|:-----|
|Query template|Shows the content of the query template that is applied to the query.|
|Query template variables|Shows the query variables that will be applied to the query, and the values of the variables that apply to the current page. You can type other values to test the effect they will have on the query.|

Click the Test Query button to preview the search results. In the Query section, in the Configure Query box, in the Search this Source drop-down list, select the result source to which this result block should be applied. In the Query section, in the Items drop-down list, select how many results to show in the result block. Click to expand the Settings section. The result block will only display the number of search results that you specified in the previous step. However, you can add a SHOW MORE link at the bottom of the result block that will show all search results for the result block. To add a SHOW MORE link, select "More" link goes to the following URL, and then type a URL. You can use query variables in this URL, for example, http://www.&lt;site&gt;/searchresults.aspx?k={subjectTerms}. In the Routing section, in the field under Label for routing to a Content Search Web Part, type a label for routing the result block to a Content Search Web Part, or select an existing label. You will use this label in the following procedure.
Click OK. To configure a Content Search Web Part to display a result block Add a Content Search Web Part to a page as described in "Add a Content Search Web Part to a page" in Configure Search Web Parts in SharePoint Server. In the Web Part, click the Content Search Web Part Menu arrow, and then click Edit Web Part. In the Web Part tool pane, in the Properties section, expand the Settings section. In the Settings section, from the Result Table list, select the label of the result block that you want to display. The label is what you specified in step 7 in the previous procedure, To create a result block. To use the query results that are returned from another Web Part on the page as input when displaying the result block, from the Query results provided by list, select a Web Part. [!NOTE] When displaying a result block in a Content Search Web Part, paging of search results is not supported through the Control Display Template. Display promoted results in a Content Search Web Part To display promoted results in a Content Search Web Part Add a Content Search Web Part to a page as described in "Add a Content Search Web Part to a page" in Configure Search Web Parts in SharePoint Server. In the Web Part, click the Content Search Web Part Menu arrow, and then click Edit Web Part. In the Web Part tool pane, in the Properties section, expand the Settings section. In the Settings section, in the Result Table list, select SpecialTermResults. Change ranked search results The ranking model calculates a ranking order of search results. You can change this ranking by promoting or demoting items within the search results. For example, for a query that contains "download toolbox," you can create a query rule that recognizes the word "download" as an action term, and change the ranked search results to promote a URL of a particular download site on your intranet. 
You can also change the sorting order of the search results dynamically, based on several variables such as file name extension or specific keywords. Changing ranked search results by changing the query has the advantage that the results are security trimmed and refinable. Moreover, the search results will not appear if the document is no longer available. To change ranked search results by changing the query From step 8 of the procedure Create a query rule, on the Add Query Rule page, in the Actions section, click Change ranked results by changing the query. In the Build Your Query dialog, specify the following: On the BASIC tab, select options from the following lists to change ranked search results:

|Value|Description|
|:-----|:-----|
|Select a query|Select a result source to specify which content should be searched.|
|Keyword filter|You can use keyword filters to add query variables to your query. See Query variables in SharePoint Server for a list of available query variables. You can select pre-defined query variables from the drop-down list, and then add them to the query by clicking Add keyword filter.|
|Property filter|You can use property filters to query the content of managed properties that are set to queryable in the search schema. You can select managed properties from the Property filter drop-down list. Click Add property filter to add the filter to the query.|

On the SORTING tab, you can specify how search results should be sorted by doing the following: In the Sort by drop-down list: To sort by managed properties that are set as sortable in the search schema, select a managed property from the list, and then select Descending or Ascending. To add more sorting levels, click Add sort level. [!NOTE] Sorting of search results is case sensitive. To sort by relevance rank, select Rank, and then do the following: In the Ranking Model list, select which ranking model to use for sorting search results (this selection is optional).
In the Dynamic ordering section, to specify additional ranking by adding rules that will change the order of search results when certain conditions apply, click Add dynamic ordering rule, and then specify conditional rules. On the TEST tab, you can preview the query.

|Value|Description|
|:-----|:-----|
|Query text|Shows the final query that will be run by the Content Search Web Part. It is based on the original query template where dynamic variables are substituted with current values. Other changes to the query may have to be made as part of query rules.|

Click Show more to display additional information.

|Value|Description|
|:-----|:-----|
|Query template|Shows the content of the query template that is applied to the query.|
|Query template variables|Shows the query variables that will be applied to the query, and the values of the variables that apply to the current page. You can type other values to test the effect they will have on the query.|

Click the Test Query button to preview the search results. Make a query rule inactive Query rules that are created at the Search service application level are inherited by site collections and sites that are in web applications that consume the Search service application. Similarly, query rules that are created at the site collection level are inherited by sites in the site collection. If you don't want a query rule to apply to a site that inherits it, you can set the query rule as inactive for the site. To make a query rule inactive on a site Verify that the user account that performs this procedure is a member of the Owners group on the publishing site. On the site, on the Settings menu, click Site Settings. On the Site Settings page, in the Search section, click Query Rules. On the Manage Query Rules page, on the Select a Result Source menu, select the result source that contains the query rule that you want to make inactive.
In the Name column, point to the query rule that you want to make inactive, click the arrow that appears, and then click Make Inactive. Rank query rules When multiple query rules are active for a Search service application, a site collection, or a site, more than one rule can fire for a query that is performed at that level. By default, the rules do not fire in a prescribed order. You can control the order in which the rules fire by adding the query rules that you create to query groups. To do this, you select rules to add to a group, and then you specify the order in which the rules in the group will fire if they are triggered. You can also prevent query rules that rank lowest in a group from firing even if they are triggered. To rank query rules for a site collection Verify that the user account that performs this procedure is a site collection administrator for the publishing site collection. On the publishing site collection, on the Settings menu, click Site Settings. On the Site Settings page, in the Site Collection Administration section, click Search Query Rules. On the Manage Query Rules page, on the Select a Result Source menu, select the result source that contains the query rules that you want to group. For each query rule that you created that you want to add to a group, point to the rule and select the check box. [!NOTE] Query rules that you created for this site collection are listed in the Defined for this site collection section. Click Order Selected Rules. In the Order Selected Rules dialog, do either of the following, and then click OK: Select Move rules to new group with this name, and then type a name for the group. Select Move rules to existing group and select a group in the drop-down list. On the Manage Query Rules page, do the following: To change the order in which a rule in a group will fire if it is triggered, change the numeric order of the rule. 
To prevent query rules that are ranked lowest in the group from firing, in the row for the group's query rule that should fire last, in the Actions column, in the Continue/Stop drop-down list, select Stop. See also Concepts Plan to transform queries and order results in SharePoint Server Query variables in SharePoint Server
title: "Test Lab Guide: Demonstrate Social Features for SharePoint Server 2013" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 7/10/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.assetid: aecba264-abdb-4515-8b4f-c451cc0c0107 description: "Learn how to configure and demonstrate the new social features of SharePoint Server based on the Test Lab Guide: Configure SharePoint Server 2013 in a three-tier farm." Test Lab Guide: Demonstrate Social Features for SharePoint Server 2013 [!INCLUDEappliesto-2013-xxx-xxx-xxx-xxx-md] This document contains instructions for the following: Setting up the SharePoint Server 2013 three-tier farm test lab. Creating a My Site site collection and configuring settings. Configuring Following settings. Configuring Community Sites. Configuring site feeds. Demonstrating social features. Download the test lab guide Test Lab Guide: Demonstrate Social Features for SharePoint Server 2013 See also Test Lab Guides
title: "Overview of Excel Services in SharePoint Server 2013" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 6/23/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.assetid: fe776cf2-17a4-4bb6-95ea-66288f243a93 description: "Excel Services is a business intelligence tool in SharePoint Server that allows you to share data-connected workbooks across an organization." Overview of Excel Services in SharePoint Server 2013 [!INCLUDEappliesto-2013-xxx-xxx-xxx-xxx-md] Excel Services is a shared service that you can use to publish Excel workbooks on SharePoint Server 2013. The published workbooks can be managed and secured according to your organizational needs and shared among SharePoint Server 2013 users, who can render the workbooks in a browser. Excel Services is available only in the Enterprise edition of SharePoint Server 2013. Excel Services is used primarily for business intelligence scenarios. You can connect Excel workbooks to external data sources, create reports, and then publish the workbook to a SharePoint document library. When a user opens the workbook from the document library, it is rendered in the browser by using Excel Services. The external data connection is maintained and the data is refreshed if necessary. This allows broad sharing of reports throughout an organization. Excel Services consists of Excel Calculation Services, the Excel Web Access Web Part, and Excel Web Services for programmatic access. It supports sharing, securing, managing, and using Excel workbooks in a browser by providing the following: Global settings for managing workbooks, which include settings for security, load balancing, session management, memory utilization, workbook caches, and external data connections.
Trusted file locations (which allow you to define which document libraries are trusted by Excel Services) together with session management, workbook size, calculation behavior, and external data settings of workbooks stored in those locations. An extensive list of trusted data providers for connecting to your data, plus the ability to add your own trusted data provider. Trusted data connection libraries, which allow you to define which data connection libraries in your farm are trusted by Excel Services. The ability to add your own user-defined function assemblies. While users can interact with Excel workbooks in a browser through Excel Services, the workbooks cannot be edited in the browser by using Excel Services. Programmatic options are available. Looking at several specific scenarios can help you understand how best to take advantage of Excel Services: Sharing workbooks Users can save Excel workbooks to a SharePoint Server document library to give other users browser-based access to the server-calculated version of the workbook. When the workbook is accessed, Excel Services loads the workbook, refreshes the external data if it is necessary, calculates it if it is necessary, and sends the resulting output view back through the browser. A user can interact with Excel-based data by sorting, filtering, expanding, or collapsing PivotTables, and by passing in parameters. This provides the ability to perform analysis on published workbooks. A user does not have to have Excel installed to view the workbook. Users will always view the latest version of a workbook, and they can interact with it in a browser. Security permissions can be set to limit what access is provided to which user. Building business intelligence (BI) dashboards Browser-based dashboards can be created by using Excel and Excel Services together with the Excel Web Access Web Part. PerformancePoint Services can also use Excel Services workbooks as a data source. 
Reuse of logic encapsulated in Excel workbooks in custom applications Besides a browser-based interface with the server, Excel Services provides a Web-service-based interface so that a published workbook can be accessed programmatically by any application that uses Web services. The Web service applications can change values, calculate the workbook, and retrieve some or all of the updated workbook by using that interface according to what security permissions are set for the published workbook. Report Building One of the most useful features of Excel Services is report building. By publishing data-connected workbooks to a SharePoint document library and making them available through Excel Services, you can make reports that you have created in Excel available to others in your organization. Instead of multiple users having separate copies of the workbooks on their computers, the workbooks can be created and changed by a trusted author in a central location that is trusted by Excel Services. The correct version of the worksheet is easier to find, share, and use from Excel, SharePoint Server, and other applications. Farms using Office Web Apps Server If your SharePoint Server farm has been integrated with Office Web Apps Server and Excel Web App, the features available in Excel Services will depend on how Excel Web App has been configured. Excel Web App runs in one of two modes: SharePoint view mode In this mode, Excel Services is used to view workbooks in the browser. Office Web Apps Server view mode In this mode, Excel Web App is used to view workbooks in the browser. The following table lists the business intelligence features available in Excel Services in each mode. 
**BI features in Excel Services, by mode**

|Feature|SharePoint Server only|SharePoint Server with Excel Web App (SharePoint view mode)|SharePoint Server with Excel Web App (Office Web Apps Server view mode)|
|:-----|:-----|:-----|:-----|
|Excel Web Access Web Part|X|X||
|Refresh OData connections|X|X||
|View and interact with Power View reports|X|X||
|View and interact with Power Pivot data models|X|X||
|Refresh Power Pivot data models|X|X||
|Refresh data by using the Excel Services unattended service account|X|X||
|Refresh data by using Secure Store and Windows credentials|X|X|X|
|Refresh data by using Effective User Name connections|X|X||
|Kerberos delegation|X|X||

**See also**

Business intelligence capabilities in Excel Services (SharePoint Server 2013)
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/excel-services-overview.md/0
Overview of Excel Services in SharePoint Server 2013
OfficeDocs-SharePoint/SharePoint/SharePointServer/administration/excel-services-overview.md
OfficeDocs-SharePoint
1,298
30
---
ms.date: 03/13/2018
title: "Import a list or document library in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: b3cb17a1-939c-4314-9f83-3c6b8a309bba
description: "Learn how to import a site, list, or document library in SharePoint Server."
---

# Import a list or document library in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

You can import a site, list, or document library in SharePoint Server by using PowerShell.

## Before you begin

Although you can use either PowerShell or Central Administration to export a site, list, or document library, you can use only PowerShell to import a site, list, or document library. For information about how to export lists or libraries, see Export sites, lists, or document libraries in SharePoint Server.

Before you begin this operation, review the following information:

- You can use importing as a method of restoring the items, or as a method of moving or copying the items from one farm to another farm.
- You can import a site, list, or document library from a backup of the current farm, from a backup of another farm, or from a read-only content database. To import from a read-only content database, you must first attach the read-only database. For more information, see Attach and restore read-only content databases in SharePoint Server.
- You cannot import a site, list, or document library exported from one version of SharePoint Server to another version of SharePoint Server.

## Importing a site, list, or document library in SharePoint Server

You can use PowerShell to manually import a site, list, or document library or as part of a script that can be run regularly.
### To import a site, list, or document library by using PowerShell

1. Verify that you have the following memberships:

   - **securityadmin** fixed server role on the SQL Server instance.
   - **db_owner** fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server cmdlets.

   [!NOTE] If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

2. Start the SharePoint Management Shell.

3. At the PowerShell command prompt, type the following command:

   ```
   Import-SPWeb -Identity <SiteURL> -Path <ExportFileName> [-Force] [-NoFileCompression] [-Verbose]
   ```

   Where:

   - `<SiteURL>` is the URL for the site that you are importing to.
   - `<ExportFileName>` is the name of the file that you are importing from.

   [!IMPORTANT] The site or subsite that you are importing must have a template that matches the template of the site specified by Identity.

   You can also use the Get-SPWeb cmdlet and pass the ID to Import-SPWeb by using the PowerShell pipeline.

   The value of the Path parameter specifies the path and file name of the file from which to import the list or library. To include the user security settings with the list or document library, use the IncludeUserSecurity parameter. To overwrite the list or library that you specified, use the Force parameter. You can use the UpdateVersions parameter to specify how versioning conflicts will be handled. To view the progress of the operation, use the Verbose parameter.

   The NoFileCompression parameter lets you specify that no file compression is performed during the import process. Using this parameter can lower resource usage up to 30% during the export and import process.
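For example, the parameters described above can be combined as follows. This is a sketch only: the site URL and export file path are hypothetical, and the command must be run from the SharePoint Management Shell by a user with the memberships listed above.

```
# Example only: the site URL and export file path are hypothetical.
# Import a previously exported document library, keep its item-level
# permissions, overwrite the existing list, and show verbose progress.
Import-SPWeb -Identity "http://contoso-web/sites/marketing" `
    -Path "\\backupserver\backups\marketing-library.cmp" `
    -IncludeUserSecurity -Force -UpdateVersions Overwrite -Verbose
```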
If you are importing a site, list, or document library that you exported from Central Administration, or if you exported a site, list, or document library by using PowerShell and you did not use the NoFileCompression parameter in the Export-SPWeb cmdlet, you cannot use this parameter in the Import-SPWeb cmdlet.

[!NOTE] There is no facility in the Import-SPWeb cmdlet to import a subset of the items within the export file. Therefore, the import operation will import everything from the file. For more information, see Import-SPWeb.

[!NOTE] We recommend that you use Microsoft PowerShell when performing command-line administrative tasks. The Stsadm command-line tool has been deprecated, but is included to support compatibility with previous product versions.

## See also

**Concepts**

Export sites, lists, or document libraries in SharePoint Server
---
title: "Maintain user profile synchronization settings in SharePoint Server 2013"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 8/2/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.assetid: 26f02074-af0b-4548-ab68-9d46dd05b8ff
description: "Learn how to maintain User Profile synchronization settings in SharePoint Server after you configure User Profile synchronization."
---

# Maintain user profile synchronization settings in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

Profile synchronization in SharePoint Server enables an administrator of an instance of the User Profile service to synchronize user and group profile information that is stored in the SharePoint Server profile store with profile information that is stored in directory services across the enterprise. After you have configured User Profile synchronization, you must complete tasks to maintain those settings. These tasks include, for example, removing users whose accounts are disabled or deleted, moving or renaming a server, and starting or stopping the User Profile Synchronization service. For more information, see Plan profile synchronization for SharePoint Server 2013.

To run the PowerShell cmdlets in this article, verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

[!IMPORTANT] Each section is noted as to the version of SharePoint Server it applies to.

## Rename users or change user domains

[!NOTE] This section applies to SharePoint Server 2013, 2016, and 2019.

SharePoint Server lets you handle several different user migration scenarios.
The following are examples of the scenarios handled for Active Directory Domain Services (AD DS):

- Account name (sAMAccountName) changes in the AD DS where the user exists.
- Security Identifier (SID) changes.
- Distinguished name (DN) changes that include changes in the organizational unit (OU) container in the AD DS where the user account exists.

For example, if a user's distinguished name is moved in AD DS from "User=EUROPE\John Smith, Manager=CN=John Rodman, OU=Users, DC=EMEA1, DC=corp, DC=contoso, DC=com" to "User=EUROPE\John Smith, Manager=CN=John Rodman, OU=Managers, DC=EMEA1, DC=corp, DC=contoso, DC=com", the MigrateUser command updates the user profile store for this user. The user profile for John Smith is updated when synchronizing user profiles from the EMEA1.corp.contoso.com AD DS to the SharePoint Server user profile store.

### To rename users or to change user domains

1. Verify that the user account that is performing this procedure has the following credentials:

   - The user account that performs this procedure is a member of the Farm Administrators group on the computer that is running the SharePoint Central Administration website.
   - The user account that performs this procedure is a member of the Administrators group on the computer on which you installed the User Profile synchronization service.

2. If synchronization is in progress, open Central Administration and then click **Manage service applications** in the **Application Management** section. Select the appropriate User Profile service application from the list of service applications. On the Manage service application page, click **Stop Profile Synchronization**.

3. Disable the User Profile Incremental Synchronization timer job.

4. Ensure that user migration by using `stsadm -o migrateuser` has succeeded.

   [!NOTE] Move-SPUser can also be used to migrate users.

5. Ensure that the profile of the migrated user can be accessed by browsing to the My Site for that user, for example, `http://<MySiteHost>/person.aspx?accountname=<domain\username>`.
6. Run User Profile synchronization. For more information, see Start profile synchronization manually in SharePoint Server.

7. Recheck access to the profile of the migrated user by browsing to the My Site for that user.

8. Enable the User Profile Incremental Synchronization timer job.

## Exclude users whose accounts are disabled

[!NOTE] This section applies to SharePoint Server 2013.

You can exclude users whose accounts are disabled in AD DS by using exclusion filters in SharePoint Server 2013. For the steps that are needed to exclude users whose accounts are disabled, see Synchronize user and group profiles in SharePoint Server 2013.

## Remove obsolete users and groups

[!NOTE] This section applies to SharePoint Server 2013, 2016, and 2019.

There are two reasons why obsolete users or groups can exist in the SharePoint Server user profile store:

- **Obsolete users**: The My Site cleanup timer job is not active. The User Profile Synchronization timer job marks for deletion users who have been deleted from the directory source. When the My Site cleanup job runs, it looks for all users marked for deletion and deletes their profiles. Respective My Sites are then assigned to the manager for the deleted user, and an e-mail message notifies the manager of this deletion.
- **Obsolete users and groups**: Users and groups that were not imported by Profile Synchronization exist in the user profile store. This can occur, for example, if you upgraded from an earlier version of SharePoint Server and chose to only synchronize a subset of domains with SharePoint Server.

### To find and remove obsolete users and groups by using PowerShell

1. Verify that you have the following memberships:

   - Execute permission on the ImportExport_GetNonimportedObjects and the ImportExport_PurgeNonimportedObjects stored procedures in the profile database.

2. Start the SharePoint Management Shell.
3. At the PowerShell command prompt, do the following:

   To get the User Profile Service application object, type the following command:

   ```
   $upa = Get-SPServiceApplication <GUID>
   ```

   Where `<GUID>` is the GUID of the User Profile synchronization service application.

   To view the users and groups to delete, type the following command:

   ```
   Set-SPProfileServiceApplication $upa -GetNonImportedObjects $true
   ```

   To delete the obsolete users and groups, type the following command:

   [!CAUTION] This action cannot be undone.

   ```
   Set-SPProfileServiceApplication $upa -PurgeNonImportedObjects $true
   ```

For more information, see Get-SPServiceApplication and Set-SPProfileServiceApplication.

## Maintain profile schema changes

[!NOTE] This section applies to SharePoint Server 2013.

Profile schema changes include things such as adding a new user profile property, changing a user profile property mapping, or changing a Profile Synchronization connection filter. When the profile schema changes, you must first perform a full nonrecurring synchronization before scheduling recurring profile synchronization. For the steps that are needed to perform full nonrecurring profile synchronization, see Start profile synchronization manually in SharePoint Server.

## Rename a server that is running the User Profile synchronization service

[!NOTE] This section applies to SharePoint Server 2013.

Use the following procedure to rename a profile synchronization server.

### To rename a server that is running the User Profile synchronization service by using PowerShell

1. Start the SharePoint Management Shell.

2. At the PowerShell command prompt, type the following command:

   ```
   Rename-SPServer -Identity <Identity> -Name <newName>
   ```

   Where:

   - `<Identity>` is the old name of the server.
   - `<newName>` is the new name for the server.

For more information about renaming a server by using Microsoft PowerShell, see Rename-SPServer.

## Move the User Profile Synchronization service to a new server

[!NOTE] This section applies to SharePoint Server 2013.
Use the following procedure to move the User Profile Synchronization service to a new server.

### To move the User Profile Synchronization service to a new server by using Central Administration

1. Verify that the user account that is performing this procedure has the following credentials:

   - The user account that performs this procedure is a member of the Farm Administrators group on the computer that is running the SharePoint Central Administration website.
   - The user account that performs this procedure is a member of the Administrators group on the computer on which you installed the User Profile synchronization service. This is required to start the User Profile Synchronization service. After the User Profile Synchronization service is started, you can remove the farm account from the Administrators group.

2. On the server that is currently running the User Profile synchronization service, on the SharePoint Central Administration website, in the **System Settings** section, click **Manage services on server**.

3. Next to the User Profile Synchronization Service, click **Stop** to stop the User Profile Synchronization service.

4. On the new User Profile synchronization server, on the SharePoint Central Administration website, in the **System Settings** section, click **Manage services on server**.

5. Next to the User Profile Synchronization Service, click **Start** to start the User Profile Synchronization service.

6. On the new User Profile synchronization server, on the SharePoint Central Administration website, in the **Application Management** section, click **Manage service applications**.

7. On the Service Applications page, click the link for the name of the appropriate User Profile service application.

8. On the User Profile Service Application page, in the **Synchronization** section, click **Start Profile Synchronization**.

9. On the Start Profile Synchronization page, select **Start Full Synchronization**, and then click **OK**.
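The Central Administration steps above can also be scripted. The following is a minimal sketch under stated assumptions: the server names OLDSYNC and NEWSYNC are hypothetical, and the service instance is matched by its display name, which can vary by product version, so verify the match before piping it to the stop or start cmdlet.

```
# Example only: OLDSYNC and NEWSYNC are hypothetical server names.
# Stop the User Profile Synchronization service instance on the old server.
Get-SPServiceInstance -Server "OLDSYNC" |
    Where-Object {$_.TypeName -like "*User Profile Synchronization*"} |
    Stop-SPServiceInstance

# Start it on the new server, then run a full synchronization from
# Central Administration as described above.
Get-SPServiceInstance -Server "NEWSYNC" |
    Where-Object {$_.TypeName -like "*User Profile Synchronization*"} |
    Start-SPServiceInstance
```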
## Restrict User Profile synchronization communication to a specific domain controller

Use the following procedure to restrict profile synchronization communication to a specific domain controller.

### To restrict User Profile synchronization communication to a specific domain controller by using Windows PowerShell

1. Start the SharePoint Management Shell.

2. To get the User Profile service application object, type the following command:

   ```
   $upa = Get-SPServiceApplication <GUID>
   ```

   Where `<GUID>` is the GUID of the User Profile Synchronization Service application.

3. To restrict profile synchronization communication to a specific domain controller, type the following command:

   ```
   Set-SPProfileServiceApplication $upa -UseOnlyPreferredDomainControllers $true
   ```

   [!NOTE] It may take five minutes for the changed property value to propagate to the SharePoint Central Administration website. Resetting IIS on the Central Administration server will force the new value to be loaded immediately. For more information about resetting IIS, see IIS Reset Activity.

For more information, see Get-SPServiceApplication and Set-SPProfileServiceApplication.

## Adjust User Profile synchronization time-outs

[!NOTE] This section applies to SharePoint Server 2013.

A time-out can occur on the following occasions:

- When trying to connect to the directory service server on the Add/Edit a synchronization connection page in Central Administration.
- When trying to populate the list of containers on the Add/Edit a synchronization connection page in Central Administration. This will occur as a JavaScript time-out error in the status bar.
- When clicking **OK** on the Add/Edit a synchronization connection page in Central Administration. This causes the following error message and occurs because of a time-out by the Forefront Identity Manager web service when creating or updating a User Profile synchronization connection: "The request channel timed out while waiting for a reply after 00:01:29.9062626.
Increase the timeout value passed to the call to Request or increase the SendTimeout value on the Binding. The time allocated to this operation may have been a part of a longer timeout."

### To adjust User Profile synchronization time-outs by using Windows PowerShell

1. If you want to change the time-out value for connecting to the directory server, do the following:

   Paste the following code into a text editor, such as Notepad:

   ```
   $upsAppProxy = Get-SPServiceApplicationProxy <GUID>
   $upsAppProxy.LDAPConnectionTimeout = <Timeout>
   $upsAppProxy.Update()
   ```

   Replace `<GUID>` with the GUID of the User Profile service application proxy and `<Timeout>` with the new time-out value in seconds. The default time-out is 120 seconds.

   Save the file as an ANSI-encoded text file whose extension is .ps1.

2. If you want to change the time-out value for the Populate Containers control, do the following:

   Paste the following code into a text editor, such as Notepad:

   ```
   $upsAppProxy = Get-SPServiceApplicationProxy <GUID>
   $upsAppProxy.ImportConnAsyncTimeout = <Timeout>
   $upsAppProxy.Update()
   ```

   Replace `<GUID>` with the GUID of the User Profile service application proxy and `<Timeout>` with the new time-out value in seconds. The default time-out is 1,000 seconds (approximately 17 minutes).

   Save the file as an ANSI-encoded text file whose extension is .ps1.

3. If you want to change the time-out value for calls into the Forefront Identity Manager web service, do the following:

   Paste the following code into a text editor, such as Notepad:

   ```
   $upsApp = Get-SPServiceApplication <GUID>
   $upsApp.FIMWebClientTimeOut = <Timeout>
   $upsApp.Update()
   ```

   Replace `<GUID>` with the GUID of the User Profile service application and `<Timeout>` with the new time-out value in milliseconds. The default time-out is 300,000 milliseconds (5 minutes).

   Save the file as an ANSI-encoded text file whose extension is .ps1, such as AdjustProfileSyncTimeouts.ps1.

4. On the **Start** menu, click **All Programs**.

5. Click **Microsoft SharePoint 2013 Products**.

6. Click **SharePoint 2013 Management Shell**.

7. Change to the directory where you saved the file.
8. At the Microsoft PowerShell command prompt, type the following command to execute a script file:

   ```
   .\<FileName>.ps1
   ```

   Where `<FileName>` is the name of the file to execute.

For more information, see Get-SPServiceApplicationProxy and Get-SPServiceApplication.
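Before changing any of the time-out values above, it can be useful to record the current settings. The following sketch uses the property names from the procedures in this article and looks the User Profile objects up by type name rather than by GUID; run it from the SharePoint Management Shell and verify that the filter matches exactly one object on your farm.

```
# Example only: read the current time-out values without changing anything.
$upsAppProxy = Get-SPServiceApplicationProxy | Where-Object {$_.TypeName -like "*User Profile*"}
$upsAppProxy.LDAPConnectionTimeout      # directory connection time-out (seconds)
$upsAppProxy.ImportConnAsyncTimeout     # Populate Containers time-out (seconds)

$upsApp = Get-SPServiceApplication | Where-Object {$_.TypeName -like "*User Profile*"}
$upsApp.FIMWebClientTimeOut             # FIM web service time-out (milliseconds)
```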
---
title: "Managing a MinRole Server Farm in SharePoint Servers 2016, 2019, and Subscription Edition"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 7/24/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: conceptual
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 883fac19-3048-4ef0-b473-10b3b05493b6
description: "Learn how to manage your MinRole farm deployment in SharePoint Server."
---

# Managing a MinRole Server Farm in SharePoint Servers 2016, 2019, and Subscription Edition

[!INCLUDEappliesto-xxx-2016-2019-SUB-xxx-md]

Learn how to manage your MinRole farm deployment in SharePoint Servers 2016, 2019, and Subscription Edition. MinRole is a farm topology based on a set of predefined server roles introduced in SharePoint Server 2016. When configuring your SharePoint farm, you now select the role of a server when you create a new farm or join a server to an existing farm. SharePoint will automatically configure the services on each server based on the server's role. SharePoint Servers 2016, 2019, and Subscription Edition are optimized for the MinRole farm topology.

## MinRole administration

### Central Administration changes for MinRole

With the addition of the MinRole feature, there are several changes to the Central Administration website.

#### Manage servers in this farm

This page shows the servers that are joined to the farm. Two columns have been added to this page: **Role** and **Compliant**.

:::image type="content" alt-text="Displays Servers In Farm for the November PU 2016 in SharePoint Server 2016 (Feature Pack 1)" source="../media/44119bfc-88ed-47c6-a5cb-0408b03f06eb.png" lightbox="../media/44119bfc-88ed-47c6-a5cb-0408b03f06eb.png":::

The **Role** column displays the role that is assigned to the server in the farm.
[!NOTE] In the "Servers in Farm" page, the SQL server and SMTP server will be listed as belonging to the "External" role, while in Microsoft PowerShell, their server roles are displayed as "Invalid". These two role names are equivalent.

The **Compliant** column displays whether the server configuration is in compliance with its server role. If the server isn't in compliance, a **Fix** link will be provided to automatically reconfigure the server to match the expected configuration of its server role.

[!NOTE] Only members of the local Administrators group on the server that hosts Central Administration have access to the **Fix** link.

#### Manage services in this farm

This is a new page in the **System Settings** category of Central Administration. It displays the state of each service in the farm. This page has three columns of interest: **Auto Provision**, **Action**, and **Compliant**.

:::image type="content" alt-text="Displays services in a SharePoint Servers 2016 and 2019 farm." source="../media/90e10233-ba14-4bc4-8213-b8866a7ae2b1.PNG" lightbox="../media/90e10233-ba14-4bc4-8213-b8866a7ae2b1.PNG":::

The **Auto Provision** column displays whether the service is enabled in the farm. If the value **Yes** is displayed, service instances for this service will be started on the appropriate MinRole-managed servers in the farm. If the value **No** is displayed, service instances for this service will be stopped on the appropriate MinRole-managed servers in the farm.

The **Action** column displays one of three values, depending on the type of service it is and whether it's enabled in the farm: **Manage Service Application**, **Disable Auto Provision**, and **Enable Auto Provision**. The **Manage Service Application** value indicates that the service is associated with a service application. This service will be enabled or disabled in the farm by its service application, typically when you create or delete the service application. Click the link to access the Service Application Management page.
[!NOTE] The **Manage Service Application** link will only appear for services that support service applications.

The **Disable Auto Provision** link disables the service in the farm. When you click this link, all service instances associated with this service will be stopped on the appropriate MinRole-managed servers in the farm. The **Enable Auto Provision** link enables the service in the farm. When you click this link, service instances for this service will be started on the appropriate MinRole-managed servers in the farm.

The **Compliant** column displays whether the service is in compliance on every server in the farm. If this service isn't in compliance on one or more servers, a **Fix** link will be provided. Click this link to automatically reconfigure the service instances of this service to match the expected configuration.

[!NOTE] Only members of the local Administrators group on the server that hosts Central Administration have access to the **Fix** link.

#### Manage services on server

This page displays all of the service instances on a server. Some things have changed, as highlighted in red in the following diagram.

:::image type="content" alt-text="Displays services on servers in SharePoint Servers 2016 and 2019." source="../media/65dd3268-93e8-47ea-9486-7c500b8af90c.PNG" lightbox="../media/65dd3268-93e8-47ea-9486-7c500b8af90c.PNG":::

In previous releases of SharePoint, this page was accessible only to members of the local Administrators group on the Central Administration server. From SharePoint Server 2016, all members of the SharePoint Farm Administrators group have access to this page.

The role of the server is now displayed next to the name of the server.

The **Compliant** column has been added to the page. It displays whether the service instance is in compliance on this server. If this service instance isn't in compliance on this server, a **Fix** link will be provided.
Click this link to automatically reconfigure the service instance on this server to match the expected configuration.

[!NOTE] Only members of the local Administrators group on the server that hosts Central Administration have access to the **Fix** link.

The **Action** column has changed. The link to start or stop a service has been removed for servers that are managed by MinRole. The only actionable item is **Restart**, for service instances that are already started on this server. To start or stop a service, click the **Enable Auto Provision** or **Disable Auto Provision** link in the Manage services in this farm page.

[!NOTE] Servers that are assigned to the Custom role will still display the **Start** and **Stop** links in the **Action** column.

[!NOTE] Only members of the local Administrators group on the server that hosts Central Administration have access to the **Restart**, **Start**, and **Stop** links.

### Manage the services in the farm by using Windows PowerShell

New PowerShell cmdlets have been introduced to manage the services in the farm.

|Cmdlet name|Description|Syntax example|
|:-----|:-----|:-----|
|Get-SPService|The Get-SPService cmdlet gets a service in the farm.|`Get-SPService -Identity "Microsoft SharePoint Foundation Sandboxed Code Service"`|
|Start-SPService|The Start-SPService cmdlet enables a service in the farm. Service instances for this service will be started on the appropriate MinRole-managed servers in the farm.|`Start-SPService -Identity "Microsoft SharePoint Foundation Sandboxed Code Service"`|
|Stop-SPService|The Stop-SPService cmdlet disables a service in the farm. Service instances for this service will be stopped on the appropriate MinRole-managed servers in the farm.|`Stop-SPService -Identity "Microsoft SharePoint Foundation Sandboxed Code Service"`|

[!NOTE] An optional IncludeCustomServerRole parameter has been added to the Start-SPService and Stop-SPService Windows PowerShell cmdlets in the November 2016 Public Update for SharePoint Server 2016 (Feature Pack 1).
If specified, it will also create a timer job that starts or stops service instances on servers that are assigned to the Custom server role. This is a one-time timer job. MinRole will make no further attempts to manage the service instances on servers assigned to the Custom server role.

> Services that have associated service applications cannot be started or stopped by using the Start-SPService and Stop-SPService cmdlets. These services can be started or stopped by creating or deleting their associated service applications. If you use the Start-SPService or Stop-SPService cmdlets with services that have associated service applications, an error message will be shown indicating that the associated service applications should be created or deleted instead.

## Health monitoring

A new health analyzer rule has been created to ensure that your servers are operating in their optimal MinRole configuration. The **Server role configuration isn't correct** rule runs every night at midnight on each server in your farm. It scans all service instances on the server to detect if any aren't in compliance. If any service instance isn't in compliance, the health rule will automatically reconfigure it to match the expected configuration. No manual intervention by the SharePoint farm administrator is required.

:::image type="content" alt-text="Displays health rules for MinRole topology in SharePoint Servers 2016 and 2019." source="../media/df3dd75f-d64f-4a1f-8d5c-57daecc9cb38.PNG" lightbox="../media/df3dd75f-d64f-4a1f-8d5c-57daecc9cb38.PNG":::

The automatic repair functionality of the health rule can be disabled by the SharePoint farm administrator while still allowing the health rule to run. If the health rule detects that a server isn't in compliance and the automatic repair functionality is disabled, it will generate a health report in Central Administration.
The health report will identify which servers aren't in compliance, offer the ability to automatically repair the servers, and provide instructions on how to manually repair the servers. The SharePoint farm administrator can control the health rule schedule, changing it to run more frequently or less frequently, or disabling it so that it's never scheduled. It can also run on demand.

[!NOTE] This health rule will not scan or repair servers that are assigned to the Custom role. A server assigned to the Custom role will not be managed by MinRole.

## Developers: How to assign services to server roles

If you're a SharePoint developer intending to create an application with services, it's recommended that you assign each type of service instance to one or more server roles supported by MinRole:

**Assign services to server roles**

1. Implement your service by inheriting from the SPService class. Set the AutoProvision property value in the constructors of the new service class if you want to enable or disable this service by default.

2. Implement the service instance class of the service by inheriting from the SPServiceInstance class. Override the ShouldProvision(SPServerRole serverRole) method to assign this service to specific server roles if necessary.

For more information about how to subscribe a service to a specific role, see SPService class and SPServiceInstance class.

**Integrate with role conversion pre-validation**

1. Implement the service instance class of the service by inheriting from the SPServiceInstance class.

2. Override the IsReadyForRoleConversion(SPServerRole newRole, out IEnumerable\<string\> errorMessages) method to detect if your service instance is ready for role conversion to the server role specified by the newRole parameter. Return true if it's ready or false if it isn't ready. If you return false, provide a list of messages to explain why the service instance isn't ready for role conversion and instructions for resolving the issue via the errorMessages parameter.
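For farm administrators, the cmdlets described in the "Manage the services in the farm by using Windows PowerShell" section can be combined into a quick audit of how MinRole is provisioning services. The following minimal sketch assumes the SPService objects returned by Get-SPService expose an AutoProvision property (as in SharePoint Server 2016 and later) and must be run from the SharePoint Management Shell.

```
# Example only: run from the SharePoint Management Shell on a farm server.
# List every service in the farm with its auto-provision state, so you can
# see which services MinRole will start on the appropriate servers.
Get-SPService | Sort-Object TypeName | Format-Table TypeName, AutoProvision -AutoSize
```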
---
title: "New health analyzer rules for SSL certificates"
ms.reviewer:
ms.author: serdars
author: nimishasatapathy
manager: serdars
ms.date: 06/20/2022
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 88317397-e0cb-47c7-9093-7872bc685213
description: "Learn about the health analyzer rules that SharePoint Server provides for SSL certificates."
---

# New health analyzer rules for SSL certificates

[!INCLUDE[appliesto-xxx-xxx-xxx-SUB-xxx-md]]

SharePoint has implemented the following four new health analyzer rules for SSL certificates:

- **Certificate notification contacts haven't been configured**: A health rule that provides notification through Central Administration when certificates are in use and no certificate notification contacts have been configured. This health rule runs weekly. Certificate notification contacts receive emails about SSL certificate expirations and can be configured by customers through the Configure certificate management settings page.

- **Upcoming SSL certificate expirations**: A health rule that provides advance notification through both Central Administration and email of upcoming certificate expirations. This health rule runs weekly to notify certificate notification contacts about certificates that are in use and will expire within the next 15 to 60 days. These thresholds are configurable by customers through the Configure certificate management settings page.

- **SSL certificates are about to expire**: A health rule that provides advance notification through both Central Administration and email when certificates are about to expire. This health rule runs daily to notify certificate notification contacts about certificates that are in use and will expire within the next 15 days. This threshold is configurable by customers through the Configure certificate management settings page.

- **SSL certificates have expired**: A health rule that provides notification through both Central Administration and email when certificates have expired. This health rule runs daily to notify certificate notification contacts about certificates that are in use and have expired within the past 15 days. This threshold is configurable by customers through the Configure certificate management settings page.
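The thresholds and notification contacts described above can also be set from PowerShell instead of the Configure certificate management settings page. The following is only a sketch: the cmdlet and parameter names are assumptions modeled on SharePoint Server Subscription Edition certificate management, so verify them in your farm (for example, with `Get-Help Set-SPCertificateSettings`) before use.

```powershell
# Hypothetical sketch: set the certificate notification contact and the
# expiration thresholds used by the health rules described above.
# Cmdlet and parameter names are assumptions; verify with Get-Help first.
Set-SPCertificateSettings `
    -CertificateNotificationContacts "admin@contoso.com" `
    -CertificateExpirationAttentionThreshold 60 `
    -CertificateExpirationWarningThreshold 15
```

The email address and threshold values are placeholders; choose values that match your organization's certificate renewal process.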
---
title: "Performance planning in SharePoint Server 2013"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 12/29/2016
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 8dd52916-f77d-4444-b593-1f7d6f330e5f
description: "Performance and capacity planning is the process of mapping your solution design for SharePoint Server to a farm size and set of hardware that supports your business goals."
---

# Performance planning in SharePoint Server 2013

[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md]]

Relevant performance and capacity planning articles for Project Server 2013 are available in the Project Server document library at Plan for performance and capacity (Project Server 2013).

## Articles about performance and capacity management

The following articles about performance and capacity management are available to view online. Writers update articles on a continuing basis as new information becomes available and as users provide feedback.

|Content|Description|
|:-----|:-----|
|Capacity management and sizing for SharePoint Server 2013|Learn about the concepts and planning considerations for managing the capacity of a SharePoint Server 2013 environment.|
|Software boundaries and limits for SharePoint Server 2013|Learn about the tested performance and capacity limits of SharePoint Server 2013 and how limits relate to acceptable performance.|
|Performance and capacity test results and recommendations (SharePoint Server 2013)|Read articles that use test results and recommendations to estimate performance and capacity requirements for SharePoint Server 2013.|

## Additional resources about performance and capacity management

The following resources about performance and capacity management are available from other subject matter experts.

|Content|Description|
|:-----|:-----|
|Capabilities and features in SharePoint 2013 Resource Center|Visit the TechCenter to access videos, community sites, documentation, and more.|
|Architecture design for SharePoint 2013 IT pros|Visit the TechCenter to access videos, community sites, documentation, and more.|
---
title: "Plan for People Picker in SharePoint"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 8/1/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: interactive-tutorial
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.assetid: 2093c146-c880-48c6-9526-24cdf80969ba
description: "Learn how to plan for the People Picker web control in SharePoint Server."
---

# Plan for People Picker in SharePoint

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md]]

You use the People Picker control to find and select people, groups, and claims when a site, list, or library owner assigns permissions in SharePoint Server. This article describes how to plan for People Picker. For information about how to configure People Picker, see Configure People Picker in SharePoint Server.

Before reading this article, you should understand the concepts described in the following articles:

- Plan for user authentication methods in SharePoint Server
- People Picker and claims providers overview
- The Role of Claims
- SharePoint Claims-Based Identity

## People Picker and claims providers

A claims provider lists, resolves, searches, and determines the "friendly" display of users, groups, and claims in the People Picker when claims-based authentication is used. If your web application uses claims-based authentication, you must decide whether to use one of the default claims providers or create a custom claims provider that meets the business needs of your organization. For more information about how claims providers are related to the People Picker control, see Plan for custom claims providers for People Picker in SharePoint.

## Using People Picker with multiple forests or domains

By default, People Picker returns users, groups, and claims only from the domain on which SharePoint Server is installed. If you want People Picker to return query results from more than one forest or domain, you must configure People Picker to use an encrypted account and password, even if you have a one-way or two-way trust between the forests or domains. For more information about trusts, see Managing Trusts. To configure People Picker for a one-way trust, see Configure People Picker in SharePoint Server.

## Planning considerations for People Picker

Planning for People Picker largely depends on which forests and domains you want users to be able to query, and which users, groups, and claims you want to display in query results.

As you plan for the forests and domains that you want users to query, consider the following questions:

- Do users have to query across a forest or a domain?
- What is the Domain Name System (DNS) name for each forest or domain that you want users to query?
- Will your forest or domain have a one-way or two-way trust with other forests or domains?
- If you are using a one-way trust, what credentials will be used to query the other forests or domains?

Planning for the users, groups, and claims that you want to display in the query results in People Picker will help you determine how to configure People Picker to return and display results from claims providers. As you plan for the users, groups, and claims that you want to display in query results, consider the following questions:

- Are there certain Lightweight Directory Access Protocol (LDAP) filters that you want to apply to query results?
- Do you want to restrict the query results to users, groups, or claims in a specific site collection?
- Do you want to restrict the query results to users, groups, or claims in a certain Active Directory organizational unit (OU)?

## See also

**Concepts**

- Plan for user authentication methods in SharePoint Server
- People Picker and claims providers overview
- Plan for custom claims providers for People Picker in SharePoint

**Other resources**

- Configure People Picker in SharePoint Server
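As a concrete illustration of the cross-forest configuration referenced above (the full procedure is in Configure People Picker in SharePoint Server), the documented approach uses the `stsadm` tool. The forest name, account, and web application URL below are placeholders for your environment:

```powershell
# Sketch of cross-forest People Picker configuration with stsadm.
# All values are placeholders; run from an elevated SharePoint Management Shell.

# 1. Set the encryption key used to store the query credentials.
#    Run this on every web server in the farm, using the same key.
stsadm -o setapppassword -password "<key>"

# 2. Register the forest (or domain) to query, with the account and password
#    People Picker should use for the query, per web application.
stsadm -o setproperty -pn peoplepicker-searchadforests `
    -pv "forest:fabrikam.com,FABRIKAM\svc-pplpicker,<password>" `
    -url "http://contoso-webapp"
```

Because the credentials are stored encrypted, step 1 must be completed before step 2 on each server, and the same key must be used throughout the farm.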
---
title: "Profile schema reference in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 3/5/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 05bdd5cb-2c97-40ca-bbac-bb91d300ad5c
description: "Understand the XML schema for profiles for use in scripted monitoring configuration for SharePoint Server."
---

# Profile schema reference in SharePoint Server

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md]]

When you run the BackupMonitoringSettings.ps1 Microsoft PowerShell script on a SharePoint farm, you create a file that's called a Profile. The Profile follows an XML schema. You can modify settings of elements of the schema to create a custom Profile. You can then use the custom Profile to automate configuration of the monitoring settings in a SharePoint environment. For an introduction to scripted monitoring configuration, see Overview of scripted monitoring configuration in SharePoint Server.

Administrators can run the scripts before, during, and after changes to the farm, such as changes to farm topology, major security changes, applying software updates, or running a performance test. The scripts alter the monitoring settings so that all of the necessary monitoring data is collected during the event without flooding the Logging database during normal operation.

> [!NOTE]
> You must download the PowerShell scripts to back up, restore, or modify the farm monitoring settings. The scripts are available on the TechNet Gallery at Scripted Monitoring Configuration - BackupMonitoringSettings and Scripted Monitoring Configuration - AlterMonitoringSettings.

The BackupMonitoringSettings.ps1 PowerShell script creates the backup Profile from which you can create other Profiles. You can create one or more Profiles to adjust the level of monitoring during different phases of the SharePoint lifecycle. You can also use a custom Profile to configure monitoring on several farms at once. You would typically create Profiles for the following purposes:

- To complete the configuration of the monitoring settings on a farm after you install SharePoint Server
- To change the monitoring settings on a farm just before an administrative change, such as changing the settings of a Search service application on the farm. As a result, you can capture more monitoring data related to that change and suppress unwanted monitoring data. Then you can return the monitoring settings to the original values after the change has been completed.
- To restore the monitoring settings on a farm after some administrative change
- To restore a previous set of monitoring settings on a farm. You might do this if you are making manual adjustments to the settings and decide that you want to restore the previous settings.
- To restore the default settings
- To create a profile that you can apply to multiple farms

## Backing up the original settings

You should always back up the default monitoring settings before altering them. That way, you can restore those settings should you need to. The backup Profile also can serve as the beginning point for the other Profiles you will create. For more information about how to back up the settings, see Run scripted monitoring configuration in SharePoint Server.

## Understanding the Profile schema

When you run the BackupMonitoringSettings.ps1 PowerShell script, you create a Profile that conforms to the following XML schema. The elements of the file contain the associated monitoring settings from the farm.

```
```

> [!IMPORTANT]
> In the following tables, you cannot change values in fields that are marked as Read-Only. If you change values in Read-Only fields in your Profiles, unpredictable results may occur.
**Settings for the FarmDiagnosticConfig element**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|AllowLegacyTraceProviders|Boolean|Specifies that trace providers built for previous versions of SharePoint Products and Technologies can write to the trace session for SharePoint Server.|
|AppAnalyticsAutomaticUploadEnabled|Boolean|Specifies whether aggregated app usage data is automatically uploaded to Microsoft.|
|CustomerExperienceImprovementProgramEnabled|Boolean|Determines whether a Management Group has enabled the Customer Experience Improvement Program feature.|
|ErrorReportingEnabled|Boolean|Gets or sets a value to indicate whether crash data collection and error reporting is enabled.|
|ErrorReportingAutomaticUploadEnabled|Boolean|Specifies whether participation in the Customer Experience Improvement Program (CEIP) is enabled. The CEIP is designed to improve the quality, reliability, and performance of Microsoft products and technologies. With your permission, anonymous information about your server is sent to Microsoft to help improve SharePoint Server.|
|DownloadErrorReportingUpdatesEnabled|Boolean|Specifies whether error reports are uploaded to Microsoft automatically. Error reports include: information about the condition of the server when a problem occurs, the operating system version and computer hardware in use, and the digital product ID, which can be used to identify your license. Note: The IP address of your computer is also sent because you are connecting to an online service to send error reports; however, the IP address is used only to generate aggregate statistics.|
|DaysToKeepLogs|Integer|Specifies the number of days to keep trace log files. The type must be a valid number between 1 and 366. The default value is 14 days.|
|LogMaxDiskSpaceUsageEnabled|Boolean|Specifies whether to restrict the maximum space to use for trace log files.|
|LogDiskSpaceUsageGB|Integer|Specifies the maximum amount of storage to use for trace log files, in gigabytes (GB). The default value is 1000 and only takes effect when the LogMaxDiskSpaceUsageEnabled parameter is set to True. The type must be a valid number between 1 and 1000.|
|LogLocation|String: Path|The full path to the location where you want log files to be stored. It can be a remote location. Examples: "%CommonProgramFiles%\Microsoft Shared\Web Server Extensions\16\LOGS\" and "%CommonProgramFiles%\Microsoft Shared\Web Server Extensions\15\LOGS\"|
|LogCutInterval|Integer|Specifies a time period to roll over to the next log file. The type must be a valid number between 0 and 1440.|
|EventLogFloodProtectionEnabled|Boolean|Specifies whether the Event log flood protection feature is enabled. If multiple similar events are written to the event log, some duplicate messages are suppressed. After a period of time, a summary message shows how many events were suppressed.|
|EventLogFloodProtectionThreshold|Integer|Specifies the number of events allowed in a given timeframe before an event is considered to be flooding the event log. The integer range is between 1 and 100. The default value is 5.|
|EventLogFloodProtectionTriggerPeriod|Integer|Specifies in minutes the timeframe to watch for events that may be flooding. The integer range is between 1 and 1440. The default value is 2.|
|EventLogFloodProtectionQuietPeriod|Integer|Specifies in minutes how much time must pass without an event firing to exit flood protection. The integer range is between 1 and 1440. The default value is 2.|
|EventLogFloodProtectionNotifyInterval|Integer|Specifies in minutes how often to write a summary event that indicates how many events were suppressed due to flood protection. The integer range is between 1 and 1440. The default value is 5.|
|ScriptErrorReportingEnabled|Boolean|Enables or disables the reporting of script errors in the log file.|
|ScriptErrorReportingRequireAuth|Boolean|Specifies whether script error reporting requires authentication.|
|ScriptErrorReportingDelay|Integer|Specifies the time in minutes between script error reports. The value must be a valid integer between 0 and 1440. The default value is 30.|

Use the following table for the UsageServices settings.

**The elements of the UsageServices settings**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|ID|GUID: Read-Only|A GUID, in the form 12345678-90ab-cdef-1234-567890bcdefgh.|
|UsageLogLocation|Path|Specifies the path on every computer in the farm where usage log files are created. The same path must exist on all computers in the farm.|
|LoggingEnabled|Boolean|Specifies that usage data is logged to usage files.|
|UsageLogMaxFileSizeKB|Integer|Specifies the maximum size of a single usage file that is applied to all the usage providers. The minimum value is 512 kilobytes (KB) and the maximum value is 65536 KB.|
|UsageLogCutTime|Integer|Specifies the time in minutes of usage data that is collected per usage log file. The default time is 5 minutes. The value must be an integer in the range of 1 to 1440.|

Use the following table for the UsageDefinition settings.

**The elements of the UsageDefinition settings**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|ID|GUID: Read-Only|A GUID, in the form 12345678-90ab-cdef-1234-567890bcdefgh.|
|Name|String: Read-Only|The string name of the UsageDefinition.|
|DaysRetained|Integer|Specifies the number of days to retain usage data for the usage provider in the usage service database. The default value is 14. The type must be an integer between 0 and 31.|
|DaysToKeepUsageFiles|Integer|Specifies the number of days to retain usage files. The value must be less than or equal to the value of the DaysRetained parameter.|
|Enabled|Boolean|Enables or disables the specified usage provider.|

Use the following table for the LogLevel settings.

**The elements of the LogLevel settings**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|Area|String: Read-Only|The component or service that the LogLevel applies to.|
|Identity|String: Read-Only|Specifies the names of the category or set of categories to set the throttle for; for example, "Unified Logging Service". Note: If the Identity parameter is not specified, the event-throttling setting is applied to all categories in the farm.|
|EventSeverity|String|Specifies the category level to be set. The category level is any one of the following values: None, ErrorCritical, Error, Warning, Information, Verbose.|
|TraceSeverity|String|Specifies the trace throttle to set the specified categories to. The trace log files are text files that are written to the trace log path that is defined on the Diagnostic Logging Settings page on the SharePoint Central Administration website. The type must be any one of the following values: None, Unexpected, Monitorable, High, Medium, Verbose, VerboseX.|

Use the following table for the TimerJob settings.

**The elements of the TimerJob settings**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|Identity|GUID: Read-Only|Specifies the timer job to update. The type must be a valid GUID, in the form 12345678-90ab-cdef-1234-567890bcdefgh.|
|Schedule|String|Specifies the schedule for running the timer job. The type must be a valid SharePoint Timer service (SPTimer) schedule in the form of any one of the following schedules: Every 5 minutes between 0 and 59; Hourly between 0 and 59; Daily at 15:00:00; Weekly between Fri 22:00:00 and Sun 06:00:00; Monthly at 15 15:00:00; Yearly at Jan 1 15:00:00.|
|Enabled|Boolean|Enables or disables the timer job.|

Use the following table for the HealthAnalyzerRule settings.

**The elements of the HealthAnalyzerRule settings**

|Name|Value type|Notes|
|:-----|:-----|:-----|
|Identity|GUID: Read-Only|Specifies the name or GUID of the health analyzer rule to set.|
|Enabled|Boolean|Enables or disables the health analyzer rule.|
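To make the element tables above more concrete, the following fragment sketches what a minimal custom Profile might look like when you only want to change a couple of settings. The element names come from the tables above, but the exact layout (nesting, element versus attribute placement) is an assumption; always derive the real shape from a backup Profile created by BackupMonitoringSettings.ps1.

```xml
<!-- Hypothetical minimal Profile fragment. Only the settings you want to
     change need to appear; unspecified settings are left unchanged.
     The element layout is illustrative, not authoritative. -->
<MonitoringProfile>
  <FarmDiagnosticConfig>
    <!-- Keep trace logs longer during a planned change window. -->
    <DaysToKeepLogs>30</DaysToKeepLogs>
    <EventLogFloodProtectionEnabled>True</EventLogFloodProtectionEnabled>
  </FarmDiagnosticConfig>
  <!-- Raise trace detail for one category only. -->
  <LogLevel Area="SharePoint Server"
            Identity="Unified Logging Service"
            EventSeverity="Warning"
            TraceSeverity="Medium" />
</MonitoringProfile>
```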
## Create Profiles

You can create an unlimited number of Profiles as needed. Each Profile might be used for a different purpose, such as to increase the levels of monitoring before a specific change to the environment, or to lower the levels after a change.

You only need to create profile entries for the specific changes that you want to make. The other settings will remain unchanged. For example, if you want to change a few LogLevel settings, then you only need to specify those settings in the Profile. Settings that are not specified in the Profile will not be changed.

You might want to use a naming convention for your Profiles so that you can organize them and more easily use them.

> [!IMPORTANT]
> Always back up the monitoring settings before making any changes to them. Always work from a copy of the backup Profile and never from the original backup file itself.

## See also

**Concepts**

- Overview of scripted monitoring configuration in SharePoint Server
- Run scripted monitoring configuration in SharePoint Server
---
ms.date: 03/13/2018
title: "Restore farms in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: 7942ef65-c309-402d-b4bb-d54e686fc5d9
description: "Learn how to restore a SharePoint Server farm."
---

# Restore farms in SharePoint Server

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md]]

You can restore a SharePoint Server farm by using the SharePoint Central Administration website, Microsoft PowerShell, or SQL Server tools. The backup tool that you use depends on the kind of environment that you have deployed, the backup schedule, and service level agreements that you have made with your organization.

## Before you begin

Farm-level recovery is performed only after a failure that involves the complete farm, or where partial recovery of part of the farm isn't possible. If you only have to restore part of the farm (a specific database, a service application, a list or document library, or a specific document), use another recovery method. For more information about alternate forms of recovery, see Related content.

Farm recovery is performed for any of the following reasons:

- Restoring a farm after a fire, disaster, equipment failure, or other data-loss event.
- Restoring farm configuration settings and data to a specific previous time and date.
- Moving a SharePoint Server deployment from one farm to another farm.

Before you begin this operation, review the following information about how to recover a farm in SharePoint:

- You can't back up from one version of SharePoint Server 2019 and restore to another version of SharePoint Server 2019. The same applies to SharePoint Server 2016 and SharePoint Server 2013.

- Backing up the farm will back up the configuration and Central Administration content databases, but these can't be restored by using SharePoint Server tools. For more information about how to back up and restore all of the farm databases, see Move all databases in SharePoint Server.

- When you restore the farm by using SharePoint Server, the restore process won't automatically start all of the service applications. You must manually start them by using Central Administration or Microsoft PowerShell. Don't use the SharePoint Products Configuration Wizard to start the services, because doing this will also reprovision the services and service proxies. For more information, see Start or stop a service in SharePoint Server.

- The identifier (ID) of each content database is retained when you restore or reattach a database by using built-in tools. Default change log retention behavior when using built-in tools is as follows:

  - The change logs for all databases are retained when you restore a farm.
  - The change log for content databases is retained when you reattach or restore a database.

  When a database ID and change log are retained, the search system continues crawling based on the regular schedule that is defined by crawl rules.

- When you restore an existing database and don't use the overwrite option, a new ID is assigned to the restored database, and the database change log isn't preserved. The next crawl of the database will add data from the content database to the index.

- If a restore is performed and the ID in the backup package is already being used in the farm, a new ID is assigned to the restored database and a warning is added to the restore log.

- The ability to perform an incremental crawl instead of a full crawl depends on the content database ID being the same as before and the change log token that is used by the search system being valid for the current change log in the content database. If the change log isn't preserved, the token isn't valid and the search system has to perform a full crawl.

- SharePoint Server backup backs up the Business Data Connectivity service external content type definitions but doesn't back up the data source itself. To protect the data, you should back up the data source when you back up the Business Data Connectivity service or the farm. If you restore the Business Data Connectivity service or the farm and then restore the data source to a different location, you must change the location information in the external content type definition. If you don't, the Business Data Connectivity service might be unable to locate the data source.

- SharePoint Server restores remote Binary Large Object (BLOB) stores only if you're using the FILESTREAM remote BLOB store provider to put data in remote BLOB stores. If you're using another provider, you must manually restore the remote BLOB stores.

- If you're sharing service applications across farms, be aware that trust certificates that were exchanged aren't included in farm backups. You must back up your certificate store separately or keep the certificates in a separate location. When you restore a farm that shares a service application, you must import and redeploy the certificates, and then re-establish any inter-farm trusts. For more information, see Exchange trust certificates between farms in SharePoint Server.

- After a web application that is configured to use claims-based authentication is restored, duplicate or additional claims providers are often visible. If duplicates appear, you must manually save each web application zone to remove them. For more information, see Restore web applications in SharePoint Server.

- Additional steps are required when you restore a farm that contains a web application that is configured to use forms-based authentication. For more information, see Restore web applications in SharePoint Server.

## Using PowerShell to restore a farm in SharePoint

You can use Microsoft PowerShell to restore a farm.
**To restore a farm by using PowerShell**

1. Verify that you have the following memberships:

   - securityadmin fixed server role on the SQL Server instance.
   - db_owner fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you're running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see Add-SPShellAdmin.

2. Open the SharePoint Management Shell.

3. At the PowerShell command prompt, type the following command:

   ```powershell
   Restore-SPFarm -Directory <BackupFolder> -RestoreMethod Overwrite [-BackupId <GUID>]
   ```

   Where:

   - `<BackupFolder>` is the path of the folder you use for storing backup files.
   - `<GUID>` is the identifier of the backup to restore from.

   > [!NOTE]
   > If you are not logged on as the Farm account, you are prompted for the Farm account's credentials.

   If you don't specify the BackupId, the most recent backup will be used. To view the backups for the farm, at the PowerShell command prompt, type the following command:

   ```powershell
   Get-SPBackupHistory -Directory <BackupFolder> -ShowBackup [-Verbose]
   ```

   Where `<BackupFolder>` is the path of the folder you use for storing backup files.

   You can't use a configuration-only backup to restore content databases together with the configuration.

4. To restart a service application, at the PowerShell command prompt, type the following command:

   ```powershell
   Start-SPServiceInstance -Identity <ServiceApplicationID>
   ```

   Where `<ServiceApplicationID>` is the GUID of the service application.

   For more information about how to restart service applications by using PowerShell, see Start-SPServiceInstance.

For more information about how to restore the farm by using PowerShell, see Restore-SPFarm.

## Using Central Administration to restore a farm

You can use the Central Administration website to restore a farm.
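Putting the cmdlets above together, an end-to-end restore session might look like the following sketch. The UNC path and the service instance GUID are placeholders for your environment:

```powershell
# Sketch: restore the most recent farm backup from a UNC share, then restart
# a stopped service instance. Paths and GUIDs are placeholders.
$backupDir = "\\backupserver\spbackups"

# List the available backups so you can pick a specific -BackupId if needed.
Get-SPBackupHistory -Directory $backupDir -ShowBackup

# Restore the farm; the most recent backup is used when -BackupId is omitted.
Restore-SPFarm -Directory $backupDir -RestoreMethod Overwrite

# After the restore completes, restart a service instance by its GUID
# (repeat for each service application that needs to run).
Start-SPServiceInstance -Identity "12345678-90ab-cdef-1234-567890abcdef"
```

Run this from the SharePoint Management Shell with the memberships listed in step 1; you're prompted for the Farm account's credentials if you aren't logged on as that account.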
**To restore a farm by using Central Administration**

1. Verify that the user account that is performing this procedure is a member of the Farm Administrators SharePoint group.

2. In Central Administration, on the home page, in the Backup and Restore section, click Restore from a backup.

3. On the Restore from Backup—Step 1 of 3: Select Backup to Restore page, from the list of backups, select the backup job that contains the farm backup, and then click Next. You can view more details about each backup by clicking the (+) next to the backup.

   > [!NOTE]
   > If the correct backup job does not appear, in the Backup Directory Location text box, type the Universal Naming Convention (UNC) path of the correct backup folder, and then click Refresh. You cannot use a configuration-only backup to restore the farm.

4. On the Restore from Backup—Step 2 of 3: Select Component to Restore page, select the check box that is next to the farm, and then click Next.

5. On the Restore from Backup—Step 3 of 3: Select Restore Options page, in the Restore Component section, make sure that Farm appears in the Restore the following component list.

6. In the Restore Only Configuration Settings section, make sure that the Restore content and configuration settings option is selected.

7. In the Restore Options section, under Type of Restore, select the Same configuration option. A dialog will appear that asks you to confirm the operation. Click OK.

   > [!NOTE]
   > If the Restore Only Configuration Settings section does not appear, the backup that you selected is a configuration-only backup. You must select another backup.

8. Click Start Restore.

9. You can view the general status of all recovery jobs at the top of the Backup and Restore Job Status page in the Readiness section. You can view the status for the current recovery job in the lower part of the page in the Restore section. The status page updates every 30 seconds automatically. You can manually update the status details by clicking Refresh. Backup and recovery are Timer service jobs. Therefore, it may take several seconds for the recovery to start. If you receive any errors, you can review them in the Failure Message column of the Backup and Restore Job Status page. You can also find more details in the Sprestore.log file at the UNC path that you specified in step 3.

10. When the restore process has completed, you may need to restart one or more service applications. In Central Administration, on the home page, in the Systems Settings section, click Manage services on server.

11. On the Services on Server page, start any services related to service applications that you want to run by clicking Restart in the Action column next to the service application.

12. Re-establish any trust relationships. For more information, see Exchange trust certificates between farms in SharePoint Server.

## Using SQL Server tools to restore a farm

Although you can't restore the complete farm by using SQL Server tools, you can restore most of the farm databases. If you restore the databases by using SQL Server tools, you must restore the farm configuration by using Central Administration or PowerShell. For more information about how to restore the farm's configuration settings, see Restore farm configurations in SharePoint Server.

> [!NOTE]
> The search index is not stored in SQL Server. If you use SQL Server tools to back up and restore search, you must perform a full crawl after you restore the content database.

Before you restore SharePoint Server, we recommend that you configure a recovery farm for site and item recovery.

Restore the databases by following these steps:

1. If possible, back up the live transaction log of the current database to protect any changes that were made after the last full backup.

2. Restore the last full database backup.

3. Restore the most recent differential database backup that occurred after the most recent full database backup.
4. Restore all transaction log backups that occurred after the most recent full or differential database backup.

Use the following procedure to restore your farm databases.

**To restore a farm by using SQL Server tools**

1. Verify that the user account that is performing this procedure is a member of the sysadmin fixed server role.

2. If the SharePoint Timer service is running, stop the service and wait for several minutes for any currently running stored procedures to finish. Don't restart the service until after you restore all the databases that you have to restore.

3. Start SQL Server Management Studio and connect to the database server.

4. In Object Explorer, expand Databases.

5. Right-click the database that you want to restore, point to Tasks, point to Restore, and then click Database. The database is automatically taken offline during the recovery operation and can't be accessed by other processes.

6. In the Restore Database dialog, specify the destination and the source, and then select the backup set or sets that you want to restore. The default values for destination and source are appropriate for most recovery scenarios.

7. In the Select a page pane, click Options.

8. In the Restore options section, select only Overwrite the existing database. Unless your environment or policies require otherwise, don't select the other options in this section.

9. In the Recovery state section:

   - If you have included all the transaction logs that you must restore, select RECOVER WITH RECOVERY.
   - If you must restore additional transaction logs, select RECOVER WITH NORECOVERY.
   - The third option, RECOVER WITH STANDBY, isn't used in this scenario.

   > [!NOTE]
   > For more information about these recovery options, see Restore Database (Options Page).

10. Click OK to complete the recovery operation.

11. Except for the configuration database, repeat steps 4 through 9 for each database that you're restoring.
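The restore sequence above (full backup, then differential, then transaction logs, recovering only on the last step) can also be scripted. The following sketch uses `Invoke-Sqlcmd` from the SqlServer PowerShell module; the instance, database, and backup file names are placeholders for your environment:

```powershell
# Sketch: scripted equivalent of the restore order described above.
# Server, database, and file names are placeholders; requires the
# SqlServer module and sysadmin rights on the instance.
Invoke-Sqlcmd -ServerInstance "SQL01" -Query @"
-- Full backup first; leave the database non-operational (NORECOVERY)
-- so further backups can be applied. REPLACE overwrites the existing DB.
RESTORE DATABASE [WSS_Content]
FROM DISK = N'\\backupserver\spbackups\WSS_Content_full.bak'
WITH NORECOVERY, REPLACE;

-- Most recent differential backup, still NORECOVERY.
RESTORE DATABASE [WSS_Content]
FROM DISK = N'\\backupserver\spbackups\WSS_Content_diff.bak'
WITH NORECOVERY;

-- Final transaction log backup; RECOVERY brings the database online.
RESTORE LOG [WSS_Content]
FROM DISK = N'\\backupserver\spbackups\WSS_Content_log.trn'
WITH RECOVERY;
"@
```

If you have more than one transaction log backup, restore each with NORECOVERY and use RECOVERY only on the last one.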
> [!IMPORTANT]
> If you are restoring the User Profile database (by default named "User Profile Service_ProfileDB_&lt;GUID&gt;"), then also restore the Social database (by default named "User Profile Service_SocialDB_&lt;GUID&gt;"). Failing to do this can cause inaccuracies in the User Profile data that might be difficult to detect and fix.

To restore the configuration settings, you must use the existing configuration database or manually create a new database and restore the configuration to that database. For more information about how to restore the farm configuration, see Restore farm configurations in SharePoint Server.

1. Start the SharePoint Timer service.

2. Start any service applications that have to be restarted. In Central Administration, on the home page, in the **System Settings** section, click **Manage services on server**. On the Services on Server page, start any services related to service applications that you want to run by clicking **Restart** in the **Action** column next to the service application.

## Related content

The following list shows other recovery methods that you can use when you only need to restore part of your farm:

- Back up farms in SharePoint Server
- Restore farm configurations in SharePoint Server
- Restore web applications in SharePoint Server
- Restore content databases in SharePoint Server
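The Timer service and service-instance steps described in this article can also be run from an elevated SharePoint Management Shell. This is an illustrative sketch, not part of the original procedure; it assumes the standard `SPTimerV4` Windows service name and the built-in `Get-SPServiceInstance`/`Start-SPServiceInstance` cmdlets, and the server name is a hypothetical placeholder:

```powershell
# Restart the SharePoint Timer service on the local server.
Restart-Service -Name SPTimerV4

# Review the service instances on a server and find any that are not online.
# "SP2016APP01" is a hypothetical server name; adjust to your farm.
Get-SPServiceInstance -Server "SP2016APP01" |
    Where-Object { $_.Status -ne "Online" } |
    Format-Table TypeName, Status, Id

# Start a specific service instance by the Id shown in the table above:
# Start-SPServiceInstance -Identity <GUID>
```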
---
title: "Search Engine Optimization (SEO) in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 12/29/2016
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: fa731103-0390-4ee7-a7ad-79b355dfc3c0
description: "Learn about Search Engine Optimization (SEO) in SharePoint Server 2016."
---

# Search Engine Optimization (SEO) in SharePoint Server

[!INCLUDEappliesto-xxx-2016-xxx-xxx-xxx-md]

If you're a website owner, you know how important it is that users can easily find your website by using Internet search engines such as Bing or Google. The higher your website is shown in the search results list, the more likely it is that users will click on it. Just think of your own behavior when looking at search results. When was the last time that you clicked to view the second page of search results?

## Optimizing SharePoint Server 2016 websites for Internet search engines

The white paper Optimizing SharePoint Server 2013 websites for Internet search engines explains how to apply SEO features to your SharePoint Server 2016 website so that Internet search engines will display it high in their search results list. The white paper covers subjects such as the following:

- SEO activities that you can do in the planning phase of setting up a new website
- New SharePoint Server 2016 SEO features and how you can use them
- SEO for websites that use cross-site publishing
- How to handle common SEO challenges
---
title: "Share data connections by using Excel and Excel Services (SharePoint Server 2013)"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 7/7/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.assetid: c6284830-1127-472d-9610-8a0b9b0298aa
description: "Excel Services in SharePoint Server 2013 enables you to work with different external data sources to create reports, scorecards, and dashboards that remain up to date automatically."
---

# Share data connections by using Excel and Excel Services (SharePoint Server 2013)

[!INCLUDEappliesto-2013-xxx-xxx-xxx-xxx-md]

Excel can connect to lots of different data sources. These include SQL Server, a SharePoint list, an Access database, an Azure DataMarket feed, an OData feed, and so on. Many of the data connections that you can use in Excel are supported as data connections in Excel Services. This means that people can refresh data in Excel Services reports, scorecards, and dashboards that use those data connections. More specifically, Excel Services supports connections to SQL Server tables, SQL Server Analysis Services cubes, and custom OLE DB/ODBC data providers.

By storing data connections in an Excel Services trusted data connection library, people can easily access the data sources that they need without having to know the names of servers and databases. Data connections are reusable, so people can create multiple reports or workbooks using those data connections. In addition, Excel Services workbooks that use connections in a trusted data connection library can be updated so that people have easy access to current information.

To learn how to create and publish external data connections using Excel Services, see the articles in this section.
## In this section

- Share a SQL Server Analysis Services data connection using Excel Services (SharePoint Server 2013)
- Share a SQL Server data connection using Excel Services (SharePoint Server 2013)
- Share an OLE DB or ODBC connection using Excel Services (SharePoint Server 2013)

## See also

**Concepts**

- Business intelligence capabilities in Excel Services (SharePoint Server 2013)
- Data sources supported in Excel Services (SharePoint Server 2013)
---
title: "SSL certificate management in central administration"
ms.reviewer:
ms.author: serdars
author: nimishasatapathy
manager: serdars
ms.date: 06/20/2022
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 88317397-e0cb-47c7-9093-7872bc685213
description: "Learn how SharePoint supports managing your Secure Sockets Layer (SSL) certificates in Central Administration."
---

# SSL certificate management in central administration

[!INCLUDEappliesto-xxx-xxx-xxx-SUB-xxx-md]

Besides managing SSL certificates through PowerShell cmdlets, SharePoint also supports managing your SSL certificates in Central Administration. You'll see a new **Certificates** section in the Security landing page of Central Administration. Within this section you'll find links to **Manage certificates**, **Configure certificate management settings**, and **View certificate files**.

The **Manage certificates** page is the main page for managing the certificates in your SharePoint farm. From here you'll have full access to all of the certificate management functionality, including creating new certificates, renewing existing certificates, viewing certificates, importing and exporting certificates, and so on. You'll be able to filter and sort the list of certificates based on various criteria, such as certificate store and expiration date.

The **Configure certificate management settings** page lets you configure various settings, such as your default organization information and certificate health analyzer rule notification thresholds.

The **View certificate files** page lists the Certificate Signing Request files and certificate export files generated by SharePoint. This makes it easy to retrieve these files even if you're accessing the Central Administration site remotely and don't have direct connectivity to the file shares that SharePoint would have access to.
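For the PowerShell route mentioned above, SharePoint Server Subscription Edition ships dedicated certificate cmdlets. The following is a hedged sketch, not part of the original article; it assumes the `Get-SPCertificate` and `Export-SPCertificate` cmdlets and their documented parameters, and the certificate display name and export path are hypothetical placeholders (verify the parameter names against your SharePoint version):

```powershell
# List the certificates that SharePoint manages, with store and expiry date.
Get-SPCertificate | Format-Table DisplayName, StoreType, NotAfter

# Export one certificate (public portion only) to a file for inspection or
# import elsewhere. "Team Sites Certificate" and the path are placeholders.
$cert = Get-SPCertificate -DisplayName "Team Sites Certificate"
Export-SPCertificate -Identity $cert -Type Cert -Path "C:\certs\teamsites.cer"
```

Exported files also appear on the **View certificate files** page described above, which is useful when you don't have direct file-share access to the server.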
---
title: "Stage 8: Assign a category page and a catalog item page to a term in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 9/17/2016
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 4e9272a2-78f0-4257-a896-26ae55ff0e51
description: "Learn how to assign a category page and a catalog item page to a term in SharePoint Server 2016."
---

# Stage 8: Assign a category page and a catalog item page to a term in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

> [!NOTE]
> Many of the features described in this series are also available for most sites in SharePoint in Microsoft 365.

## Quick overview

In previous stages of this series, we explained:

- How to specify the full site navigation
- How to create a page based on a page layout

In this stage, we'll assign these newly created pages to the terms within the Site Navigation term set. In this article, you'll learn:

- About managed navigation
- About the category page and the catalog item page
- How to assign a category page and a catalog item page to a term
- About the friendly URL for category pages

## Start stage 8

Before we begin the task of assigning a category page and a catalog item page to a term, let's discuss a bit more about some of the features involved.

### About managed navigation

Managed navigation was introduced in SharePoint Server 2016. This navigation method lets you define and maintain your site navigation by using term sets. One of the benefits of using managed navigation is that it separates the site navigation from the location of your content. By using managed navigation, it's not the location of your content that defines where in the navigation your content will appear, but how you tag your content with terms from a term set.
For example, in earlier versions of SharePoint, if you wanted to add a new page under "About our company," you had to add that page under the "About our company" branch within your content. By using managed navigation, you can add a page to the branch that makes the most sense to you. By tagging that page with a term, and using Search Web Parts, it will appear in the correct place in the navigation.

Another benefit of managed navigation is that it creates friendly URLs. In earlier versions of SharePoint, the URL to a page contained a reference to the Pages library and any folders within that library, for example: `https://www.contoso.com/pages/products/computers/laptops.aspx`. By using managed navigation, URLs are based on the terms in the term set that drives your site navigation, for example: `https://www.contoso.com/computers/laptops`. Stage 3: How to enable a list as a catalog in SharePoint Server explained how terms from the Product Hierarchy term set are used to create a friendly URL.

> [!IMPORTANT]
> Managed navigation is not tied to a publishing method, and can be used both for author-in-place and for cross-site publishing. For more information, see Overview of managed navigation in SharePoint Server.

### About the category page and the catalog item page

When you display information in a catalog format, the layout and structure of the category pages should be consistent across the catalog. For example, in our Contoso scenario, we want the category page for all MP3 players to have the same layout as the category page for all camcorders. Also, regardless of the type of product a visitor views, the catalog item page should be consistent, for example, always displaying an image of a product in the upper-left corner, followed by tables of product specifications. By combining managed navigation with category pages and catalog item pages, you don't have to create several pages for your catalog categories or for your catalog items.
For example, in our Contoso scenario, we'll use only the two pages we created in Stage 7: Upload page layouts and create new pages in a publishing site in SharePoint Server. So, after all that theory, in the next section, we'll show you how you can do this.

## How to assign a category page and a catalog item page to a term

In Stage 7: Upload page layouts and create new pages in a publishing site in SharePoint Server, we created a new category page and a new catalog item page. Now we want to associate these pages with the terms in the term set that drives site navigation.

1. On the Contoso site, go to **Site settings**, and then select **Term store management**.

2. In the **TAXONOMY TERM STORE** section, select a term, for example "Audio," and then select the **TERM-DRIVEN PAGES** tab. In the **Target Page Settings** and **Catalog Item Page Settings** sections, there are four references pointing to two pages: Category-Electronics.aspx and CatalogItem-Electronics.aspx. Remember Stage 5: Connect your publishing site to a catalog in SharePoint Server, when we connected our publishing site to our catalog? In that stage, a category page and a catalog item page were automatically created and added to the Pages library. What we didn't cover in Stage 5 is that references to these pages were added to this term set, as shown in the screenshot above. In the next steps, we'll change these references so they point to our newly created category page and catalog item page.

3. In the **Target Page Settings** section, do the following:

   1. In the **Change target page for this term** section, select **Browse**.

   2. In the **Select an Asset** dialog box, select **Pages**, and then select the category page that you want to apply. In our scenario, this page is ContosoCategoryPage.aspx. By setting this reference, when visitors browse to "Audio" on the Contoso site, the page ContosoCategoryPage.aspx will be used to display information.
   It's important to understand that the visitors won't see the page name ContosoCategoryPage.aspx, but instead a friendly URL. More information about friendly URLs will be provided in About the friendly URL for category pages, later in this article.

   3. In the **Change target page for children of this term** section, repeat steps 3a and 3b. By setting this reference, when visitors browse to a child term of "Audio," for example "Speakers," the page ContosoCategoryPage.aspx is used to display information.

4. In the **Catalog Item Page Settings** section, do the following:

   1. In the **Change Catalog Item Page for this category** section, select **Browse**.

   2. In the **Select an Asset** dialog box, select **Pages**, and then select the catalog item page that you want to apply. In our scenario, this is ContosoCatalogItemPage.aspx. By setting this reference, when visitors browse to an item that is tagged with the term "Audio," the page ContosoCatalogItemPage.aspx will be used to display information.

   3. In the **Change Catalog Item Page for children of this term** section, repeat steps 4a and 4b. By setting this reference, when visitors browse to an item that is tagged with a child term of "Audio," for example "Speakers," the page ContosoCatalogItemPage.aspx will be used to display information.

5. Repeat steps 2 through 4 for all terms to which you want to assign a category page and an item details page. In our scenario, we'll do these steps for all terms within the Site Navigation term set.

After applying the new category page and catalog item page to all terms, you can browse to a category page to verify that the correct page is being used. In our scenario, when we browse to "Audio," there's not much to see. This lack of content is good, because when we created a category page in Stage 7: Upload page layouts and create new pages in a publishing site in SharePoint Server, we created an empty page. To display content, we'll have to add Search Web Parts, which we'll explain in the next article.
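The term-to-page associations set above can also be inspected from PowerShell through the taxonomy object model. This is an illustrative sketch, not part of the original walkthrough: the site URL and term set name come from the Contoso scenario, the term group name is a hypothetical placeholder, and `_Sys_Nav_TargetUrl` is assumed to be the local custom property in which managed navigation stores a term's target page (verify this against your farm before relying on it):

```powershell
# Enumerate the Site Navigation term set and show the target page stored
# on each term. Run this in the SharePoint Management Shell.
$site = Get-SPSite "https://contoso.com"
$session = Get-SPTaxonomySession -Site $site
$termStore = $session.TermStores[0]

# "Site Collection - contoso.com" is a hypothetical group name; adjust to yours.
$group = $termStore.Groups["Site Collection - contoso.com"]
$termSet = $group.TermSets["Site Navigation"]

foreach ($term in $termSet.GetAllTerms()) {
    $target = $term.LocalCustomProperties["_Sys_Nav_TargetUrl"]
    "{0,-20} -> {1}" -f $term.Name, $target
}
```

This gives you a quick way to confirm that every navigation term points at the intended category page after you finish step 5.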
## About the friendly URL for category pages

When you use managed navigation, the friendly URLs that visitors see are composed of the terms from the term set that drives site navigation. To see how friendly URLs are composed, on the Term Store Management Tool page, select a term, for example "Audio," and then select the **TERM-DRIVEN PAGES** tab. The friendly URL appears in the **Configure Friendly URL for this term** section. Similarly, when you select "Car audio," you'll see the friendly URL for this page.

If you want to change a friendly URL, for example from "audio" to "audio players," you should change the actual term itself. That way, the friendly URL and the term that is used to tag your content will remain consistent.

If this concept was confusing, don't worry. We'll explain more about how Search Web Parts work in the next article.

## Next article in this series

- Stage 9: Configure the query in a Content Search Web Part on a category page in SharePoint Server

## See also

**Concepts**

- Overview of managed navigation in SharePoint Server
- Assign a category page and a catalog item page to a term in SharePoint Server

**Other Resources**

- Plan to show catalog content in SharePoint publishing sites
---
ms.date: 11/30/2018
title: "Upgrade SharePoint 2013 to SharePoint 2016 through Workflow Manager"
ms.reviewer:
ms.author: toresing
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
description: "Learn how to upgrade SharePoint 2013 to SharePoint Server 2016 using Workflow Manager."
---

# Upgrade SharePoint 2013 to SharePoint 2016 through Workflow Manager

[!INCLUDEappliesto-2013-2016-xxx-xxx-xxx-md]

## Summary

When you upgrade Microsoft SharePoint 2013 to Microsoft SharePoint 2016, you don't have to create a new Workflow Manager installation. You can use the same installation that was used by the SharePoint 2013 farm in the new SharePoint 2016 farm.

However, you may have to create a new installation of Workflow Manager in certain circumstances, for example, if you want to move Workflow Manager to a different Windows operating system, or if the back-end database server is decommissioned. In these situations, follow the steps in Workflow Manager Disaster Recovery to create the new Workflow Manager installation by using the old databases. Make sure that you use the most recent copy of the Workflow Manager databases.

## Background

When you use SharePoint Server together with Workflow Manager, Workflow Manager keeps a record of the SharePoint sites that have published workflows. Each site is represented in Workflow Manager as a scope. Workflow Manager also stores the workflow definitions, all workflow instances, and their statuses. SharePoint stores the workflow history and workflow task information for SharePoint workflows. When the workflow status page is loaded, SharePoint first makes a call to Workflow Manager to see whether the workflow exists. To do this, it uses the workflow instance ID. Then, SharePoint loads the rest of the workflow information.
If the workflow instance ID is missing in Workflow Manager, or if an error occurs during communication with Workflow Manager, you receive an error message.

## How to upgrade SharePoint 2013 to SharePoint 2016 by using Workflow Manager

### Prerequisites

The following prerequisites must be completed before this upgrade:

1. Install the latest cumulative update for Workflow Manager by using Web Platform Installer (Web PI).

2. Install the latest version of Workflow Manager Client on the SharePoint 2013 servers, and make sure that all workflows are functional.

3. Install the SharePoint Server 2016 farm, and upgrade all service applications and content databases.

4. On all SharePoint Server 2016 farm servers, install the latest version of Workflow Manager Client by using Web PI.

### Register Workflow Manager with SharePoint Server 2016

Use the following steps to register Workflow Manager with SharePoint Server 2016:

1. In the SharePoint 2013 farm, on the Central Administration website, click **Application Management**, click **Manage Service Applications**, and then delete **Workflow Service Application Proxy**.

2. In the SharePoint Server 2016 farm, run the following Microsoft PowerShell cmdlet to pair SharePoint 2016 together with the same Workflow Manager installation (replace the placeholders with your site URL and workflow host URI):

   ```powershell
   Register-SPWorkflowService -SPSite <SiteUrl> -WorkflowHostUri <WorkflowHostUri> -Force
   ```

## Common issues you may experience after the upgrade

### Issue 1: Site URL is changed

If your site URL is changed in SharePoint 2016 but the site ID remains the same, you must republish a workflow from the affected site by using SharePoint Designer.

### Issue 2: Workflows don't start on some sites

If workflows don't start on some sites, republish the workflows from the affected site. Or, run the Refresh Trusted Security Token Services Metadata feed timer job.

### Issue 3: Workflows fail and return the "Can't get app principal permission information" error

Consider the following scenario:

- You have SharePoint 2013 workflows and Workflow Manager configured in your farm.
- You have recently connected sites in the farm to a previously existing instance of Workflow Manager.

In this scenario, workflows that are created after you connect to the Workflow Manager installation finish successfully. However, workflows that are created before you connect to Workflow Manager don't finish. Instead, they get stuck when they try to finish, or they remain in a suspended state. For workflows that remain suspended, you receive an HTTP 500 error. Additionally, the following entry is logged in the ULS log:

`Can't get app principal permission information.`

#### Cause

Workflow Manager already has a scope for the site on which the workflows are running. Because the scope has an incorrect SPAuthenticationRealm value in the ApplicationID field of the scope, no SPAppPrincipal class exists on the SPWeb object that matches the ApplicationID value of the scope. Therefore, the workflows fail and return an error message.

#### Resolution

To resolve this issue, use the following PowerShell commands to register the new SPAppPrincipal object. You do this on the SPWeb object whose ID matches the ApplicationID value that's stored in the scope for the SPWeb object in Workflow Manager.
```powershell
# Variables
$webUrl = "http://sp.contoso.com/sites/teamsite/teamweb"
$oldAuthRealm = "58a2b173-0f88-4bff-935b-bf3778cd0524"   # authentication realm expected by Workflow Manager
$newAuthRealm = "48834d17-d729-471e-b0d0-a0ec83b49de0"   # authentication realm of current farm

# Get the SPWeb and SPSite objects, and the id of the web
$web = Get-SPWeb $webUrl
$site = $web.Site
$clientId = $web.Id

# Create the old and new app principal ids
$oldAppId = "$clientId@$oldAuthRealm"
$newAppId = "$clientId@$newAuthRealm"

# Register the app principal with the old authentication realm
Register-SPAppPrincipal -DisplayName "Old Workflow" -Site $web -NameIdentifier $oldAppId

# Set permissions for the app principal.
# If app-only permissions are used in the old environment, you must also pass the
# -EnableAppOnlyPolicy parameter to the cmdlet for app steps to succeed.
$oldAppPrincipal = Get-SPAppPrincipal -Site $web -NameIdentifier $oldAppId
Set-SPAppPrincipalPermission -Site $web -AppPrincipal $oldAppPrincipal -Scope SiteCollection -Right FullControl
Set-SPAppPrincipalPermission -Site $web -AppPrincipal $oldAppPrincipal -Scope Site -Right FullControl

# List the app principals with the old and new authentication realms in the ids
Get-SPAppPrincipal -Site $web -NameIdentifier $oldAppId | fl
Get-SPAppPrincipal -Site $web -NameIdentifier $newAppId | fl
```

> [!NOTE]
> If the App Principal had App-Only permissions on the SharePoint 2013 site, then you will need to pass -EnableAppOnlyPolicy to the Set-SPAppPrincipalPermission cmdlet as well.

## More information

To get the SPAuthenticationRealm value of ApplicationID that's stored in the scope, follow these steps:

1. Run the following SQL query, where `<SPWeb ID>` is the placeholder for the ID of the SPWeb object:

   ```sql
   SELECT * FROM [WFResourceManagementDB].[dbo].[Scopes] WITH (NOLOCK)
   WHERE Description like '%<SPWeb ID>%'
   ```

2. In the query result, click the value in the SecuritySettings column to open the XML on a separate tab in SQL Server Management Studio.
3. In the XML file, locate the ApplicationId element that contains the value. For example, locate the following element:

   ```xml
   <ApplicationId>SPWeb_object_ID@SPAuthenticationRealm</ApplicationId>
   ```

   > [!NOTE]
   > The GUID that appears before the at sign (@) is the ID of the SPWeb object, and the GUID that appears after the at sign is the SPAuthenticationRealm value.

Alternatively, you can find the SPAuthenticationRealm value in the ULS log, such as in the following example log entries:

```text
11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation Authentication Authorization an3eg Medium Can't get app principal permission information. AppId=i:0i.t|ms.sp.ext|<SPWeb_object_ID>@<SPAuthenticationRealm>

11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation General 8nca Medium Application error when access /sites/teamsite/teamweb/_vti_bin/client.svc, Error=Object reference not set to an instance of an object   at Microsoft.SharePoint.SPAppRequestContext.EnsureTenantPermissions(SPServiceContext serviceContext, Boolean throwIfAppNotExits, Boolean allowFullReset)   at Microsoft.SharePoint.SPAppRequestContext.InitCurrent(HttpContext context)   at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.InitCurrentAppPrincipalToken(HttpContext context)   at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.PostAuthenticateRequestHandler(Object oSender, EventArgs ea)   at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
```
---
title: "Variations overview in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 3/12/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 3f8ea55b-e483-478c-8b35-a0ef4c6890f4
description: "Learn about variations and the benefits and scenarios for using variations to create multilingual sites in SharePoint Server or SharePoint in Microsoft 365."
---

# Variations overview in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-SPO-md]

The variations feature in SharePoint Server and SharePoint in Microsoft 365 syncs content from a source variation site to each target variation site to make the content available to specific audiences on different sites. When users visit the root site, they're redirected to the appropriate variation site, based on the language setting of their web browser. Content on a target variation site can be translated into other languages before it's published. You can use variations only on sites that are created by using one of the Publishing site templates, or on sites for which the SharePoint Server Publishing Infrastructure feature was activated.

This article contains an overview of the variations feature. It describes the elements of the variations feature; provides an overview of site, list, and page creation for variation sites; lists some limitations of variations; and describes scenarios for using variations in SharePoint Server. This article doesn't describe the tasks that are involved in planning a solution that uses variations. For info about how to plan to use variations in your solution, see Plan for variations in SharePoint Server. This article also doesn't describe how to create variation labels and hierarchies. For info about how to create a variation site, see Create a multi-language website.
> [!IMPORTANT]
> - The variations feature will remain supported but is deprecated for the SharePoint Server 2019 release. For more info, see What's deprecated or removed from SharePoint Server 2019.
> - The Machine Translation Services for the variations feature will no longer be supported as of the end of July 2022. All existing instances of variations using the Machine Translation Service APIs will no longer display multilingual functionality in your Microsoft 365 environment.
> - If you're using modern communication sites in SharePoint, we recommend that you use the modern multilingual sites and pages feature instead of the variations feature.

## Use and benefits of variations

Many organizations have a global reach. However, even in domestic markets, organizations have to reach a diverse user base speaking many languages or having specific information based on regional differences. These types of organizations need websites that deliver customized content to suit different cultures, different markets, and different geographic regions. Producing and maintaining different versions of a site can be difficult and time-consuming. By using the variations feature as part of a SharePoint Server 2013 solution, site architects and site administrators can simplify the process of producing and maintaining these sites. The variations feature automates the creation, management, synchronization, and translation of sites, lists, and pages, thereby eliminating the need to manually create a site and all associated lists and pages for each instance of a needed variation.

## Scenarios for using variations

You can use variations to create sites, lists, and page content for specific languages. In this scenario, most of the content is authored in the language of the source variation site and synced to some or all of the target variation sites for translation into different languages.
For example, the content might be authored in English and be synced to target variation sites for translation into German, French, and Spanish.

You can also use variations to create content for specific locales. For example, a company based in North America might have target variation sites for the following locales: English (United States), English (Canada), French (Canada), and Spanish (Mexico). Most of the content is authored in English (United States), and the variations feature syncs that content to the target variation sites. Content on the French (Canada) and Spanish (Mexico) sites is translated into French and Spanish, whereas content for English (Canada) is edited to account for regional differences in United States and Canadian English. Other content that is unique to a specific locale is created on the target variation sites for which it's needed.

In SharePoint Server 2010, you could use variations to create sites for different mobile devices, or sites that used different branding. In SharePoint Server 2016, variations are used only for multilingual sites. To create sites for different mobile devices, use Device Channels. To create sites that use different branding, use cross-site publishing. For more information, see Plan for cross-site publishing in SharePoint Server.

## Elements of variations

The variations feature consists of the following elements:

**Variation root site**: The variation root site provides the URL for all source and target variation sites and contains the landing page that redirects users to the correct variation site. This site isn't the same as the root site of a site collection, although you can specify the root site of a site collection to also be the root site of the variations hierarchy.

**Variation labels**: A variation label is an identifier that names a new variation site. Variations of a site are defined by creating variation labels, one for each planned variation.

> [!NOTE]
> SharePoint Server supports up to 209 variation labels.
SharePoint in Microsoft 365 supports up to 50 variation labels.

- **Variation sites** The sites that are created based on the defined variation labels. There are two types of variation sites:

  - **Source variation site** The site where shared content is authored and published, and the site from which the shared content is synced with target variation sites. There can be only one source variation site in a site collection. After a source variation site is selected, it can't be changed.

  - **Target variation sites** The sites that receive most of their content from the source variation site. New content can be created on a target variation site. However, that content isn't synced with other sites and is unique to the site on which it was created.

- **Variations hierarchy** The complete set of sites in all variation labels.

- **Variation lists** Lists for which you specify target variation labels to receive list items.

- **Variation pages** The publishing pages that are stored in the Pages library of the source variation site and the target variation sites.

> [!IMPORTANT]
> We recommend that you do not add nonpublishing pages to the Pages library of a site that uses variations. If you do, the Variations Create Hierarchies Job Definition timer job might fail.

## Understanding variations

The variations feature creates sites and syncs content and supported list items from a source variation site to one or more target variation sites. By default, the variations feature syncs publishing pages from the Pages library of the source variation site, plus any lists that are configured to be synced to specific target variation sites.

By default, when users visit the root site, they're redirected to the appropriate variation site based on the language setting of their web browser.
For example, if a user's default browser language is French, SharePoint Server redirects that user to the French variation site. You can customize this behavior by replacing the default redirection page, VariationRoot.aspx, with a different page that implements logic to identify the user's preferred language. For info about how to customize variation site redirection, see How to: Customize the Variation Root Landing Logic.

### Variation labels

A variation label is an identifier that names a variation site. You select one variation label as the source, which represents the source variation site. The remaining variation labels are the target labels, representing the target variation sites to which content is synced. You create variation sites from variation labels by using the **Create Hierarchies** command on the Variation Labels page.

Only one set of variation labels, the variations hierarchy, can be defined for a site collection. The corresponding variation sites can be created anywhere within the site collection hierarchy. The source variation site and the target variation sites are always created as subsites of the variation root site. Users who visit the variation root site are redirected to the appropriate variation site.

The following illustration provides an example of a variations site hierarchy and shows how publishing content is synced to target variation sites. Three variation labels, "EN," "FR," and "DE," are created on the root site https://contoso.com. When the variations hierarchy is created, the corresponding variation sites, labeled "EN," "FR," and "DE," are created one level below the variation root site. Because site https://contoso.com/EN is specified as the source variation site, lists and pages that are authored and published on https://contoso.com/EN are synced to the target variation sites, https://contoso.com/FR and https://contoso.com/DE.

When you create a variation label, you select a locale for it to use.
The locale setting assists with browser redirection and regional settings such as sort order and calendar. It doesn't affect the language of the user interface. If language packs were installed on the front-end web server, you can also select a language for the variation site. The language setting in SharePoint Server determines the language of the user interface on the variation site. If no language packs were installed, the option to select a language isn't available, and the variation site uses the default language of the SharePoint Server installation on the server, regardless of the locale that is selected for the variation label. For example, if SharePoint Server was installed by using the English version, and no language packs were installed, when a new variation label is created for the Japanese locale, the user interface for the new target variation site is in English, not Japanese.

If you want the user interface of a target variation site to be displayed in a specific language, install the language pack for each language before you create the variation sites. If a language pack isn't available when a target variation site is created, the target variation site can still be created, and users can change the alternate language for the site by using the multilingual user interface. For information about the multilingual user interface, see Plan for multilingual sites in SharePoint Server. For info about how to install language packs, see Install or uninstall language packs for SharePoint Server 2016.

When you create a variations hierarchy, a navigation term set is created for each variation label. By default, the term set for the source variation label is named Variations Navigation. The term set for a target variation label is named Variations Navigation (LabelName). For example, if you have a target label named en-ca, the term set for that label is named Variations Navigation (en-ca).
By default, when the variations feature creates a target page for the first time, a corresponding navigation term is also created on the target variation site. When you export a page for translation, its associated navigation term is also exported.

### Variation settings

The Variations Settings page contains the following options:

- **Site, List, and Page Creation Behavior** Determines whether sites, lists, and pages on the source variation site are created automatically on the target variation sites. By default, **Create Everywhere** is enabled. If you enable **Create Selectively**, the first time that you sync sites, lists, and pages from the source variation site to target variation sites, you must do so manually. Subsequent updates to items on the source variation site are synced based on the target label sync preferences.

- **Recreate Deleted Target Page** Determines whether a page should be re-created on a target variation site if the page was deleted from the target variation site and the page on the source variation site was republished. By default, this option is enabled. If you disable this option, deleted pages aren't re-created on target variation sites. For example, consider the case in which a content author creates a page on the source variation site that isn't relevant to a target variation site. Because **Create Everywhere** is enabled, the page is created automatically on the target variation site, and the target label content owner later deletes the unwanted target page. The next time that the content author updates the source page, the page won't be re-created on the target variation site.

- **Update Target Page Web Parts** Determines whether changes that were made to Web Parts on pages on a source variation site are also made on pages on target variation sites. By default, this option is enabled.
- **Notification** Sends an email message to the contact of the target label of a target variation site when a new page or site is created, or to the contact person of the specified page when a page is updated with revisions from the source variation site. If the label doesn't have a contact, the email message is sent to the contact of the welcome page of the target variation site. By default, this option is enabled.

For info about how to specify variations settings, see Create a multi-language website.

### Variations timer jobs

The variations feature uses timer jobs to perform tasks such as creating and propagating sites and pages. A timer job runs inside OWSTIMER, a Windows service for SharePoint Server. Each timer job has its own default schedule for when the job runs. You can change the frequency with which each job runs on the Job Definitions page of the Central Administration website. The variations feature uses the following timer jobs:

- **Variations Create Hierarchies Job Definition** Creates a complete variations hierarchy by creating all variation sites, lists, and pages from the source variation site, based on the variation labels. By default, this timer job runs hourly.

- **Variations Propagate List Items Job Definition** Creates and updates list items on target variation sites after a list is configured to send items to specific target variation labels. By default, this timer job runs every 15 minutes.

- **Variations Propagate Page Job Definition** Creates and updates pages on target variation sites after a page on the source variation site is approved, or after it's manually submitted by a user. By default, this timer job runs every 15 minutes.

- **Variations Propagate Sites and Lists Job Definition** Creates variation sites and lists when the **Create Everywhere** option is enabled. By default, this timer job runs every 30 minutes.

> [!NOTE]
> Timer jobs aren't configurable in SharePoint in Microsoft 365.
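Because synced content waits for the next run of the relevant timer job, the default intervals above bound how long a target site can lag behind the source. The sketch below captures that worst-case wait; it's an illustration of the default schedules listed above, not an API of SharePoint itself:

```python
# Default run intervals, in minutes, for the variations timer jobs
# (taken from the default schedules described above).
DEFAULT_INTERVALS_MIN = {
    "Variations Create Hierarchies Job Definition": 60,
    "Variations Propagate List Items Job Definition": 15,
    "Variations Propagate Page Job Definition": 15,
    "Variations Propagate Sites and Lists Job Definition": 30,
}

def worst_case_wait_minutes(job_name):
    """A change is picked up no later than one full run interval after it
    becomes eligible (approved, published, or configured), assuming the
    job completes quickly relative to its interval."""
    return DEFAULT_INTERVALS_MIN[job_name]

# A page approved just after a propagation run waits almost 15 minutes
# before the next Variations Propagate Page Job Definition run syncs it.
```

If these defaults are too slow or too aggressive for your farm, adjust the schedules on the Job Definitions page in Central Administration (on-premises only).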
For information about timer jobs, see View timer job status in SharePoint Server 2016.

## Understanding source variation and target variation site creation

Source variation and target variation sites are always created one level below the variation root site. Each variation site is created by using the same site template that's used to create the variation root site. Because the same template is used, by default each variation site uses the same master page as the variation root site. However, each variation site can use separate master pages, page layouts, and CSS files. This is useful when you want separate layouts for different locales. For example, you can use a right-to-left layout for one language and a left-to-right layout for another language. For more info, see Overview of the SharePoint 2013 page model.

When the variations hierarchy is first created, only sites that are based on the list of defined variation labels are created. If the variation root site has sites below it in a hierarchical site structure, and you want to include those sites in the hierarchical site structure of each variation site, you must manually create the hierarchical structure of those sites below the source variation site after you create the variations hierarchy. By default, the next time that the Variations Create Hierarchies Job Definition timer job runs, the sites are synced only to any new target variation sites that are created at that time. For info about how sites below the source variation site are created on existing target variation sites, see Understanding site, list, and page creation later in this article.

After the variations hierarchy is first created, when you add a new label to the variations hierarchy, select **Create Hierarchies** on the Variation Labels page, and a new target variation site is created for each new label.
By default, if the source variation site has content in the Pages library, has a list that is configured to send list items to specific target variation labels, or contains sites below it in the site hierarchy, those pages, lists, and sites are created on the new target variation sites only.

## Understanding site, list, and page creation

By default, the following components are synced automatically to the target variation sites:

- Sites that are created below the source variation site

- Lists and pages that are published on the source variation site or on any sites below it in the site hierarchy

The following list types (or lists that inherit from these types) are supported:

- 100 - Generic list

- 101 - Document library

- 104 - Announcements list

- 109 - Picture library

If **Create Selectively** is enabled, you must manually create any sites, lists, and pages on the selected target variation sites. This section describes the ways in which sites, lists, and pages are created on target variation sites.

### Site creation

When the Variations Create Hierarchies Job Definition timer job runs for the first time and creates the variations hierarchy from the list of variation labels, only the source variation and target variation sites are created. After the source variation site is created, you can create sites below it in the site hierarchy, and those sites are then created on the existing target variation sites the next time that the Variations Propagate Sites and Lists Job Definition timer job runs.

If **Create Selectively** is enabled, use the Site Variation Settings page on any site that is below the source variation site to manually create a target variation of the current site on one or more target variation sites. The new site is created on the specified target variation sites the next time that the Variations Propagate Sites and Lists Job Definition timer job runs. You can create a target variation of the current site anytime after **Create Selectively** is enabled.
> [!NOTE]
> When source variation and target variation sites are created, they're created by using the default site definition provided by the template that was selected when the source label was created. No custom site configurations or settings are synced to the new sites. If you want the source variation and target variation sites to have custom site configurations or settings, such as navigation customizations, you must make those changes on each site after you create the variations hierarchy.

### List and page creation

List items sync with target variation sites only when the list on the source variation site is configured to specify the target variation sites to which the items should be synced. By default, a list syncs with specific target variation sites only after it's configured to do so, and when the next Variations Propagate Sites and Lists Job Definition timer job runs. If a new item is added to a list that has already been synced to target variation sites, it's synced when the next Variations Propagate List Items Job Definition timer job runs. If a new target variation label is added after the variations hierarchy is created, the list is created on the new target variation site.

By default, content approval is enabled on target lists. When a new item is synced to a target list, it must be approved before it will appear in a Content Query Web Part on the target variation site.

> [!NOTE]
> Although you can specify individual pages that you want to sync to specific target labels, you can't sync individual list items. You can only specify a complete list to sync to specific target labels.

If the Publishing Site template was selected when the source variation site was created, pages on the source variation site, or on any site below it in the site hierarchy, must be published before they're eligible to be synced to target variation sites.
If the Publishing Site with Workflow template was selected, pages must be approved for publication by using the publishing workflow before they're eligible to be synced to target variation sites. By default, after a new page is published (or approved for publication, if workflows are used), it's synced to all target variation sites when the next Variations Propagate Page Job Definition timer job runs. If the page was published previously, and it's changed and republished on the source variation site, and the **Automatically update target variation pages** setting is selected for the target labels, the page is synced to all target variation sites when the next Variations Propagate Page Job Definition timer job runs.

> [!NOTE]
> On target variation sites, a page that is synced from the source variation site is always assigned a minor version number. If the page is new to the target site, it's assigned version 0.1. If the page already exists on the target variation site, the synced page is assigned the next available minor version number. For example, if a target variation site has version 2.1 of a page and a new variation of that page is synced to the target site, the page becomes version 2.2.

Pages, and additional resources such as images, that are approved for publishing on the source variation site are synced to the target variation sites with their Approval status set to Draft, and they must be approved before they can be viewed by readers of the site.

If **Create Selectively** is enabled, a user must create the page for a specific variation label by using the **Create new targets** command in the **Variations** group on the **Publish** tab of the page on the source variation site. The new page is synced to one or more target variation sites when the next Variations Propagate Page Job Definition timer job runs.
If the page was published previously, and it's changed and republished on the source variation site, it's synced only to the specified target variation sites when the next Variations Propagate Page Job Definition timer job runs. For info about how to enable **Create Selectively** for variation pages, see Create a multi-language website.

By default, when a page that was synced from the source variation site is deleted from a target variation site, that page is re-created on the target variation site the next time that it's published on the source variation site and the Variations Propagate Page Job Definition timer job runs. If **Recreate Deleted Target Page** is disabled, deleted pages aren't re-created on the target variation sites. For info about how to create variation source pages and how to work with content on variation target pages, see Create a multi-language website.

## Limitations of variations

The following list contains info about the limitations of the variations feature in SharePoint Server:

- **The variations feature is a single-tier hierarchy.** The source and target variation sites exist at the same level within the site hierarchy, one level down from the variations root site. However, you can have only one source variation site per site collection. A site can't be both a source and a target site. You can sync content from a source variation site to one or more target variation sites, but you can't sync content from one target variation site to another target variation site. For example, if you have a source variation site in English (United States), and a target variation site in French (France) that has a French (Canada) site below it, the variations feature only syncs content from the English (United States) source variation site to the French (France) target variation site. The variations feature can't also sync content from the French (France) target variation site to the French (Canada) site below it.
  You can use variations together with cross-site publishing to reuse content from one variation site in the context of another variation site. For example, you could reuse content from the French (France) site on the French (Canada) site. For more info, see Plan the logical architecture for cross-site publishing in SharePoint Server.

- **Content syncing is unidirectional.** The variations feature syncs content from a source variation site to one or more target variation sites. You can't use the variations feature to sync content from a target variation site back to a source variation site. Also, target variation sites can't sync content to other target variation sites.

## See also

#### Concepts

Plan for variations in SharePoint Server
---
title: "Estimate capacity and performance for Web Content Management (SharePoint Server 2013)"
ms.reviewer: 
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 8/25/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: overview
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: d819b0d0-aa83-4e40-82d8-7a32195cc669
description: "Learn how to determine the number and types of computers that you need to publish content and manage web content in SharePoint Server."
---

# Estimate capacity and performance for Web Content Management (SharePoint Server 2013)

[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md]]

Enterprises often use SharePoint Server 2013 to publish content that anonymous users access on an Internet site or that authenticated users access on an intranet site. This article contains capacity and performance data to help you plan the number and types of computers that are required to publish content and manage web content in SharePoint Server 2013.

SharePoint publishing includes different types of publishing sites and associated methods that are available for each site. The publishing features of SharePoint Server 2013 are intended to help create branded Internet, intranet, and extranet sites. For more information about SharePoint Server 2013 publishing, see Overview of publishing to Internet, intranet, and extranet sites in SharePoint Server.

## Introduction

This article discusses the following scenarios:

- **Internet presence site** Provides information to customers, partners, investors, and potential employees. This type of site lets anonymous Internet users find information about a corporation. Typically, these sites are branded, and the company tightly controls the content.

- **Internet business site** Promotes products and services that a company offers to customers. These sites can show a catalog of the products that the company offers.
- **Intranet site** A company publishes this site internally inside the organization. These sites share information with authenticated users; companies either tightly manage the site to restrict access or open it to all internal users.

- **Extranet site** Provides access to targeted content for remote employees, partners, and customers. These sites can provide access to knowledge bases that use authored content tagged with metadata to categorize articles. Users can search or browse for specific information, such as troubleshooting and support articles.

Cross-Site Collection Publishing and the Content Search Web Part enable content reuse across site collections in these scenarios. These features and functionality affect how you plan for capacity. For more information, see Overview of cross-site publishing in SharePoint Server.

> [!NOTE]
> Cross-Site Collection Publishing is known as cross-site publishing in this article.

Managed navigation in SharePoint Server 2013 provides taxonomy-driven navigation for a publishing site. For more information, see Overview of managed navigation in SharePoint Server.

The capacity and performance data in this article contain two parts. The first part covers the new cross-site publishing method and managed navigation. The second part uses the author-in-place model.

> [!NOTE]
> The scenarios that are addressed in this article can be achieved by both cross-site publishing and author-in-place sites. The cross-site publishing and managed navigation features do not depend on one another and can be used independently.

The following two key metrics are addressed in the models that are used in this article:

- **Throughput** The number of page views per second that the site can sustain.

- **Server Response Time** The time that is required for the server to process a request, which affects the time it takes for a user to view the page. The server response times that we provide in this document are the 95th and 50th percentile values.
These values mean that 95 percent and 50 percent of requests, respectively, are faster than the value that is provided. We measure these values by using the "Duration" recorded in the SharePoint Usage database for a given request.

- **Content freshness** The time that is required for an updated item to be reflected in search results. This is a good metric to consider when you work with cross-site publishing scenarios.

The scenarios in this article use the following two states:

- **Green Zone** The servers are under 60 percent utilization. This should be the target for most of the time that the servers are running.

- **Red Zone** The servers are close to full utilization. This can be considered a state where the SharePoint site is under more load than usual. In the Red Zone, the server response time values start to increase as the server tries to meet the demand of incoming requests.

## Prerequisite information

Before you read this article, make sure that you understand the key concepts behind SharePoint Server 2013 capacity management. The following articles describe the recommended approach to capacity management and provide context to help you make effective use of the information in this article.

Note that some other new features that affect publishing scenarios aren't covered in this article, including device channels, SEO optimization, display templates, and query rules. Additionally, the functionality and configuration of a cross-site publishing site isn't described in detail in this article. For more information, see Plan for cross-site publishing in SharePoint Server and Configure web content management solutions in SharePoint Server.

For more information about capacity and performance to help you understand the data in this article, see Performance planning in SharePoint Server 2013.
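Before looking at the test data, it may help to make the percentile metric concrete. The sketch below computes nearest-rank 50th and 95th percentiles from a set of request durations; the sample values are hypothetical, not measurements from the test lab:

```python
import math

def percentile(durations_ms, pct):
    """Nearest-rank percentile: the smallest recorded duration such that
    at least pct percent of requests were that fast or faster."""
    ranked = sorted(durations_ms)
    rank = max(1, math.ceil(pct / 100 * len(ranked)))  # 1-indexed rank
    return ranked[rank - 1]

# Hypothetical "Duration" values (ms), as recorded per request in the
# SharePoint Usage database.
samples = [120, 180, 200, 210, 219, 230, 250, 280, 310, 412]

p50 = percentile(samples, 50)  # half of the requests are faster than this
p95 = percentile(samples, 95)  # 95 percent of the requests are faster
```

In practice you would feed this the Duration column for a time window of interest and compare the results against your Green Zone and Red Zone targets.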
## Cross-site publishing using managed navigation

This section provides our test data for two areas: cross-site publishing with anonymous users and author-in-place publishing.

### Cross-site publishing with anonymous users

The test results in this section are based on a basic cross-site publishing site model and provide capacity planning guidance. When you plan a SharePoint deployment in which anonymous users access a web site, use this guidance and adjust your deployment specifications accordingly.

The test case uses the cross-site publishing feature. In this scenario, content resides in multiple site collections that are marked as catalogs and is crawled by the SharePoint Search service application. Web Parts that use search technology, for example the Content Search Web Part and the Catalog-Item Reuse Web Part, display the content on pages in another site. For more information, see Overview of cross-site publishing in SharePoint Server.

We used the following characteristics in the model site that we built to test cross-site publishing:

- A publishing web site that has approximately 5 million pages or items. The items are associated with about 1,000 categories.

- The content is located in other site collections in one or more catalogs.

- The web site uses managed navigation that is linked to the categories that the items are associated with.

- For the baseline deployment topology described later in this list, the web site receives up to 80 page views per second on average. Peak periods reach up to 100 page views per second. To scale this throughput number up, add computers to the topology. To scale this throughput number down, remove computers from the topology.

- The Search crawler runs continuous crawls at a 1-minute interval, with five updates per second to the catalog.

- The web site has the following page and traffic patterns:

  - The home page has three Content Search Web Parts and a Refinement Panel Web Part, and receives 15 percent of the traffic.
  - Category pages have three Content Search Web Parts, one Taxonomy Refinement Panel Web Part, and one Refinement Panel Web Part, and receive 45 percent of the traffic.

  - Catalog item pages have a Catalog-Item Reuse Web Part and two Content Search Web Parts, and receive 40 percent of the traffic.

- Each Content Search and Catalog-Item Reuse Web Part issues a synchronous query. The catalog item pages don't use the Anonymous Search Results Cache because they receive a small amount of traffic.

- The farm has the binary large object (BLOB) cache turned on for the computers that run as front-end web servers.

The server topology that we used to test this scenario is shown in the following diagram:

**Figure 1: Test lab server topology**

- One computer that hosts SQL Server with all of the databases that SharePoint uses

- One computer that hosts SharePoint service applications, the Distributed Cache service, search analytics processing, and search administration roles

- One computer that hosts the search crawler and content processing (CPC) roles

- Three computers that host search index nodes with query processing and serve as front-end web servers

> [!NOTE]
> The computers in this test are physical computers that run Windows Server 2008 R2. Refer to Search capacity planning and Capacity planning for SharePoint Server 2013 for recommendations about how to use virtual machines and Windows Server 2012.

> [!IMPORTANT]
> The configuration of our test lab topology is optimized for search-driven publishing scenarios. This configuration is different from collaboration types of SharePoint deployments. For example, our configuration uses the front-end web servers as search index servers to get the best performance.
>
> In our test lab topology, we learned that the computer that hosts the application server was underutilized. As a result, we put the Distributed Cache service on this application server instead of on a dedicated server.
You may decide to host the Distributed Cache service on a dedicated server in your environment. For best performance, we don't recommend that you host the Distributed Cache service on a front-end web server that has the search index server role.

### Test lab reports

We used the topology in Figure 1 for our test lab, with physical computers and a Visual Studio Team System (VSTS) load test. For more information, see Visual Studio Team System. Technical specifications for the test computers are in the following tables.

> [!NOTE]
> We did not use browser caching or dependent requests, such as images or JavaScript files, in our VSTS tests. Depending on how you customize your publishing site, the number of dependent requests that occur can vary considerably.
>
> The pages that we used in our tests made almost 50 requests for page load time 1 (PLT1) request types (empty browser cache) and about 3 requests for PLT2 request types (subsequent requests with results from the browser cache). Usually, the SharePoint BLOB cache serves requests for these items, and they will not alter our performance numbers significantly.
|Server Components|Servers running SharePoint Server|
|:-----|:-----|
|Processors|Intel Xeon CPUs @2.27GHz (2 processors, 8 cores total, 16 threads total)|
|RAM|24 GB|
|Operating system|Windows Server 2008 R2 Enterprise SP1, 64-bit|
|Size of the SharePoint drive|200 GB on internal disk|
|Storage used for Search Index|78 GB on an external disk array (2 x Dell PERC H700 SCSI)|
|Number of network adapters|2|
|Network adapter speed|1 gigabit|
|Authentication|None - Anonymous|
|Software version|SharePoint Server 2013|

|Server Components|Database servers|
|:-----|:-----|
|Processors|Intel Xeon CPUs L5520 @2.27GHz (2 processors, 8 cores total, 16 threads total)|
|RAM|24 GB|
|Operating system|Windows Server 2008 R2 Enterprise SP1, 64-bit|
|Disk Array|2 x Dell H700 SCSI|
|Number of network adapters|2|
|Network adapter speed|1 gigabit|
|Authentication|NTLM|
|Software version|Microsoft SQL Server 2008 R2 SP1|

Results from a 10-minute run are as follows:

|Test Features|Green Zone|Red Zone|
|:-----|:-----|:-----|
|Number of VSTS users (simulating concurrent users)|60|100|
|Server Response Time, 50th percentile|219 ms|302 ms|
|Server Response Time, 95th percentile|412 ms|635 ms|
|Page views per second|78|98|

Because this is a cross-site publishing scenario that displays content from the search index, it's interesting to examine the number of queries that are served by the servers that host search queries, and the number of queries that are served by the Anonymous Results Cache. In this model, the Anonymous Results Cache serves about 60 percent of the queries. The Anonymous Results Cache is discussed later in this article.
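Two quick consistency checks help when reading the query numbers reported in this section: the share of queries answered by the Anonymous Results Cache, and the total query rate implied by the page mix (each Content Search or Catalog-Item Reuse Web Part issues one synchronous query, so every page type in this model issues three queries per view). This is an illustrative sketch against the published figures, not part of the test harness:

```python
def cache_share(total_qps, cached_qps):
    """Fraction of queries answered by the Anonymous Results Cache."""
    return cached_qps / total_qps

# Green Zone: 145 of 235 queries/sec come from the cache (~62 percent),
# consistent with the "about 60 percent" figure quoted above.
green_share = cache_share(235, 145)

# Page mix from the model site; each page type issues 3 synchronous
# queries (3 Content Search; or 1 Catalog-Item Reuse + 2 Content Search).
traffic_share = {"home": 0.15, "category": 0.45, "item": 0.40}
queries_per_page = {"home": 3, "category": 3, "item": 3}

def expected_qps(page_views_per_sec):
    """Expected search queries per second implied by the page mix."""
    per_view = sum(traffic_share[p] * queries_per_page[p] for p in traffic_share)
    return page_views_per_sec * per_view

# 78 page views/sec * 3 queries/page is roughly 234 queries/sec, close to
# the 235 total queries per second measured in the Green Zone.
```

The same arithmetic applies to your own page designs: count the search-driven Web Parts per page type, weight by traffic share, and multiply by target page views per second to estimate the query load on the search tier.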
| Test Features | Green Zone | Red Zone |
|---|---|---|
| Total queries per second | 235 | 294 |
| Queries served from Anonymous Results Cache | 145 | 182 |
| Queries served from Search | 90 | 112 |

The values for the average CPU and peak memory usage for these computers while the tests were running are as follows:

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Average CPU (Search index nodes per front-end web server) | 59% | 80% |
| Average CPU (application server including Distributed Cache) | 8% | 9% |
| Average CPU (Search CPC nodes) | 5% | 5% |
| Average CPU (SQL Server) | Not measured | Not measured |
| Peak memory usage (Search index nodes per front-end web server) | 7.5 GB | 7.5 GB |
| Peak memory usage (application server including Distributed Cache) | 10.1 GB | 10 GB |
| Peak memory usage (Search CPC nodes) | 6.5 GB | 6.5 GB |

Note that memory usage may differ somewhat because various timer jobs run on the server during normal usage. We found that the index/front-end web server nodes were using as much as 12 GB of memory after a two-week test run with a sustained load.

## How Search Web Parts display content on cross-site publishing pages

If a publishing page contains a Search Web Part, such as the Content Search Web Part, the browser starts to process the page before the search query is complete. This improves the perceived latency of the page. After the search query finishes, the complete results of the query are sent to the browser, and the connection to the browser is closed. Users might think that the search results are loaded asynchronously; however, the queries are still issued from the server while the page is being requested. Note that there is a separate asynchronous mode for the Content Search Web Part, where the queries are issued from the browser after a page is loaded.

## Effect of load changes on your cross-site publishing site

We varied the number of VSTS users (similar to the number of concurrent users who access the site) in the load test.
The following graph shows that the server response time increases as load increases, with some incremental increase in the number of pages served per second. We recommend keeping server response times under 750 ms to make sure that users have a responsive experience with the SharePoint deployment.

Figure 2: Chart showing throughput and server response times with different loads

## Scaling out your cross-site publishing site

If the SharePoint deployment is expected to receive more or less traffic compared to the baseline case described earlier, you may want to change the number of computers running the Index and front-end web server role on the farm to accommodate that traffic. The following graph shows the results for scaling out the same cross-site publishing site under different load patterns with a varying number of computers used as front-end web servers with Index nodes, starting with a single computer in that role and going up to six computers:

Figure 3: Scaling out your deployment

In each configuration, we adjusted the load so that server response times were at similar values compared to the baseline in the previous section. Note that as the number of computers increases, the complexity of the topology starts to overtake the gains: each additional computer adds less throughput than the computers that are already in the environment. These numbers are provided to show the pattern for scaling out; actual performance will vary depending on how the SharePoint deployment is built.

## Guidelines to plan your site

The majority of our performance testing used the deployment that was described in the earlier sections. The guidelines in the following list are meant to help you make correct capacity planning decisions when your deployment differs from the ones we used in our test lab.

- More items in the search index generally mean higher latency.
- Each index partition can contain up to 10 million items. Typical websites rarely have more than 10 million items to show, so they need only one partition, as in the topology we described earlier. You can use more index partitions either to host more than 10 million items or to have more, smaller, and faster index partitions. If you plan to use multiple index partitions, refer to Scale search for Internet sites in SharePoint Server to size your search topology correctly.
- Each control or Web Part that you add to a page (or page layout) adds some overhead to the server response time for the page.
- Avoid using more than five synchronous Content Search Web Parts or Catalog-Item Reuse Web Parts on a page. While processing a request for a page, SharePoint Server 2013 executes as many as five queries in parallel and returns the results. If there are more than five queries on a page, SharePoint Server 2013 executes the first five queries before it starts to execute the next set of five. If pages require more than five Content Search Web Parts or Catalog-Item Reuse Web Parts, you might run the additional Content Search Web Parts in asynchronous mode or use query rules and result blocks.
- Content Search Web Parts and Catalog-Item Reuse Web Parts have an asynchronous mode in which the query that is associated with the Web Part is executed after the browser loads the page. Use this mode for slow queries so that the rest of the page appears faster for users. Otherwise, we recommend synchronous queries for the best page load times.
- A Refinement Panel Web Part that has many refiners increases the time to process a query. You can change the number of refiners to show for a managed property. For more information, see Configure refiners and faceted navigation in SharePoint Server.
- If you use the Taxonomy Refinement Panel Web Part when you have a deep hierarchy of navigation nodes, the time to process a query increases.
- We do not recommend the use of the Taxonomy Refinement Panel Web Part on a page that has more than 200 navigation nodes under it. A large number of navigation nodes may cause slow server response times and decrease throughput.

If you must design a SharePoint deployment for high availability, you must add the following:

- An additional computer that runs the service applications and Distributed Cache role in case the existing computer is not available
- Additional computers to sustain the load if one or more of the front-end web server computers with Index nodes are not available
- An additional computer in the CPC role to make sure updates are still reflected in your site when the computer that has the CPC role is not available
- A SQL Server topology that continues to serve database queries if one of the database servers is not available

## Search crawl speed and content freshness

In our testing, we also tested the process that updates the catalog content that was being published, and we observed the amount of time that elapsed before an updated item appeared in the publishing site. In our experiments, we made five updates per second to the catalog and set the continuous crawls on the catalog to a one-minute interval. We observed that the average time for changes to appear in the publishing site was about two minutes; the minimum time was just under a minute and the maximum time was three minutes. We did not see a significant change in these numbers when we increased the number of computers running the CPC role.

For a full crawl of the catalog, however, an increase in the number of computers running the CPC role increased the number of items processed per second. The following graph shows the relationship between items processed per second and the number of computers in the farm with the CPC role. Note that this test data was obtained from a SharePoint deployment other than the one used in the baseline tests.
The findings should still apply to other SharePoint deployments: adding more CPC nodes results in improved full crawl times.

Figure 4: Effect of content processing (CPC) computers on a full crawl

Therefore, if you require faster full crawls for your catalogs, you can increase the number of computers that run the CPC role in your deployment.

## Load on the Managed Metadata service application

Our testing shows that publishing scenarios involving sites that use managed navigation do not place significant memory or CPU requirements on the Managed Metadata service application. For a deployment such as the one described earlier, the Managed Metadata service application can run on a computer that is running other SharePoint service applications.

The managed navigation feature makes one connection to the service application when it receives the first request for a site. Subsequent requests use values that the front-end web servers cache. Therefore, there is no load on the Managed Metadata service application while front-end web servers fulfill requests.

## Anonymous Search Results Cache

The Anonymous Search Results Cache stores the results of a query, refinement data for the query, and additional result tables that are returned, using the SharePoint Distributed Cache Service. Each cache entry depends on the parameters of a query, such as the sort order of the results, the requested refiners, and any dynamic reordering rules. The cache affects all queries that a web application handles, including queries from Search Web Parts and queries from CSOM clients. For more information, see Overview of search architecture in SharePoint Server and Scale search for Internet sites in SharePoint Server. For security reasons, this cache is not used for authenticated queries.

For best results, we recommend that you configure the Distributed Cache Service to run only on the computer that runs the SharePoint service applications.
The Distributed Cache Service should not run on the computers that are in the front-end web server roles.

By default, the Anonymous Search Results Cache refreshes items every 15 minutes. You can use Microsoft PowerShell to configure the cache duration on the web application where the cache is configured:

```powershell
$webapp.Properties["SearchResultsCacheTTL"] = <cache duration>
$webapp.Update()
```

If you want search results to be fresher than the default value, lower the value. Note that this increases the number of queries that the Search service must serve.

We recommend that you always use the cache on publishing pages that receive heavy traffic. Some examples of these types of pages are the site home page and category pages that use Search Web Parts. We do not recommend caching for catalog item pages, because an individual catalog item page is accessed much less frequently than a home page, and it may not be worthwhile to store the item in the cache.

When we turned off the Anonymous Search Results Cache in our test environment, under the same load patterns, server response times increased significantly and throughput in page views per second declined. The following graph shows this relationship:

Figure 5: Effect of Anonymous Search Results Cache

By default, Content Search Web Parts are configured to use the Anonymous Search Results Cache. Catalog-Item Reuse Web Parts, which are used on catalog item pages, are not, due to the sparse access patterns these pages generally exhibit. To configure whether an individual Web Part uses the Anonymous Search Results Cache, set the value of the sub-property "TryCache" in the DataProviderJSON property of the Web Part. If the value is "true", the query uses the cache; if the value is "false", the query does not use the cache for anonymous search queries.

## Effect of output cache

Output caching is an effective way to reduce the load on SharePoint Server 2013 in publishing scenarios.
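Before looking at the measurements, it helps to see the basic arithmetic of why any cache with a high hit ratio reduces load: with hit ratio *h*, only the (1 − *h*) fraction of requests reaches the rendering pipeline and its backing services. This sketch is illustrative arithmetic only, not part of the product:

```python
def backend_requests(total_requests: int, hit_ratio: float) -> float:
    """Requests that still reach the page-rendering pipeline after caching."""
    return total_requests * (1.0 - hit_ratio)

# With the 90 percent hit ratio used in the output cache tests later in
# this article, only about 1 in 10 requests must be rendered from scratch.
for hit_ratio in (0.0, 0.5, 0.9):
    served = backend_requests(1000, hit_ratio)
    print(f"hit ratio {hit_ratio:.0%}: {served:.0f} of 1000 requests reach the backend")
```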
For more details about how the output cache works, see Output Caching and Cache Profiles. A SharePoint deployment may benefit from output caching to reduce the load on SharePoint content databases and the Search service application. Here are some example situations:

- You are receiving lots of traffic on some of your pages.
- You are receiving lots of traffic on SharePoint content databases.
- Computers that serve search queries are running with high CPU utilization.

We recommend that you use output caching for very popular pages on your site, such as the site's home page, top-level category pages, and certain item pages that receive large amounts of traffic.

> [!IMPORTANT]
> There is a known issue in SharePoint Server 2013 when pages that have output caching enabled also contain Content Search Web Parts. To avoid this issue in your deployment, install the SharePoint Server 2013 update: March 12, 2013.

The following graph shows some results from our test environment, where we used output caching on the home page and category pages that receive 60 percent of the site traffic.

Figure 6: Effect of output caching for home page and category pages

> [!NOTE]
> Content Search Web Parts have a setting to run in asynchronous mode. Output caching does not apply to the load from asynchronous Content Search Web Parts.

## Usage analytics processing

To have usage analytics information ready to use, SharePoint Server 2013 processes information that's in the usage database. In our topology, analytics processing occurs on the node that hosts the Search Admin node, the Distributed Cache service, and other service applications. For more information, see Overview of analytics processing in SharePoint Server.

We took some analytics processing time measurements by using the cross-site publishing site from our earlier tests. We measured the time that SharePoint Server 2013 takes to process a large number of click events on the pages in the site.
While these results are from a cross-site publishing site, they also apply to sites that use the author-in-place publishing method.

For our tests of usage analytics processing, we generated the following mock events every day for a week: 27.5 million click events spread across 3 million list items and 400,000 users. A Zipf distribution was used so that some items and users have many events while others have fewer. This generated a total of 7.5 million events per day, simulating different users generating different traffic patterns for the site.

We triggered the analysis runs seven times to simulate one week of traffic. We ran the usage analytics job every day on the data that we accumulated over six days, and then measured the time that the seventh day took. The seventh day is the day that takes longest to process, because the complete week's items are processed and the relationship graph is updated. The runtime and disk usage for day 8 would resemble day 1.

The analytics processing did not have a significant impact on the computer that it ran on, and we continued to successfully serve queries and keep content fresh on the search-driven site. The results are summarized in the following table:

| Test Schedule | Update Relationship Graph | Runtime (hours) | Total Peak Disk Usage | Usage Analytics Peak Disk Usage |
|---|---|---|---|---|
| Day 1 | No | 02:35 | 2.65 GB | |
| Day 2 | No | 02:43 | | |
| Day 3 | No | 03:23 | | |
| Day 4 | No | 04:39 | | |
| Day 5 | No | 06:08 | | |
| Day 6 | No | 07:35 | | |
| Day 7 | Yes | 08:29 | 82.4 GB | 4 GB |

The following graph displays the runtimes for the different days:

Figure 7: Runtime hours per day

## Cross-site publishing with authenticated users

SharePoint publishing is commonly used on intranet sites. With SharePoint Server 2013, these sites can also be powered by cross-site publishing. The following sections show some important distinctions to consider when you plan for a cross-site publishing site that uses authenticated users.
Other than the exceptions mentioned in the following sections, the rules that apply to anonymously accessed sites also apply to sites that authenticated users access.

### Lack of Anonymous Search Results Cache

As mentioned in the Anonymous Search Results Cache section earlier, this cache only takes effect for users who access the SharePoint site anonymously. Compared to anonymously accessed sites that use the Anonymous Search Results Cache, the throughput capacity of sites that authenticated users access will be significantly lower. Intranet sites rarely receive loads as high as the ones mentioned in the previous section (up to 100 page views per second); however, this is an important distinction to consider.

Use of the output cache can compensate for the lack of the Anonymous Search Results Cache to a degree in these scenarios. Cross-site publishing sites that expect multiple page views per second should consider enabling the output cache.

> [!IMPORTANT]
> Content Search Web Parts have a setting that, if it is enabled, causes them to run in asynchronous mode. The output cache does not apply to the load from asynchronous Content Search Web Parts.

### Larger search index

Depending on the size of the enterprise that deploys SharePoint Server 2013, intranet deployments will typically index larger numbers of documents. This means that the Search topology required to index those documents will differ from the topology described in the previous section. Refer to Plan search in SharePoint Server to size your SharePoint deployment appropriately.

## Author-in-place publishing

This section provides guidance and results that use SharePoint Server 2013, but it does not detail the different features that affect capacity planning. For details in this area, see Web content management in SharePoint Server.
### Author-in-place publishing with anonymous users

For our tests, we worked with a website that has the following characteristics:

- A website with up to 20,000 article pages, divided into 20 folders of 1,000 pages each, across 50 sites in a single site collection.
- The site uses structured navigation.
- The site generally receives a minimum of 50 to 100 page views per second.
- Traffic patterns hit the following mix of pages:
  - 20 pages that contain a single Content Query Web Part that issues content database queries of varying scopes (20 percent of the traffic)
  - 30 pages that include multiple Content Query Web Parts that issue content database queries of varying scopes (30 percent of the traffic)
  - 1,600 articles with 40 KB of text and two images each (50 percent of the traffic)

The recommended server topology is shown in the following diagram:

Figure 8: Author-in-place publishing test topology

- 1 computer hosting SQL Server
- 1 computer hosting the SharePoint service applications and serving as the front-end web server

### Test lab results

We used the topology shown in the previous diagram in our test lab, using physical computers and a Visual Studio Team System load test.
The following table shows the technical specifications of the computers that we tested:

| Server Components | SharePoint Servers |
|---|---|
| Processors | Intel Xeon CPUs @2.33GHz (2 processors, 8 cores total, 8 threads total) |
| RAM | 24 GB |
| Operating system | Windows Server 2008 R2 Enterprise, 64-bit |
| Number of network adapters | 2 |
| Network adapter speed | 1 Gbps |
| Authentication | None - Anonymous |
| Load balancer type | Windows software load balancer |
| Software version | SharePoint Server 2013 |

| Server Components | Database server |
|---|---|
| Processors | Intel Xeon CPUs MP7130M @2.79GHz (2 processors, 8 cores total, 16 threads total) |
| RAM | 16 GB |
| Operating system | Windows Server 2008 R2 Enterprise, 64-bit |
| Disk array | 2 x Dell PERC 5E |
| Number of network adapters | 1 |
| Network adapter speed | 1 Gbps |
| Authentication | NTLM |
| Software version | Microsoft SQL Server 2008 R2 SP1 |

The following table shows our results for a 10-minute run:

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Number of VSTS users | 5 | 15 |
| Server response time, 50th percentile | 69 ms | 112 ms |
| Server response time, 95th percentile | 92 ms | 221 ms |
| Page views per second | 57 | 93 |
| Average CPU (application server and front-end web server) | 55% | 97% |
| Average CPU (SQL Server) | 7% | 9% |
| Peak memory usage (application server and front-end web server) | 8.9 GB | 8.9 GB |

### Effect of output cache

Output caching is an effective way to reduce the load on SharePoint Server 2013 in publishing scenarios. For more information, see Plan for caching and performance in SharePoint Server. The following table shows our results for a 10-minute run with the output cache enabled and a 90 percent hit ratio:

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Number of VSTS users | 5 | 15 |
| Server response time, 50th percentile | 2 ms | 2 ms |
| Server response time, 95th percentile | 74 ms | 88 ms |
| Page views per second | 190 | 418 |
| Average CPU (application server and front-end web server) | 58% | 85% |
| Average CPU (SQL Server) | 5% | 7% |
| Peak memory usage (application server and front-end web server) | 9.2 GB | 9.4 GB |

The test results show that using the output cache can significantly increase the throughput of a SharePoint publishing site and reduce server response times. For requests served from the output cache, the response times are almost instant. The following graph summarizes our testing results:

Figure 9: Effect of output caching with 90% cache hit ratio

### Effect of managed navigation

In SharePoint Server 2013, publishing sites can also use managed navigation. For details about how to set this up, see Overview of managed navigation in SharePoint Server. We ran the same set of tests for our test site using managed navigation as we used for the structured navigation tests. Our tests show that there is no significant difference in performance between managed navigation and structured navigation.

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Number of VSTS users | 5 | 15 |
| Server response time, 50th percentile | 70 ms | 111 ms |
| Server response time, 95th percentile | 95 ms | 215 ms |
| Page views per second | 56 | 94 |
| Average CPU (application server and front-end web server) | 54% | 97% |
| Average CPU (SQL Server) | 7% | 9% |
| Peak memory usage (application server and front-end web server) | 8 GB | 8 GB |

The following graph compares the two types of navigation for the same site:

Figure 10: Managed navigation versus structured navigation

### Effect of adding computers (scaling out)

If you find that you need more throughput from a SharePoint deployment, scaling out (increasing the number of computers that host SharePoint Server 2013) is an option to consider.
The following graph shows how throughput increases as we add more computers to the farm:

Figure 11: Effect on throughput of adding front-end web servers

In our tests, we increased the load on the servers running SharePoint Server 2013 for each computer that was added, so that the server response times stayed approximately the same (around 11 milliseconds for the Green Zone, around 250 milliseconds for the Red Zone).

### Author-in-place publishing sites with authenticated users

The SharePoint publishing feature is commonly used on intranets, where the users who access a site are authenticated. This section shows our tests that used authenticated users and their effects. The following table shows the test results for author-in-place publishing sites that authenticated users accessed by using claims-based authentication with NTLM. Note that these tests used hardware identical to the tests in the previous section.

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Number of VSTS users | 5 | 15 |
| Server response time, 50th percentile | 76 ms | 107 ms |
| Server response time, 95th percentile | 103 ms | 194 ms |
| Page views per second | 54 | 100 |
| Average CPU (application server and front-end web server) | 50% | 97% |
| Average CPU (SQL Server) | 6% | 9% |
| Peak memory usage (application server and front-end web server) | 9.5 GB | 9.5 GB |

The numbers show no significant difference between anonymous and authenticated requests in terms of server response times and throughput. The following graph compares the two types of requests for the same site:

Figure 12: Anonymous requests versus authenticated requests

### Effect of output cache in authenticated scenarios

Authenticated requests to the server require a round trip to the content database to make sure that the account that is accessing the content has permission to view it. This means that the output caching performance characteristics of authenticated sites differ from those of anonymous sites.
The following table shows the results we received for a 10-minute run with the output cache enabled and a 90 percent cache hit ratio:

| Test Features | Green Zone | Red Zone |
|---|---|---|
| Number of VSTS users | 6 | 18 |
| Server response time, 50th percentile | 17 ms | 29 ms |
| Server response time, 95th percentile | 87 ms | 118 ms |
| Page views per second | 114 | 236 |
| Average CPU (application server and front-end web server) | 50% | 97% |
| Average CPU (SQL Server) | 7% | 10% |
| Peak memory usage (application server and front-end web server) | 9.9 GB | 10 GB |

The following graph summarizes these results:

Figure 13: Effect of authenticated output caching

## See also

#### Concepts

- Web content management in SharePoint Server
- Configure web content management solutions in SharePoint Server
- Configure cache settings for a web application in SharePoint Server
- Plan for caching and performance in SharePoint Server
Estimate capacity and performance for Web Content Management (SharePoint Server 2013)
---
title: "Create and run queries in the eDiscovery Center"
description: "Once you have defined your sources, and placed them on hold if necessary, you can run queries to narrow down and extract exactly the content you need for a particular case."
ms.reviewer:
ms.author: robmazz
author: robmazz
manager: laurawi
ms.date: 2/12/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- tier1
- purview-compliance
- M365-collaboration
- ediscovery
---

# Create and run queries in the eDiscovery Center

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

Once you have defined your sources, and placed them on hold if necessary, you can run queries to narrow down and extract exactly the content you need for a particular case. Efficient queries make it much easier for you and other people involved in the case to manage the content, because they reduce the overall volume and help ensure that the content you deliver is more likely to be relevant.

Before creating queries, you should add content sources to your case. For more information about working with content sources, see Add content to a case and place sources on hold in the eDiscovery Center.

> [!NOTE]
> A query can contain a maximum of 100 SharePoint sources and 500 keywords.

By default, a query searches across all content sources. If you don't need to search them all, you can choose which discovery sets or sources a query searches, which can make your queries run faster. You can also refine your queries in other ways. For more information, see Searching and using keywords in the eDiscovery Center.

1. If your case is not already open, in an eDiscovery Center, click Cases, and then open the case you want to create queries for. The case should already have content sources, such as websites.
2. In the Search and Export section, under Queries, click New Item.
3. Type a descriptive name for your query.
4. In the Query box, type the keywords you want to use to narrow down your query. Find tips for writing queries in the See Also section.
5. To narrow down content by a date range, enter the Start Date and End Date.

   > [!NOTE]
   > If you type the dates in the Start Date and End Date boxes, use the format mm/dd/yyyy; for example, you would use 03/01/2013 to specify March 1, 2013. Use the mm/dd/yyyy format even if the regional settings on the local computer are configured with a different format, such as dd/mm/yyyy. Alternatively, select the start and end dates by using the date picker.

6. To limit results to the author of a document or list item, or to a specific sender of e-mail messages, type the names or e-mail addresses in the Author/Sender box.
7. If you have multiple sources and discovery sets, but don't need to search them all, click Modify Query Scope, and then specify the discovery sets or content sources you want.
8. To narrow down your query by specific types of content, click the SharePoint tabs, and then select the check boxes for the types of content you want. For example, you can select only PowerPoint slides for SharePoint.
9. To analyze or further refine your query, click Advanced Query Options, and do one or more of the following:
   - To examine the syntax and structure of your query, view the SharePoint Query sections and the table of sources that shows filters, queries, and refiners.
10. When you are ready to run your query, click Search. The results are ranked based on relevance, such as how frequently a search term appears.

> [!NOTE]
> Once you add queries or content sources to an eDiscovery case, changing the regional settings for the site is not supported.

## Add more content sources while creating a query

1. On the Query: New Item page, in the Sources section, click Modify Query Scope.
2. In the dialog that appears, click All case content.
3. Click Add location for SharePoint content.
4. Specify the person's website you want to add.
5. Click OK.
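As a side note on the date boxes described earlier: the mm/dd/yyyy requirement is strict regardless of regional settings. The following snippet (purely illustrative, outside the product) shows what a date in the accepted format looks like when parsed:

```python
from datetime import datetime

def parse_ediscovery_date(text: str) -> datetime:
    """Parse a date in the mm/dd/yyyy format the eDiscovery query form expects."""
    return datetime.strptime(text, "%m/%d/%Y")

# 03/01/2013 means March 1, 2013 -- not January 3.
d = parse_ediscovery_date("03/01/2013")
print(d.year, d.month, d.day)  # 2013 3 1
```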
> [!NOTE]
> If you update a query and rerun it, only the first page of the new results will be refreshed. If you are viewing multiple pages of query results and you are not viewing the first page, the page will not be refreshed with the new results.

## Find more information about eDiscovery

For more information about eDiscovery cases, see the following articles:

- Scenario: eDiscovery in SharePoint Server 2013 and Exchange Server 2013
- Plan and manage cases in the eDiscovery Center
- Add content to a case and place sources on hold in the eDiscovery Center
- Searching and using keywords in the eDiscovery Center
- Export content and create reports in the eDiscovery Center
---
title: "Plan for managed metadata in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 7/6/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: conceptual
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: e580fcae-b768-4b81-afda-c037fbd7bd6d
description: "Learn about the decisions you need to make when planning for the managed metadata service in SharePoint Server."
---

# Plan for managed metadata in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

With managed metadata in SharePoint Server, you can create a unified taxonomy of terms that you can use throughout your SharePoint farm. In this article, we walk through the configuration decisions that you need to make before you configure the managed metadata service application in SharePoint Server.

Before you configure managed metadata, it's important to understand what managed metadata is and how it works in SharePoint. Before you read this article, be sure to read Introduction to managed metadata.

## Overview of the managed metadata service in SharePoint Server

In SharePoint Server, managed metadata is implemented through a service application and the Managed Metadata Web Service, which runs on the Application and Front-end server roles. You can create multiple managed metadata service applications if you need separate term sets for specialized use. Each service application has its own database containing the term sets, terms, keywords, and so on.

Connections between the managed metadata service application and your web applications are handled by managed metadata connections. Using these connections, you can map different managed metadata service applications to different web applications if needed, and configure features around keyword and term set creation policy.
If you have multiple SharePoint Server farms and you want to share a managed metadata term store between them, you can publish the service application in one farm and have the others connect to it. (This requires a trust relationship between the farms.)

The managed metadata service application also provides publishing functionality for content type hubs, which are site collections where you can create standard content types and share them among other site collections.

Let's look at the decisions you need to make in these areas before you configure managed metadata.

## Isolating managed metadata term sets

Within a term store, you can create a variety of different term sets, and these term sets can be organized in groups. Groups offer some security isolation by allowing you to specify who can manage or contribute to them; however, users in general can access and use the terms themselves.

If you want to restrict a term set to a specific group of users, there are two options. First, you can allow users to create term sets that are local to a site collection. These term sets are not available in other site collections. (We talk more about this option in the next section.) Second, you can provide even greater isolation for a term set by housing it in a separate managed metadata service application. This gives you a completely separate term store in a separate database, which you can restrict to a particular set of users. If you have sensitive areas, such as legal or human resources, where you want a term store with limited access, consider this second approach. Users of this term store can also be granted access to your main term store.

**Decision**

Before you configure managed metadata in SharePoint, decide if you need more than one managed metadata service application for separate term stores. (You can add additional service applications in the future if needed.)
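To make the isolation trade-offs concrete, here is a small, purely illustrative model of the structure described above: each managed metadata service application owns its own term store (and database), a term store contains groups, and groups carry the manager and contributor lists that provide the security boundary. None of these class or field names come from the SharePoint object model.

```python
from dataclasses import dataclass, field

@dataclass
class TermSetGroup:
    name: str
    managers: set = field(default_factory=set)      # who can manage the group
    contributors: set = field(default_factory=set)  # who can add terms
    term_sets: list = field(default_factory=list)

@dataclass
class TermStore:
    """One term store per managed metadata service application (own database)."""
    service_application: str
    groups: list = field(default_factory=list)

# A broadly readable main store, plus a separate service application that
# isolates a sensitive HR term store behind its own access list.
main = TermStore("Managed Metadata Service", groups=[
    TermSetGroup("Corporate", managers={"taxonomist"}, term_sets=["Departments"]),
])
hr = TermStore("Managed Metadata Service - HR", groups=[
    TermSetGroup("HR", managers={"hr-admin"}, contributors={"hr-team"},
                 term_sets=["Job Levels"]),
])

print([store.service_application for store in (main, hr)])
```

The point of the second store is that its database, and therefore everything in it, can be restricted to a particular set of users, which group-level permissions alone cannot achieve.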
Using local term sets Beyond the global term sets that are available across all of your site collections, you can create term sets that are local to a site collection. When users create a managed metadata column for a SharePoint list, they have the option of creating a new term set rather than using global term sets. This can be useful if you want to use managed metadata in ways that don't apply to the entire organization, such as metadata for a particular project or event. By using local term sets, different teams and business groups can create their own managed metadata without needing to request formal updates to a global term set, and you can keep the global term sets focused on core areas of your business. If you don't want to allow users to create their own term sets that are local to site collections, you can disable this feature when you configure the managed metadata connection. Decision Before you configure managed metadata in SharePoint, decide if you want to allow local term sets to be created. (You can have different setting for each service application that you create.) Managed metadata keywords and folksonomy When you use a managed metadata column in a SharePoint list, users who fill out that column have to use one of the available values defined in the term store that the column is connected to. This is one of the primary advantages of managed metadata - you're certain the value was chosen from a predefined list. However, SharePoint managed metadata also supports tagging functionality where users can tag SharePoint items such as documents with keywords that they create that aren't in the existing term store. These keywords accumulate in a list in the term store, and you can manage them by consolidating them and adding them to existing or new term sets as needed. This allows a folksonomy approach where users can create new keywords as needed without having to go through a formal process to update a particular term set. 
If you want to take a more formal approach to managed metadata, you can disable this feature when you configure the managed metadata connection. In this case, users will have to pick from the existing terms in the term store and add those values to specific fields that you create for that purpose. Decision Before you configure managed metadata in SharePoint, decide if you want to allow tagging of SharePoint items with keywords that are not in the term store. Publishing managed metadata content types In addition to sharing managed metadata, you can also use the managed metadata service to share content types. By specifying a site collection as the content type hub when you configure the managed metadata service application, you can share all content types in the site collection's content type gallery, making them available to other site collections. Decision Before you configure managed metadata in SharePoint, decide if you want to create a content type hub. (You can also add one later.) See also For the SharePoint in Microsoft 365 version of this article, see Introduction to managed metadata.
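If you decide to designate a content type hub, the hub URL can be set on an existing managed metadata service application with PowerShell. This is a hedged sketch; the service application name and hub site URL are illustrative placeholders:

```powershell
# Point an existing managed metadata service application at a content type hub.
# The service application name and hub URL are illustrative placeholders.
Set-SPMetadataServiceApplication -Identity "Managed Metadata Service" `
    -HubUri "https://sharepoint/sites/contenttypehub"
```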
title: "Plan document versioning, content approval, and check-out controls in SharePoint Server" ms.reviewer: ms.author: toresing author: tomresing manager: serdars ms.date: 3/1/2018 audience: ITPro f1.keywords: - NOCSH ms.topic: conceptual ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.assetid: b607e000-9436-4cbb-b6aa-9e76d70a4314 description: "Learn how to use versioning, content approval, and check-out in SharePoint Server to control document versions throughout their life cycle." Plan document versioning, content approval, and check-out controls in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] This article describes how to plan to use versioning, content approval, and check-out in SharePoint Server to control document versions throughout their life cycle. About versioning, content approval, and check-outs SharePoint Server includes the following features that can help you control documents in a document library: Versioning is the method by which successive iterations of a document are numbered and saved. Content approval is the method by which site members who have approver permissions control the publication of content. Check-out and Check-in are the methods by which users can better control when a new version of a document is created and also comment on changes that they made when they check a document in. You configure settings for the content governance features discussed in this article in document libraries. To share these settings across libraries in your solution, you can create document library templates that include your content governance settings. This makes sure that new libraries will reflect your content governance decisions. For more information about versioning, see Enable and configure versioning for a list or library. Plan versioning The default versioning control for a document library depends on the site collection template.
However, you can configure versioning control for a document library depending on your particular requirements. Each document library can have a different versioning control that best suits the kind of documents in the library. SharePoint Server has three versioning options:

- **No versioning** Specifies that no earlier versions of documents are saved. When versioning is not being used, earlier versions of documents are not retrievable, and document history is also not retained because comments that accompany each iteration of a document are not saved. Use this option on document libraries that contain unimportant content or content that will never change.

- **Create major versions** Specifies that numbered versions of documents are retained by using a simple versioning scheme (such as 1, 2, 3). To control the effect on storage space, you can specify how many earlier versions to keep, counting back from the current version. In major versioning, every time that a new version of a document is saved, all users who have permissions to the document library will be able to view the content. Use this option when you do not want to differentiate between draft versions of documents and published versions. For example, in a document library that is used by a workgroup in an organization, major versioning is a good choice if everyone on the team must be able to view all iterations of each document.

- **Create major and minor (draft) versions** Specifies that numbered versions of documents are retained by using a major and minor versioning scheme (such as 1.0, 1.1, 1.2, 2.0, 2.1). Versions ending in .0 are major versions and versions ending with non-zero extensions are minor versions. Previous major and minor versions of documents are saved together with current versions. To control the effect on storage space, you can specify how many previous major versions to keep, counting back from the current version.
You can also specify how many major versions being kept should include their respective minor versions. For example, if you specify that minor versions should be kept for two major versions and the current major version is 4.0, then all minor versions starting at 3.1 will be kept. In major and minor versioning, any user who has read permissions can view major versions of documents. You can specify which users can also view minor versions. Typically, we recommend that you grant permissions to view and work with minor versions to the users who can edit items, and restrict users who have read permissions to viewing only major versions. Use major and minor versioning when you want to differentiate between published content that can be viewed by an audience and draft content that is not yet ready for publication. For example, on a human resources Web site that describes organizational benefits, use major and minor versioning to restrict employees' access to benefits descriptions while the descriptions are being revised. Plan content approval Use content approval to formalize and control making content available to an audience. For example, an enterprise that publishes content as one of its products or services might require a legal review and approval before publishing the content. A document draft awaiting content approval is in the Pending status. When an approver reviews the document and approves the content, it becomes available for viewing by users who have read permissions. A document library owner can enable content approval for a document library and, optionally, can associate a workflow with the library to run the approval process. The way that documents are submitted for approval varies depending on the versioning settings in the document library: No versioning If versioning is not being used and changes to a document are saved, the document's status becomes Pending. 
SharePoint Server keeps the earlier version of the document so that users who have read permissions can still view it. After the pending changes are approved, the new version of the document is made available for viewing by users who have read permissions and the earlier version is not retained. If versioning is not being used and a new document is uploaded to the document library, it is added to the library in the Pending status and is not viewable by users who have read permissions until it is approved. Create major versions If major versioning is being used and changes to a document are saved, the document's status becomes Pending and the previous major version of the document is made available for viewing by users who have read permissions. After changes to the document are approved, a new major version of the document is created and made available to users who have read permissions, and the previous major version is saved to the document's history list. If major versioning is being used and a new document is uploaded to the document library, it is added to the library in the Pending status and is not viewable by users who have read permissions until it is approved as version 1. Create major and minor (draft) versions If major and minor versioning is being used and changes to a document are saved, the author has the choice of saving a new minor version of the document as a draft or creating a new major version, which changes the document's status to Pending. After the changes to the document are approved, a new major version of the document is created and made available to users who have read permissions. In major and minor versioning, both major and minor versions of documents are kept in a document's history list. If major and minor versioning is being used and a new document is uploaded to the document library, it can be added to the library either in the Draft status as version 0.1 or the author can immediately request approval. 
In this case, the document's status becomes Pending. Plan check-out and check-in You can require users to check out documents from a document library before they edit the documents. The benefits of requiring check-out and check-in include the following: Better control of when document versions are created. When a document is checked out, the author can save the document without checking it in. Other users of the document library will be unable to see these changes, and a new version is not created. A new version (visible to other users) is only created when an author checks in a document. This gives the author more flexibility and control. Better capture of metadata. When a document is checked in, the author can write comments that describe the changes that were made to the document. This creates an ongoing historical record of the changes that were made to the document. If your solution requires users to check in and check out documents to edit them, you can use features in Office client applications that support these actions. Users can check out documents, undo check-outs, and check in documents from Office client applications. When a document is checked out, it is locked for exclusive editing by the user. When the user saves edits to this file, the changes are uploaded and saved to the server. The changes are private to the user and not visible to others. When the user is ready to check in the document, the latest changes are made visible to others and published. From Office client applications, users can also choose to leave checked-out documents on the server by changing content editing options. [!NOTE] You should not check out a document when you use the co-authoring functionality.
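Versioning, content approval, and required check-out are all per-library settings, so the planning decisions in this article can also be applied with the SharePoint server object model. The following is a hedged sketch, not a prescribed procedure; the site URL and library title are illustrative placeholders:

```powershell
# Configure versioning, content approval, and required check-out on one library.
# Run in the SharePoint Management Shell; URL and library title are placeholders.
$web  = Get-SPWeb "https://sharepoint/sites/teamsite"
$list = $web.Lists["Documents"]

$list.EnableVersioning            = $true   # keep major versions
$list.EnableMinorVersions         = $true   # also keep minor (draft) versions
$list.MajorVersionLimit           = 10      # keep the last 10 major versions
$list.MajorWithMinorVersionsLimit = 2       # keep minors for the last 2 majors
$list.EnableModeration            = $true   # require content approval
$list.ForceCheckout               = $true   # require check-out before editing

$list.Update()
$web.Dispose()
```

The retention limits shown here implement the storage-control guidance above: older versions beyond the configured counts are trimmed as new versions are created.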
title: "Configure hybrid SharePoint taxonomy and hybrid content types" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 01232018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - Ent_O365_Hybrid - IT_Sharepoint_Server - IT_SharePoint_Hybrid_Top - Strat_SP_gtc - M365-collaboration - SPO_Content ms.custom: admindeeplinkSPO ms.assetid: 0809325c-9b99-46bf-b98d-6d2f5e3d2a4b description: "In this article, we look at how to configure hybrid SharePoint taxonomy and hybrid content types." Configure hybrid SharePoint taxonomy and hybrid content types [!INCLUDEappliesto-2013-2016-2019-SUB-SPO-md] In this article, we look at how to configure hybrid SharePoint taxonomy and hybrid content types. Hybrid SharePoint taxonomy allows you to have a shared taxonomy between SharePoint Server and SharePoint in Microsoft 365. Hybrid content types allow you to have a shared set of content types between SharePoint Server and SharePoint in Microsoft 365. Before you follow the procedures in this article, be sure to read Plan hybrid SharePoint taxonomy and hybrid content types. This feature is available in SharePoint Server 2013 and SharePoint Server 2016 with the following SharePoint updates: Hybrid taxonomy requires the November 2016 public update or later. Hybrid content types require the June 2017 public update or later. The functionality and configuration procedures are the same for both versions of SharePoint Server. Video demonstration This video shows a walkthrough of configuring hybrid taxonomy and hybrid content types. 
Video: Configure hybrid taxonomy and content types [!VIDEO https://www.microsoft.com/videoplayer/embed/de549889-8831-4c29-a3f4-ffe8104dc0a5?autoplay=false] Migrate your taxonomy from SharePoint Server If you have an existing taxonomy in SharePoint Server, the best practice is to copy any term groups you want to be part of the shared taxonomy to SharePoint in Microsoft 365 before you configure hybrid SharePoint taxonomy. You can migrate more taxonomy groups from SharePoint Server to SharePoint in Microsoft 365 to add to the shared taxonomy later, but if you do, you may need to run the Hybrid Configuration Wizard again to include them in the shared taxonomy. The migration process copies taxonomy groups from SharePoint Server to SharePoint in Microsoft 365 by using the Copy-SPTaxonomyGroups PowerShell cmdlet. Active Directory groups While the copy process preserves most user information associated with term sets - such as owner and stakeholders - note that the copy process doesn't work with Active Directory groups. If you use Active Directory groups in your term sets, there are two options for copying your taxonomy groups: You can replace the Active Directory groups with individual users within your taxonomy groups. The individual users are copied when you copy your taxonomy groups. You can copy your taxonomy groups with the Active Directory groups in place. You'll see a PowerShell warning and the Active Directory group assignments are lost if you proceed. You can then assign a Microsoft 365 group in place of the Active Directory group after you've copied the taxonomy groups. Copying taxonomy groups Copying taxonomy groups is done using the Copy-SPTaxonomyGroups PowerShell cmdlet. You need the following information to run the cmdlet: The name of your managed metadata service application in SharePoint Server. The URL of the SharePoint Server site where your taxonomy store is located.
The URL of the SharePoint in Microsoft 365 site where your term store is located (for example, https://contoso.sharepoint.com). Taxonomy groups in SharePoint Server to be copied to SharePoint in Microsoft 365. Your Microsoft 365 global admin credentials.

[!NOTE] If you receive an HTTP 400 error when attempting to use the Copy-SPTaxonomyGroups cmdlet with correct credentials, switch to a cloud-based global admin instead of an Active Directory synchronized account.

A list of the taxonomy groups that you want to copy. Run the cmdlet as a farm admin from one of the servers in your SharePoint Server farm. Use the following syntax to copy your taxonomy groups:

```powershell
$credential = Get-Credential
Copy-SPTaxonomyGroups -LocalTermStoreName "" -LocalSiteUrl "" -RemoteSiteUrl "SharePointOnlineSiteURL" -GroupNames "Group1","Group2" -Credential $credential
```

For example:

```powershell
$credential = Get-Credential
Copy-SPTaxonomyGroups -LocalTermStoreName "Managed Metadata Service" -LocalSiteUrl "https://sharepoint" -RemoteSiteUrl "https://contoso.sharepoint.com" -GroupNames "Engineering","Marketing" -Credential $credential
```

You can also run Copy-SPTaxonomyGroups and you'll be prompted for the needed parameters.

Copying content types If you're planning to use hybrid content types, you can copy your SharePoint Server content types to SharePoint in Microsoft 365 by using the Copy-SPContentTypes cmdlet. For example:

```powershell
Copy-SPContentTypes -LocalSiteUrl http://localsite -LocalTermStoreName "managed metadata service application proxy" -RemoteSiteUrl https://contoso.sharepoint.com -ContentTypeNames @("ContentTypeA", "ContentTypeB") -Credential $credential
```

The content types are copied into https://contoso.sharepoint.com/sites/contentTypeHub. If this site doesn't exist, it's created for you, and the Site Collection Feature Content Type Syndication Hub is enabled. The site URL is hard coded and can't be changed.
Configure hybrid SharePoint taxonomy Configuration of hybrid SharePoint taxonomy is done using the Hybrid Configuration Wizard in the SharePoint admin center. The Hybrid Configuration Wizard has many prerequisites. Be sure to read Hybrid Configuration Wizard in the SharePoint admin center before you follow the procedures in this section. We also recommend that you back up your term store before you proceed. Make the timer service account a term store admin For taxonomy replication to work properly, the account that runs the SharePoint Timer Service must be a term store admin in SharePoint Server. (To find this account, check the Log On As account for the SharePoint Timer Service on your server.) Use the following procedure to add this account as a term store administrator. To add a term store admin In the Central Administration website, under Application Management, select Manage service applications. Select the link for the Managed Metadata service application. Add the timer service account to the Term Store Administrators box, and then select Save. Configure hybrid SharePoint taxonomy using the Hybrid Configuration Wizard The next step is to configure hybrid SharePoint taxonomy by running the Hybrid Configuration Wizard in the SharePoint admin center. To configure hybrid SharePoint taxonomy Sign in a server in your SharePoint Server farm as the farm administrator. From your SharePoint Server computer, open a web browser. Go to More features in the SharePoint admin center, and sign in with an account that has admin permissions in Microsoft 365. Under Hybrid picker, select Open. Follow the wizard, and when prompted, select Hybrid Taxonomy. Provide the following information when prompted: The URL of your SharePoint Server root site (for example, https:sharepoint). The name of your SharePoint Server managed metadata service application (for example, Managed Metadata Service). 
The names of the taxonomy groups that you want to replicate (for example, Engineering;Marketing). If you don't specify groups, then all groups except system and special groups are configured for replication. After you've configured hybrid SharePoint taxonomy, the taxonomy replication timer job will poll SharePoint in Microsoft 365 on a daily basis for changes to the taxonomy. Running the taxonomy replication timer job Hybrid SharePoint taxonomy uses a timer job called Taxonomy Groups Replication to copy taxonomy information from SharePoint in Microsoft 365 to SharePoint Server. The SharePoint in Microsoft 365 APP Identity is used to authenticate to Microsoft 365. By default, this timer job replicates taxonomy on a daily basis. Like other timer jobs in SharePoint in Microsoft 365, you can configure the Taxonomy Groups Replication job to run on a different schedule, or you can run it manually, by searching for it in the timer job list in Central Administration. Stopping replication of taxonomy groups If at any time you want to stop taxonomy replication between SharePoint in Microsoft 365 and SharePoint Server, you can do so by using PowerShell. The Stop-SPTaxonomyReplication cmdlet stops taxonomy replication. For example: $credential = Get-Credential Stop-SPTaxonomyReplication -Credential $credential The Stop-SPContentTypeReplication cmdlet stops content type replication: Stop-SPContentTypeReplication If you wish to reenable taxonomy replication again, you must run the Hybrid Configuration Wizard again. If you simply want to reconfigure which taxonomy groups you're replicating, there's no need to stop replication. You can just run the Hybrid Configuration Wizard again and specify the new taxonomy groups that you want to replicate. See also Other Resources TechNet Forums - Hybrid Taxonomy
title: "Hybrid self-service site creation" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 9/12/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - Ent_O365_Hybrid - IT_Sharepoint_Server - IT_SharePoint_Hybrid_Top - Strat_SP_gtc - M365-collaboration - SPO_Content ms.custom: admindeeplinkSPO ms.assetid: 27d3e6b8-7922-4015-a5fd-8c240eaa6357 description: "Hybrid self-service site creation redirects the default self-service site creation page in SharePoint Server to the SharePoint in Microsoft 365 Group Creation page. By configuring this feature, you can help your users to create their sites in SharePoint in Microsoft 365 instead of SharePoint Server." Hybrid self-service site creation [!INCLUDEappliesto-2013-2016-2019-SUB-SPO-md] Hybrid self-service site creation redirects the default self-service site creation page in SharePoint Server (_layouts/15/scsignup.aspx or _layouts/16/scsignup.aspx) to the SharePoint in Microsoft 365 Group Creation page. By configuring this feature, you can help your users to create their sites in SharePoint in Microsoft 365 instead of SharePoint Server. Hybrid self-service site creation respects your hybrid audience settings. If you use a hybrid audience, members of the hybrid audience will be redirected to SharePoint in Microsoft 365 for self-service site creation, while on-premises only users will continue to be directed to self-service site creation in SharePoint Server. This setting can be configured independently for each web application in your farm. Hybrid self-service site creation is available in SharePoint Server 2013 with the March 2017 PU. Hybrid self-service site creation is available in SharePoint Server 2016 with the November 2017 PU.
Configure hybrid self-service site creation using the Hybrid Configuration Wizard Configuring hybrid self-service site creation is done by using the Hybrid Configuration Wizard in the SharePoint admin center. [!NOTE] If you've previously configured other hybrid features with the Hybrid Configuration Wizard, you can go directly to the SharePoint Central Administration website to manage hybrid self-service site creation. In this case, the hybrid connection has been made and there's no need to run the Hybrid Configuration Wizard again. To configure hybrid self-service site creation Log on to a server in your SharePoint Server farm as the farm administrator. From your SharePoint Server computer, open a web browser. Go to More features in the SharePoint admin center, and sign in with an account that has admin permissions in Microsoft 365. Under Hybrid picker, select Open. On the hybrid picker page, select Hybrid Picker. Follow the wizard, and when prompted, select Hybrid self-service site creation. When prompted, select the web application with which you want to use hybrid self-service site creation. When the Hybrid Configuration Wizard completes, hybrid self-service site creation will be enabled for the web application that you selected. Manage hybrid self-service site creation Once you have configured hybrid self-service site creation, you can manage it in the SharePoint Central Administration website. To manage hybrid self-service site creation In Central Administration, select Application Management. On the Application Management page, under Site Collections, select Configure self-service site creation. In the Web Application section, select the web application where you want to manage hybrid self-service site creation, and then select or clear the Create Site Collections in SharePoint check box. 
[!NOTE] While hybrid users of this web application will be redirected to SharePoint in Microsoft 365 for self-service site creation, the other settings on this page continue to apply to any on-premises only users. Select OK.
ms.date: 01/13/2020 title: "Remove SharePoint Server hybrid scenarios" ms.reviewer: troys ms.author: serdars author: SerdarSoysal manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - Ent_O365_Hybrid - IT_Sharepoint_Server - M365-collaboration description: "Removing SharePoint hybrid scenarios in SharePoint Server" Removing SharePoint hybrid scenarios [!INCLUDEappliesto-2013-2016-2019-SUB-SPO-md] This guide will walk you through removing SharePoint hybrid functionality from your SharePoint Server farm. Cloud Hybrid Search Cloud Hybrid Search may be removed by deleting the Search Service Application. In the Central Administration website, select Application Management. In the Application Management page, select Manage service applications. In the Service Application page, highlight your Cloud Hybrid Search Service Application. The name of the Service Application may vary, but the Type will be Search Service Application. [!NOTE] The Type is identical to the standard SharePoint Search Service Application. On the ribbon, select Delete. You may then create a new non-Cloud Search Service Application. For info about how to create and manage your Search Service Application, see the SharePoint Server documentation in Search. OneDrive and sites After you have configured OneDrive and Sites hybrid, you can manage it in the SharePoint Central Administration website. In the Central Administration website, select Microsoft 365. On the Microsoft 365 page, select Configure hybrid OneDrive and Sites features. On the Configure hybrid OneDrive and Sites features page, under Select hybrid features, select None, and then select OK. Setting the option to None also removes the Hybrid app launcher feature. SharePoint hybrid taxonomy and hybrid content types See Stopping replication of taxonomy groups.
Hybrid self-service site creation See Manage hybrid self-service site creation. Removing the Azure Access Control Service Application Proxy and SharePoint in Microsoft 365 Application Principal Management Service Application Proxy The final step to removing hybrid is to delete the Azure Access Control Service Application Proxy and SharePoint Application Principal Management Service Application Proxy created by the Hybrid Configuration Wizard. In the Central Administration website, select Application Management. In the Application Management page, select Manage service applications. In the Service Applications page, highlight the Service Application named ACS. On the ribbon, select Delete. In the Service Applications page, highlight the Service Application named SharePoint App Management Proxy. On the ribbon, select Delete. Perform an iisreset on all SharePoint Servers in the farm.
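The proxy deletions described above can also be scripted. This sketch assumes the proxies still carry the display names created by the Hybrid Configuration Wizard ("ACS" and "SharePoint App Management Proxy"); verify the names in your farm before running it:

```powershell
# Remove the hybrid service application proxies, then reset IIS.
# Display names are assumed to match those created by the wizard.
Get-SPServiceApplicationProxy |
    Where-Object { $_.DisplayName -in "ACS", "SharePoint App Management Proxy" } |
    ForEach-Object { Remove-SPServiceApplicationProxy -Identity $_ -Confirm:$false }

iisreset
```

Run the script on one server, then repeat the iisreset on the remaining SharePoint servers in the farm.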
title: "Account permissions and security settings in SharePoint Servers" ms.reviewer: wesleywu ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 10262023 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top - SP2019 ms.assetid: 55b99d80-3fa7-49f0-bdf4-adb5aa959019 description: "Learn about the permissions and security settings to use with a deployment of SharePoint Server." Account permissions and security settings in SharePoint Servers [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] This article describes SharePoint administrative and services account permissions for the following areas: Microsoft SQL Server, the file system, file shares, and registry entries. [!IMPORTANT] Do not use service account names that contain the symbol $ with the exception of using a Group Managed Service Account for SQL Server. Learn more about SharePoint admin role in Microsoft 365. About account permissions and security settings in SharePoint Servers The SharePoint Products Configuration Wizard (Psconfig) and the Farm Configuration Wizard, both of which are run during a complete installation, configure many of the SharePoint baseline account permissions and security settings. Service account recommendations The following sections describe recommendations on SharePoint Service accounts. Service account recommendations Microsoft recommends using a minimal number of Service Application Pool accounts in the farm. This recommendation is to reduce memory usage and increase performance while maintaining the appropriate level of security. Use an elevated, personally identifiable account for SharePoint installation, maintenance, and upgrades. This account will hold the roles required as outlined in SharePoint Farm Administrator account. 
Each SharePoint admin should use a separate account so that their activity performed on the farm is clearly identified. If possible, use a security group, SharePoint Farm Administrators Groups, to unify all individual SharePoint Farm Administrator accounts and to grant permissions as outlined in SharePoint Farm Administrator account. This usage of a security group simplifies the management of the SharePoint Farm Administrator accounts significantly. The SharePoint Farm Service account should only run the SharePoint Timer service, SharePoint Insights (if applicable), the IIS Application Pools for Central Administration, SharePoint Web Services System (used for the topology service), and SecurityTokenServiceApplicationPool (used for the Security Token Service). A single account should be used for all Service Applications, named Service Application Pool account. This usage of a single account allows the administrator to use a single IIS Application Pool for all Service Applications. In addition, this account should run the following Windows Services: SharePoint Search Host Controller, SharePoint Server Search, and Distributed Cache (AppFabric Caching Service). A single account should be used for all Web Applications, named Web Application pool account. This usage of a single account allows the administrator to use a single IIS Application Pool for all Web Applications, except the Central Administration Web Application which is run by the SharePoint farm service account. Except for the Claims to Windows Token Service account, no Service Application Pool account should have Local Administrator access to any SharePoint server, nor any elevated SQL Server role, for example, the sysadmin fixed role. The SharePoint Farm Administrator account will require the dbcreator and securityadmin fixed roles unless you pre-provision SharePoint databases and manually assign permissions to each database. 
- Service Application Pool accounts, except for the account running the Claims to Windows Token Service, should have "Deny log on locally" and "Deny log on through Remote Desktop Services" configured in the Local Security Policy under User Rights Assignment. These values are set via secpol.msc.
- Use separate accounts for content access (Search crawler), Portal Super Reader, Portal Super User, and User Profile Service Application synchronization, if applicable.
- The Claims to Windows Token Service account is a highly privileged account on the farm. Before deploying this service, verify whether it's required. If it is, use a separate account for this service.

### Service accounts recommendations overview

| Service account name | What is it used for? | How many should be used? |
|:-----|:-----|:-----|
| SharePoint Farm Administrator account | Personally identifiable account for a SharePoint admin | 1-n |
| SharePoint Farm Service account | Timer Service, Insights, IIS app pools for Central Administration, SharePoint Web Services System, and the Security Token Service | 1 |
| Default content access account | Search crawling of internal and external sources | 1 |
| Content access accounts | Search crawling of internal and external sources | 1-n |
| Web Application Pool account | All web applications except Central Administration | 1 |
| SharePoint Service Application Pool account | All service applications | 1 |
| Portal Super Reader | Object caching | 1 |
| Portal Super User | Object caching | 1 |
| User Profile Service Application Synchronization | Used for Active Directory Import | 1-n |

## SharePoint administrative accounts

One of the following SharePoint components automatically configures most of the SharePoint administrative account permissions during the setup process:

- The SharePoint Products Configuration Wizard (Psconfig)
- The Farm Configuration Wizard
- The SharePoint Central Administration website
- Microsoft PowerShell

### SharePoint Farm Administrator account

This account is used to set up each server in your farm by running the SharePoint Products Configuration Wizard (Psconfig), the initial Farm Configuration Wizard, and PowerShell.
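The service accounts recommended above are typically registered as SharePoint managed accounts before they're assigned to application pools. The following is a minimal sketch for the SharePoint Management Shell; the domain account names are hypothetical assumptions, not values from this article:

```powershell
# Register hypothetical service accounts as SharePoint managed accounts.
# Run from the SharePoint Management Shell on a farm server.
New-SPManagedAccount -Credential (Get-Credential "CONTOSO\sp-serviceapps")
New-SPManagedAccount -Credential (Get-Credential "CONTOSO\sp-webapps")

# Verify that the accounts are registered.
Get-SPManagedAccount
```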
For the examples in this article, the SharePoint Farm Administrator account is used for farm administration, and you can use Central Administration to manage it. Some configuration options, for example, configuration of the SharePoint Server Search query server, require local administration permissions.

The SharePoint Farm Administrator account has the following requirements:

- It must have domain user account permissions.
- It must be a member of the local Administrators group on each server in the SharePoint farm.
- This account must have access to the SharePoint databases.
- If you use any PowerShell operations that affect a database, the SharePoint Farm Administrator account must be a member of the db_owner role.
- This account must be assigned to the securityadmin and dbcreator SQL Server security roles during setup and configuration.

> [!NOTE]
> The securityadmin and dbcreator SQL Server security roles might be required for this account during a complete version-to-version upgrade because new databases might have to be created and secured for services.

After you run the configuration wizards, machine-level permissions for the SharePoint Farm Administrator account include:

- Membership in the WSS_ADMIN_WPG Windows security group.

After you run the configuration wizards, database permissions include:

- db_owner on the SharePoint server farm configuration database.
- db_owner on the SharePoint Central Administration content database.

> [!CAUTION]
> If the account that you use to run the configuration wizards doesn't have the appropriate SQL Server role membership or access as db_owner on the databases, the configuration wizards won't run correctly.

### SharePoint Farm Service account

The SharePoint Farm Service account, which is also referred to as the database access account, is used as the application pool identity for Central Administration and as the process account for the SharePoint Timer Service.
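To confirm which identity runs the SharePoint Timer Service (and therefore acts as the farm service account) on a given server, a built-in Windows cmdlet can be used; this is a sketch that assumes a default service name (SPTimerV4) and requires no SharePoint cmdlets:

```powershell
# Show which account runs the SharePoint Timer Service on this server.
Get-CimInstance Win32_Service -Filter "Name='SPTimerV4'" |
    Select-Object Name, StartName, State
```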
The server farm account has the following requirement:

- It must have domain user account permissions.

Extra permissions are automatically granted to the SharePoint Farm Service account on SharePoint servers that are joined to a server farm.

After you run Setup, machine-level permissions include:

- Membership in the WSS_ADMIN_WPG Windows security group for the SharePoint Timer Service.
- Membership in WSS_RESTRICTED_WPG for the Central Administration and Timer service application pools.
- Membership in WSS_WPG for the Central Administration application pool.

After you run the configuration wizards, SQL Server and database permissions include:

- The dbcreator fixed server role.
- The securityadmin fixed server role.
- db_owner for all SharePoint databases.
- Membership in the WSS_CONTENT_APPLICATION_POOLS role for the SharePoint server farm configuration database.
- Membership in the WSS_CONTENT_APPLICATION_POOLS role for the SharePoint_Admin content database.

## SharePoint Application Pool accounts

This section describes the SharePoint Application Pool accounts that are set up by default during installation.

### Default content access account

The default content access account is used within a specific service application to crawl content, unless a different authentication method is specified by a crawl rule for a URL or URL pattern. This account requires the following permission configuration settings:

- The default content access account must be a domain user account that has read access to the external or secure content sources that you want to crawl by using this account.
- For SharePoint Server sites that aren't part of the server farm, you have to explicitly grant this account full read permission to the web applications that host the sites.
- This account must not be a member of the Farm Administrators group.

### Content access accounts

Content access accounts are configured to access content by using the Search administration crawl rules feature.
This type of account is optional, and you can configure it when you create a new crawl rule. For example, external content (such as a file share) might require this separate content access account. This account requires the following permission configuration settings:

- The content access account must have read access to the external or secure content sources that this account is configured to access.
- For SharePoint Server sites that aren't part of the server farm, you have to explicitly grant this account full read permission to the web applications that host the sites.

### Web Application Pool account

The Web Application Pool account must be a domain user account. This account must not be a member of the Farm Administrators group, and it should be used for all web applications except Central Administration.

The following machine-level permission is configured automatically:

- This account is a member of WSS_WPG.

The following SQL Server and database permissions are configured automatically:

- This account is assigned to the WSS_CONTENT_APPLICATION_POOLS role that is associated with the farm configuration database.
- This account is assigned to the WSS_CONTENT_APPLICATION_POOLS role that is associated with the SharePoint_Admin content database.
- The application pool accounts for web applications are assigned to the SPDataAccess role for the content databases.

### SharePoint Service Application Pool account

The SharePoint Service Application Pool account must be a domain user account. This account must not be a member of the Administrators group on any computer in the server farm.

The following machine-level permission is configured automatically:

- This account is a member of WSS_WPG.

The following SQL Server and database permissions are configured automatically:

- This account is assigned to the SPDataAccess role for the content databases.
- This account is assigned to the SPDataAccess role for a search database that is associated with the web application.
- This account must have read and write access to the associated service application database.
- This account is assigned to the WSS_CONTENT_APPLICATION_POOLS role that is associated with the farm configuration database.
- This account is assigned to the WSS_CONTENT_APPLICATION_POOLS role that is associated with the SharePoint_Admin content database.

## SharePoint database roles

This section describes the database roles that installation sets up by default or that you can configure optionally.

### WSS_CONTENT_APPLICATION_POOLS database role

The WSS_CONTENT_APPLICATION_POOLS database role applies to the application pool account for each web application that is registered in a SharePoint farm. This role enables web applications to query and update the site map and to have read-only access to other items in the configuration database.

Setup assigns the WSS_CONTENT_APPLICATION_POOLS role to the following databases:

- The SharePoint_Config database (the configuration database)
- The SharePoint_AdminContent database

Members of the WSS_CONTENT_APPLICATION_POOLS role have the execute permission for a subset of the stored procedures for the database. In addition, members of this role have the select permission on the Versions table (dbo.Versions) in the SharePoint_AdminContent database.

For other databases, the accounts planning tool indicates that access to read these databases is automatically configured. In some cases, limited access to write to a database is also automatically configured. To provide this access, permissions for stored procedures are configured.

### SharePoint_SHELL_ACCESS database role

The secure SharePoint_SHELL_ACCESS database role on the configuration database replaces the need to add an administration account as db_owner on the configuration database. By default, the setup account is assigned to the SharePoint_SHELL_ACCESS database role. You can use a PowerShell command to grant or remove memberships to this role.
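The PowerShell membership management mentioned above can be sketched as follows; the account and database names are hypothetical examples, not values from this article:

```powershell
# Grant SharePoint_SHELL_ACCESS on a specific content database
# (run from the SharePoint Management Shell as a current shell admin).
Add-SPShellAdmin -UserName "CONTOSO\spadmin" -database (Get-SPContentDatabase -Identity "WSS_Content")

# List current members, then remove the membership again.
Get-SPShellAdmin
Remove-SPShellAdmin -UserName "CONTOSO\spadmin"
```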
Setup assigns the SharePoint_SHELL_ACCESS role to the following databases:

- The SharePoint_Config database (the configuration database).
- One or more of the SharePoint content databases. This assignment is configurable by using the PowerShell command that manages membership and the object that is assigned to this role.

Members of the SharePoint_SHELL_ACCESS role have the execute permission for all stored procedures for the database. In addition, members of this role have the read and write permissions on all the database tables.

### SPREADONLY database role

The SPREADONLY role should be used to set a database to read-only mode instead of using sp_dboption. As its name suggests, this role should be used when only read access is required for usage and telemetry data.

> [!NOTE]
> The sp_dboption stored procedure isn't available in SQL Server 2012. For more information about sp_dboption, see sp_dboption (Transact-SQL).

The SPREADONLY SQL role has the following permissions:

- Grant SELECT on all SharePoint stored procedures and functions.
- Grant SELECT on all SharePoint tables.
- Grant EXECUTE on user-defined types where the schema is dbo.

### SPDataAccess database role

The SPDataAccess role is the default role for database access and should be used for all object model-level access to databases. Add the application pool account to this role during upgrades or new deployments.

> [!NOTE]
> The SPDataAccess role replaced the db_owner role in SharePoint Server 2016.

The SPDataAccess role has the following permissions:

- Grant EXECUTE or SELECT on all SharePoint stored procedures and functions.
- Grant SELECT on all SharePoint tables.
- Grant EXECUTE on user-defined types where the schema is dbo.
- Grant INSERT on the AllUserDataJunctions table.
- Grant UPDATE on the Sites view.
- Grant UPDATE on the UserData view.
- Grant UPDATE on the AllUserData table.
- Grant INSERT and DELETE on the NameValuePair tables.
- Grant create table permission.
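As an illustration of the SPREADONLY guidance above, a database can be placed in read-only mode with ALTER DATABASE rather than the deprecated sp_dboption. This is a sketch, assuming the SqlServer PowerShell module (Invoke-Sqlcmd) is installed; the server and database names are hypothetical:

```powershell
# Set a content database to read-only mode (ALTER DATABASE replaces sp_dboption).
Invoke-Sqlcmd -ServerInstance "SQL01" -Query "ALTER DATABASE [WSS_Content] SET READ_ONLY WITH ROLLBACK IMMEDIATE;"

# Revert to read-write when finished.
Invoke-Sqlcmd -ServerInstance "SQL01" -Query "ALTER DATABASE [WSS_Content] SET READ_WRITE;"
```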
## Group permissions

This section describes the permissions of the groups that the SharePoint Servers 2016 and 2019 setup and configuration tools create.

### WSS_ADMIN_WPG

WSS_ADMIN_WPG has read and write access to local resources. The application pool accounts for the Central Administration and Timer services are in WSS_ADMIN_WPG.

> [!NOTE]
> SharePoint 2013 uses the registry path "15.0" instead of "16.0" and the file system path "15" instead of "16". Some paths listed in the subsequent tables don't apply to SharePoint Foundation 2013.

The following table shows the WSS_ADMIN_WPG registry entry permissions:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\VSS | Full control | Not applicable | Not applicable |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office\16.0\Registration\{90150000-110D-0000-1000-0000000FF1CE} | Read, write | Not applicable | Not applicable |
| HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server | Read | No | This key is the root of the SharePoint Server registry settings tree. If this key is altered, SharePoint Server functionality will fail. |
| HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\16.0 | Full control | No | This key is the root of the SharePoint Server 2016 registry settings. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LoadBalancerSettings | Read, write | No | This key contains settings for the document conversion service. Altering this key will break document conversion functionality. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LauncherSettings | Read, write | No | This key contains settings for the document conversion service. Altering this key will break document conversion functionality. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\Search | Full control | Not applicable | Not applicable |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Search | Full control | Not applicable | Not applicable |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure | Full control | No | This key contains the connection string and the ID of the configuration database to which the machine is joined. If this key is altered, the SharePoint Server installation on the machine won't function. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\WSS | Full control | Yes | This key contains settings used during setup. If this key is altered, diagnostic logging might fail, and setup or post-setup configuration might fail. |

The following table shows the WSS_ADMIN_WPG file system permissions:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %AllUsersProfile%\Microsoft\SharePoint | Full control | No | This directory contains the file-system-backed cache of the farm configuration. If this directory is altered or deleted, processes might fail to start, and administrative actions might fail. |
| C:\Inetpub\wwwroot\wss | Full control | No | This directory (or the corresponding directory under the Inetpub root on the server) is used as the default location for IIS websites. If this directory is altered or deleted, SharePoint sites will be unavailable, and administrative actions might fail, unless custom IIS website paths are provided for all IIS websites extended with SharePoint Server. |
| %ProgramFiles%\Microsoft Office Servers\16.0 | Full control | No | This directory is the installation location for SharePoint Server 2016 binaries and data. The directory can be changed during installation. If this directory is removed, altered, or moved after installation, all SharePoint Server functionality will fail. Membership in the WSS_ADMIN_WPG Windows security group is required for some SharePoint Server services to be able to store data on the disk. |
| %ProgramFiles%\Microsoft Office Servers\16.0\WebServices | Read, write | No | This directory is the root directory where back-end web services are hosted, for example, Excel and Search. If this directory is removed or altered, the SharePoint Server features that depend on these services will fail. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Data | Full control | No | This directory is the root location where local data is stored, including search indexes. If this directory is removed or altered, the Search functionality will fail. WSS_ADMIN_WPG Windows security group permissions are required to enable the Search functionality to save and secure data in this folder. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Logs | Full control | Yes | This directory is the location where the runtime diagnostic logging is generated. If this directory is removed or altered, the logging functionality won't work properly. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Data\Office Server | Full control | Yes | Same as the parent folder. |
| %windir%\System32\drivers\etc\HOSTS | Read, write | Not applicable | Not applicable |
| %windir%\Tasks | Full control | Not applicable | Not applicable |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16 | Modify | Yes | This directory is the installation directory for core SharePoint Server files. If the access control list (ACL) is modified, feature activation, solution deployment, and other features won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\ADMISAPI | Full control | Yes | This directory contains the SOAP services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\CONFIG | Full control | Yes | This directory contains files used to extend IIS websites with SharePoint Server. If this directory or its contents are altered, web application provisioning won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\LOGS | Full control | No | This directory contains setup and runtime tracing logs. If the directory is altered, diagnostic logging won't function correctly. |
| %windir%\temp | Full control | Yes | This directory is used by platform components on which SharePoint Server depends. If the ACL is modified, Web Part rendering and other deserialization operations might fail. |
| %windir%\System32\logfiles\SharePoint | Full control | No | This directory is used by SharePoint Server usage logging. If this directory is modified, usage logging won't function correctly. This entry applies only to SharePoint Server. |
| %systemdrive%\Program Files\Microsoft Office Servers\16 (on Index servers) | Full control | Not applicable | This permission is granted for the %systemdrive%\Program Files\Microsoft Office Servers\16 folder on Index servers. |

### WSS_WPG

WSS_WPG has read access to local resources. All application pool and services accounts are in WSS_WPG.

The following table shows the WSS_WPG registry entry permissions:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\16.0 | Read | No | This key is the root of the SharePoint Server registry settings. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\Diagnostics | Read, write | No | This key contains settings for the SharePoint Server diagnostic logging. If this key is altered, the logging functionality will break. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LoadBalancerSettings | Read, write | No | This key contains settings for the document conversion service. If this key is altered, the document conversion functionality will break. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LauncherSettings | Read, write | No | This key contains settings for the document conversion service. If this key is altered, the document conversion functionality will break. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure | Read | No | This key contains the connection string and the ID of the configuration database to which the machine is joined. If this key is altered, the SharePoint Server 2016 installation on the machine won't function. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\WSS | Read | Yes | This key contains settings that are used during setup. If this key is altered, diagnostic logging might fail, and setup or post-setup configuration might fail. |
| HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurePipeServers\winreg | Read | No | This key contains settings that control remote access to the registry. |

The following table shows the WSS_WPG file system permissions:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %AllUsersProfile%\Microsoft\SharePoint | Read | No | This directory contains the file-system-backed cache of the farm configuration. If this directory is altered or deleted, processes might fail to start, and administrative actions might fail. |
| C:\Inetpub\wwwroot\wss | Read, execute | No | This directory (or the corresponding directory under the Inetpub root on the server) is used as the default location for IIS websites. If this directory is altered or deleted, SharePoint sites will be unavailable, and administrative actions might fail, unless custom IIS website paths are provided for all IIS websites extended with SharePoint Server. |
| %ProgramFiles%\Microsoft Office Servers\16.0 | Read, execute | No | This directory is the installation location for the SharePoint Server binaries and data. It can be changed during installation. If this directory is removed, altered, or moved after installation, all SharePoint Server functionality will fail. WSS_WPG read and execute permissions are required to enable IIS websites to load SharePoint Server binaries. |
| %ProgramFiles%\Microsoft Office Servers\16.0\WebServices | Read | No | This directory is the root directory where back-end web services are hosted, for example, Excel and Search. If this directory is removed or altered, the SharePoint Server features that depend on these services will fail. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Logs | Read, write | Yes | This directory is the location where the runtime diagnostic logging is generated. If this directory is removed or altered, the logging functionality won't work properly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\ADMISAPI | Read | Yes | This directory contains the SOAP services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\CONFIG | Read | Yes | This directory contains files used to extend IIS websites with SharePoint Server. If this directory or its contents are altered, web application provisioning won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\LOGS | Modify | No | This directory contains setup and runtime tracing logs. If the directory is altered, diagnostic logging won't function correctly. |
| %windir%\temp | Read | Yes | This directory is used by platform components on which SharePoint Server depends. If the ACL is modified, Web Part rendering and other deserialization operations might fail. |
| %windir%\System32\logfiles\SharePoint | Read | No | This directory is used by SharePoint Server usage logging. If this directory is modified, usage logging won't function correctly. This entry applies only to SharePoint Server. |
| %systemdrive%\Program Files\Microsoft Office Servers\16 (on Index servers) | Read, execute | Not applicable | This permission is granted for the %systemdrive%\Program Files\Microsoft Office Servers\16 folder on Index servers. |
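The file system permissions in the preceding tables can be audited with built-in Windows PowerShell cmdlets. This is a minimal sketch; the path assumes a default installation location:

```powershell
# List the WSS_* group entries on the CONFIG directory described above.
$path = Join-Path $env:CommonProgramFiles "Microsoft Shared\Web Server Extensions\16\CONFIG"
(Get-Acl -Path $path).Access |
    Where-Object { $_.IdentityReference -like "*WSS_*" } |
    Select-Object IdentityReference, FileSystemRights, IsInherited
```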
### Local service

The following table shows the local service registry entry permission:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LoadBalancerSettings | Read | No | This key contains settings for the document conversion service. If this key is altered, the document conversion functionality will break. |

The following table shows the local service file system permission:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %ProgramFiles%\Microsoft Office Servers\16.0\Bin | Read, execute | No | This directory is the installed location of the SharePoint Server binaries. If this directory is removed or altered, all SharePoint Server functionality will fail. |

### Local system

The following table shows the local system registry entry permissions:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\LauncherSettings | Read | No | This key contains settings for the document conversion service. If this key is altered, the document conversion functionality will break. This registry key applies only to SharePoint Server. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure | Full control | No | This key contains the connection string and the ID of the configuration database to which the machine is joined. If this key is altered, the SharePoint Server installation on the machine won't function. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure\FarmAdmin | Full control | No | This key contains the encryption key that is used to store secrets in the configuration database. If this key is altered, service provisioning and other features will fail. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\WSS | Full control | Yes | This key contains settings that are used during setup. If this key is altered, diagnostic logging might fail, and setup or post-setup configuration might fail. |
The following table shows the local system file system permissions:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %AllUsersProfile%\Microsoft\SharePoint | Full control | No | This directory contains the file-system-backed cache of the farm configuration. If this directory is altered or deleted, processes might fail to start, and administrative actions might fail. |
| C:\Inetpub\wwwroot\wss | Full control | No | This directory (or the corresponding directory under the Inetpub root on the server) is used as the default location for IIS websites. If this directory is altered or deleted, SharePoint sites will be unavailable, and administrative actions might fail, unless custom IIS website paths are provided for all IIS websites extended with SharePoint Server. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\ADMISAPI | Full control | Yes | This directory contains the SOAP services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\CONFIG | Full control | Yes | This directory contains configuration files used to provision web applications and service applications. If this directory or its contents are altered, web application provisioning won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\LOGS | Full control | No | This directory contains setup and runtime tracing logs. If this directory is altered, diagnostic logging won't function correctly. |
| %windir%\temp | Full control | Yes | This directory is used by platform components on which SharePoint Server depends. If the ACL is modified, Web Part rendering and other deserialization operations might fail. |
| %windir%\System32\logfiles\SharePoint | Full control | No | This directory is used by SharePoint Server for usage logging. If this directory is modified, usage logging won't function correctly. This entry applies only to SharePoint Server. |
### Network service

The following table shows the network service registry entry permission:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\Software\Microsoft\Office Server\16.0\Search\Setup | Read | Not applicable | Not applicable |

### Administrators

The following table shows the administrators registry entry permissions:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure | Full control | No | This key contains the connection string and the ID of the configuration database to which the machine is joined. If this key is altered, the SharePoint Server installation on the machine won't function. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure\FarmAdmin | Full control | No | This key contains the encryption key that is used to store secrets in the configuration database. If this key is altered, service provisioning and other features will fail. |
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\WSS | Full control | Yes | This key contains settings that are used during setup. If this key is altered, diagnostic logging might fail, and setup or post-setup configuration might fail. |

The following table shows the administrators file system permissions:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %AllUsersProfile%\Microsoft\SharePoint | Full control | No | This directory contains the file-system-backed cache of the farm configuration. If this directory is altered or deleted, processes might fail to start, and administrative actions might fail. |
| C:\Inetpub\wwwroot\wss | Full control | No | This directory (or the corresponding directory under the Inetpub root on the server) is used as the default location for IIS websites. If this directory is altered or deleted, SharePoint sites will be unavailable, and administrative actions might fail, unless custom IIS website paths are provided for all IIS websites that are extended with SharePoint Server. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\ADMISAPI | Full control | Yes | This directory contains the SOAP services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\CONFIG | Full control | Yes | This directory contains configuration files used to provision web applications and service applications. If this directory or its contents are altered, web application provisioning won't function correctly. |
| %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\LOGS | Full control | No | This directory contains setup and runtime tracing logs. If the directory is altered, diagnostic logging won't function correctly. |
| %windir%\temp | Full control | Yes | This directory is used by platform components on which SharePoint Server depends. If the ACL is modified, Web Part rendering and other deserialization operations might fail. |
| %windir%\System32\logfiles\SharePoint | Full control | No | This directory is used by SharePoint Server for usage logging. If this directory is modified, usage logging won't function correctly. This entry applies only to SharePoint Server. |

### WSS_RESTRICTED_WPG

WSS_RESTRICTED_WPG can read the encrypted farm administration credential registry entry. WSS_RESTRICTED_WPG is used only for encryption and decryption of passwords that are stored in the configuration database.
The following table shows the WSS_RESTRICTED_WPG registry entry permission:

| Key name | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\Web Server Extensions\16.0\Secure\FarmAdmin | Full control | No | This key contains the encryption key that is used to store secrets in the configuration database. If this key is altered, service provisioning and other features will fail. |

### Users group

The following table shows the users group file system permissions:

| File system path | Permissions | Inherit | Description |
|:-----|:-----|:-----|:-----|
| %ProgramFiles%\Microsoft Office Servers\16.0 | Read, execute | No | This directory is the installation location for SharePoint Server binaries and data. It can be changed during installation. If this directory is removed, altered, or moved after installation, all SharePoint Server functionality will fail. |
| %ProgramFiles%\Microsoft Office Servers\16.0\WebServices\Root | Read, execute | No | This directory is the root directory where back-end root web services are hosted. The only service initially installed in this directory is a search global administration service. If this directory is removed or altered, some search administration functionality that uses the server-specific Central Administration Settings page won't work. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Logs | Read, write | Yes | This directory is the location where the runtime diagnostic logging is generated. If this directory is removed or altered, logging won't work properly. |
| %ProgramFiles%\Microsoft Office Servers\16.0\Bin | Read, execute | No | This directory is the installed location of SharePoint Server binaries. If this directory is removed or altered, all SharePoint Server functionality will fail. |
All SharePoint Server service accounts

The following table shows the SharePoint Server service accounts' file system permission:

|File system path|Permissions|Inherit|Description|
|:-----|:-----|:-----|:-----|
|%COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\16\LOGS|Modify|No|This directory contains setup and runtime tracing logs. If this directory is altered, diagnostic logging won't function correctly. All SharePoint Server service accounts must have write permission to this directory.|

See also

Concepts

Install SharePoint Server
OfficeDocs-SharePoint/SharePoint/SharePointServer/install/account-permissions-and-security-settings-in-sharepoint-server-2016.md
title: "Hardware and topology requirements for SharePoint Server Subscription Edition" ms.reviewer: ms.author: serdars author: nimishasatapathy manager: serdars ms.date: 6222021 audience: ITPro f1.keywords: - NOCSH ms.topic: interactive-tutorial ms.service: sharepoint-server-itpro ms.localizationpriority: high ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top - SP2019 ms.custom: ms.assetid: 4d88c402-24f2-449b-86a6-6e7afcfec0cd description: "Find out the minimum hardware requirements that you need for installing and running SharePoint Server Subscription Edition."

Hardware and topology requirements for SharePoint Server Subscription Edition

[!INCLUDE appliesto-xxx-xxx-xxx-SUB-xxx-md]

[!IMPORTANT] If you contact Microsoft Customer Support Services about a production system that does not meet the minimum hardware specifications described in this document, support will be limited until the system is upgraded to the minimum requirements.

Hardware requirements for SharePoint servers

The values in the following table are minimum values for installations on servers that are running SharePoint Server in a multiple server farm installation. Ensure the following before you proceed with deployment of a SharePoint environment.

For all installation scenarios:

- You have sufficient hard disk space for the base installation.
- You have sufficient hard disk space for diagnostics such as logging, debugging, creating memory dumps, and so on.

For production environments:

- You have additional free disk space for day-to-day operations.
- Maintain twice as much free space as you have RAM.

|Installation scenario|Deployment type and scale|Processor|RAM|Hard disk|
|:-----|:-----|:-----|:-----|:-----|
|Single server role that uses SQL Server|Development or evaluation installation with the minimum recommended services for development environments.|64-bit, 4 cores|16 GB|80 GB for system drive<br/>100 GB for second drive|
|Single server role that uses SQL Server|Pilot or user acceptance test installation running all available services.|64-bit, 4 cores|24 GB|80 GB for system drive<br/>100 GB for second drive and additional drives|
|SharePoint server in a multi-tier farm|Development or evaluation installation with a minimum number of services.|64-bit, 4 cores|12 GB|80 GB for system drive<br/>80 GB for second drive|
|SharePoint server in a multi-tier farm|Pilot or user acceptance test installation running all available services.|64-bit, 4 cores|16 GB|80 GB for system drive<br/>80 GB for second drive and additional drives|

[!NOTE] Hard disk space and the number of drives depend on the amount of content and the way you choose to distribute data for a SharePoint environment.

Hardware requirements: Location of physical servers

Some enterprises have datacenters that are in close proximity to one another and are connected by high-bandwidth fiber optic links. In this environment, you can configure the two datacenters as a single farm. This distributed farm topology is called a stretched farm. Stretched farms for SharePoint Server Subscription Edition are supported.

For a stretched farm architecture to work as a supported high-availability solution, the following prerequisites must be met:

- There is a highly consistent intra-farm latency of <1 ms one way, 99.9% of the time over a period of 10 minutes. Intra-farm latency is commonly defined as the latency between the front-end web servers and the database servers.
- The bandwidth speed must be at least 1 gigabit per second.
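The latency prerequisite above is a percentile requirement, so a single ping is not enough to demonstrate compliance. As a hedged illustration (collecting the samples is out of scope here), the check below takes a series of one-way latency samples in milliseconds gathered over the 10-minute window and verifies that at least 99.9% of them fall under 1 ms:

```python
def meets_latency_slo(samples_ms, threshold_ms=1.0, quantile=0.999):
    """Return True if at least `quantile` of the one-way latency samples
    fall below `threshold_ms` (the documented <1 ms, 99.9% requirement)."""
    if not samples_ms:
        return False  # no data collected: cannot claim compliance
    within = sum(1 for s in samples_ms if s < threshold_ms)
    return within / len(samples_ms) >= quantile
```

Note that with, say, 1,000 samples, even two samples at or above 1 ms already push the farm below the 99.9% bar.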
To provide fault tolerance in a stretched farm, use the standard best practice guidance to configure redundant service applications and databases.

[!NOTE] The intra-farm latency of <1 ms one way, 99.9% of the time over a period of ten minutes, is also required for SharePoint environments with servers that are located in the same datacenter. In this case, the bandwidth speed must also be at least 1 gigabit per second.

Deployment requirements for farm topology

SharePoint Server supports the same farm topologies as SharePoint Server 2019. For more information, see Planning for a MinRole server deployment in SharePoint Server 2019.
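To illustrate how the minimum values in the hardware table above might be applied in practice, this sketch records one row of the table as data and flags the specs on which a candidate server falls short. The dictionary keys are hypothetical names chosen for this example, not SharePoint settings:

```python
# Minimums from the table above: SharePoint server in a multi-tier farm,
# pilot or user acceptance test installation running all available services.
MINIMUMS = {"cores": 4, "ram_gb": 16, "system_disk_gb": 80, "second_disk_gb": 80}

def hardware_shortfalls(server):
    """Return the names of specs on which `server` is below the documented minimum."""
    return [name for name, required in MINIMUMS.items()
            if server.get(name, 0) < required]
```

An empty result means the server meets every documented minimum for that scenario; anything else lists exactly what needs to be upgraded before deployment.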
OfficeDocs-SharePoint/SharePoint/SharePointServer/install/hardware-and-topology-requirements-for-sharepoint-server-subscription-editon.md
title: "System requirements for SharePoint 2013" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 7182017 audience: ITPro f1.keywords: - NOCSH ms.topic: conceptual ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 1609167e-e47a-41d6-b8fc-7a8d0ba8f7ce description: "Introduces articles that describe hardware, software, and other requirements for SharePoint."

System requirements for SharePoint 2013

[!INCLUDEappliesto-2013-xxx-xxx-xxx-xxx-md]

Before you install SharePoint 2013, you must make sure that you have installed all required hardware and software. To effectively plan your deployment, you must understand the level of support that is provided for the web browsers that you will be using in your environment and how support for IP versions 4 and 6 is implemented in SharePoint 2013. You must also understand the URL and path length restrictions in SharePoint 2013. The following articles help you prepare for the installation of SharePoint 2013 by providing information about the prerequisites that you must have in order to run SharePoint 2013.

|Content|Description|
|:-----|:-----|
|Hardware and software requirements for SharePoint 2013|Describes the hardware and software requirements that you must meet to successfully install SharePoint 2013.|
|Plan browser support in SharePoint 2013|Describes levels of support for web browsers to use with SharePoint 2013.|
|IP support in SharePoint 2013|Describes SharePoint 2013 support for IP version 4 (IPv4) and IP version 6 (IPv6).|
|Software boundaries and limits for SharePoint 2013|Provides a starting point for planning the performance and capacity of your SharePoint 2013 farm, and includes performance and capacity testing results and guidelines for acceptable performance.|
|Capacity management and sizing overview for SharePoint Server 2013|Walks you through the concepts involved in capacity management and sizing SharePoint 2013 farms, and provides an overview of the planning process.|
OfficeDocs-SharePoint/SharePoint/SharePointServer/install/system-requirements-for-sharepoint-2013.md
ms.date: 03132018 title: "Updated Product Servicing Policy for SharePoint Server Subscription Edition" ms.reviewer: ms.author: v-smandalika author: v-smandalika manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: reference ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top - SP SE description: "This article explains the updated product servicing policy of SharePoint Server Subscription Edition." Updated Product Servicing Policy for SharePoint Server Subscription Edition [!INCLUDEappliesto-xxx-xxx-xxx-SUB-xxx-md] While in support, Microsoft releases new Public Update (PU) builds for SharePoint Server Subscription Edition each month which contain the latest functionality, performance, and stability improvements for the product. Policy overview To ensure that customers have a high-quality experience, Microsoft is adopting the following product servicing policy for SharePoint Server Subscription Edition: The RTM build and all PUs for SharePoint Server Subscription Edition released before January 10, 2023 will be supported until December 12, 2023. Starting with the January 10, 2023 PU for SharePoint Server Subscription Edition, each PU build will be supported for one year from its release date. Support for a PU build ends on the second Tuesday of the same month of its release in the following year. To remain supported, you must be running a supported build of SharePoint Server Subscription Edition. If you contact the Microsoft Support team and your SharePoint Server farm isn't running the minimum supported build or higher of SharePoint Server Subscription Edition, you might be required to upgrade to a supported build before the Microsoft Support team can offer any further assistance. 
The product servicing policy timeline for SharePoint Server Subscription Edition is described in the following table:

|SharePoint Server Subscription Edition Build|Release Date|Support End Date|
|:-----|:-----|:-----|
|RTM (16.0.14326.20450)|11/2/2021|12/12/2023|
|December 2021 PU - December 2022 PU|12/14/2021 - 12/13/2022|12/12/2023|
|January 2023 PU|1/10/2023|1/9/2024|
|February 2023 PU|2/14/2023|2/13/2024|
|March 2023 PU|3/14/2023|3/12/2024|
|April 2023 PU|4/11/2023|4/9/2024|
|Future PUs|Release Date|Release Date + 1 Year (second Tuesday of the month)|
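The support end dates in this table follow the rule stated above: a PU build is supported until the second Tuesday of its release month in the following year. That rule is easy to compute; the sketch below is illustrative, not an official tool:

```python
import calendar
from datetime import date

def second_tuesday(year, month):
    """Date of the second Tuesday of the given month."""
    first_weekday = date(year, month, 1).weekday()  # Monday == 0
    days_until_first_tuesday = (calendar.TUESDAY - first_weekday) % 7
    return date(year, month, 1 + days_until_first_tuesday + 7)

def support_end_date(release):
    """Second Tuesday of the release month, one year after the release date."""
    return second_tuesday(release.year + 1, release.month)
```

For example, the January 10, 2023 PU maps to January 9, 2024, which matches the table row above.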
OfficeDocs-SharePoint/SharePoint/SharePointServer/product-servicing-policy/updated-product-servicing-policy-for-sharepoint-server-se.md
title: "Configure properties of the Search Results Web Part in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 392018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: IT_Sharepoint_Server_Top ms.assetid: d0e68cf4-80f6-4f1f-af49-a3b9c43408ac description: "Learn how to configure the query and properties of the Search Results Web Part, and how to disable stemming in the Web Part."

Configure properties of the Search Results Web Part in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

The Search Results Web Part displays the search results of a query entered in a Search Box Web Part. By default, the Search Results Web Part is used on all search vertical pages (results.aspx, peopleresults.aspx, conversationresults.aspx, videoresults.aspx). The Search Results Web Part displays the actual search results, and it also passes the search results to the Refinement Web Part and the Search Navigation Web Part on the same page.

The Search Results Web Part uses a query that is specified in the Web Part to display search results. By default, the query defined in this Web Part uses the query variable {searchboxquery}. The query variable is a placeholder for a value. When a query is run, the placeholder is replaced with a value. For example, when a user types the search phrase yellow in the Search Box Web Part, the {searchboxquery} variable in the Search Results Web Part will resolve to search all items that contain the phrase yellow.

By changing the properties and query in the Search Results Web Part, you can do the following:

- Limit search results to a result source.
- Add query variables or property filters that customize search results for different users.
- Promote or demote items or pages within the search results.
- Change the sorting of the search results.
- Change the display template.
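Conceptually, a query variable such as {searchboxquery} behaves like a template placeholder: at query time, each {Name} token is replaced with its current value. The following sketch mimics that substitution (it is an analogy, not SharePoint's actual query processor); unresolved variables are left in place:

```python
import re

def resolve_query_template(template, variables):
    """Replace {Variable} placeholders in a query template.

    Variables with no value in the mapping are left intact, which mirrors
    the idea that a placeholder only resolves when a value is available.
    """
    return re.sub(r"\{([\w.]+)\}",
                  lambda m: variables.get(m.group(1), m.group(0)),
                  template)
```

So `resolve_query_template("{searchboxquery}", {"searchboxquery": "yellow"})` yields the query text `yellow`, just as typing yellow in the Search Box Web Part does.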
Before you begin

[!NOTE] Because SharePoint Server runs as websites in Internet Information Services (IIS), administrators and users depend on the accessibility features that browsers provide. SharePoint Server supports the accessibility features of supported browsers. For more information, see the following resources:

- Plan browser support
- Accessibility for SharePoint 2013
- Accessibility features in SharePoint 2013 Products
- Keyboard shortcuts
- Touch

Configure properties of the Search Results Web Part

To configure the properties of a Search Results Web Part

1. Verify that the user account that performs this procedure is a member of the Designers SharePoint group on the Enterprise Search Center site.

2. On the search results page, click the Settings menu, and then click Edit Page.

3. In the Search Results Web Part, click the Search Results Web Part Menu arrow, and then click Edit Web Part.

4. In the Web Part tool pane, in the Search Criteria section, click Change query.

5. On the BASICS tab, do one of the following. To define your query by using Keyword Query Language (KQL), select options as described in the following list:

   - Select a query: Select a result source to specify which content should be searched. By default, the following result sources are set for the different search vertical pages: Everything (results.aspx): Local SharePoint Results (System); People (peopleresults.aspx): Local People Results (System); Conversations (conversationresults.aspx): Conversations (System); Videos (videoresults.aspx): Local Video Results (System).

   - Keyword filter: You can use keyword filters to add query variables to your query. For a list of available query variables, see Query variables in SharePoint Server. You can select pre-defined query variables from the drop-down list, and then add them to the query by clicking Add keyword filter.

   - Property filter: You can use property filters to query the content of managed properties that are set to queryable in the search schema.
You can select managed properties from the Property filter drop-down list. Click Add property filter to add the filter to the query. Query text By default, the query variable {searchboxquery} is defined for this field. You can change the query text by using KQL. For more information about KQL, see Keyword Query Language (KQL) syntax reference. Alternatively you can use the Keyword filter and Property filter lists to build the query. The keyword query can consist of free-text keywords, property filters, or operators. Use braces to enclose query variables. The query variables will be replaced with an actual value when the query is run. Keyword queries have a maximum length of 2,048 characters. To define your query by using pre-defined variables, click Switch to Quick Mode. Select options as described in the following list: Select a query Select a result source to specify which content should be searched. If you have shared a document library or list as catalog, the catalog result source will be displayed in this drop-down list. Restrict by app Select an option from the list to restrict results to a specific site, library, list, or URL. Restrict by tag You can limit results to content that is tagged with a term from a term set. Select one of the following options: Don't restrict by any tag Search results will not be limited based on tags (default). Restrict by navigation term of current page Search results will be limited to content that is tagged with the term of the current page. The current tag is displayed as the last part of the friendly URL. This option is only meaningful for sites that use managed navigation. Restrict by current and child navigation Search results will be limited to content that is tagged with the term of the current page (displayed as the last part of the friendly URL), and content that is tagged with sub-terms of the current page. This option is only meaningful for sites that use managed navigation. 
Restrict on this tag: Search results will be limited to content that is tagged with the tag that you type inside the box.

Note: When you switch from Quick Mode to Advanced Mode, the result source that you selected from Select a query is replaced by a different result source. This result source could affect the search results. Therefore, make sure that you check the search results that are displayed in the SEARCH RESULT PREVIEW section, and add query configuration in the Query text field if you need to.

The REFINERS tab lists the managed properties that are enabled as refiners in the search schema. You can specify that the search results returned in the Search Results Web Part should be limited to one or more values from the refiners. Select a refiner in the list, and then click Add to add it to the query. Click Show more if you want to define grouping of results. Under Group results, you can specify that the results should be grouped based on one or more managed properties. This is useful when you are displaying several variants for a given item and want to group them under a single result.

On the SORTING tab, you can specify how search results should be sorted. This tab is available only if you use Advanced Mode. If you use Quick Mode, you can define sorting options in the result source. In the Sort by drop-down list, select a managed property from the list of managed properties that are set as sortable in the search schema, and then select Descending or Ascending. To add more sorting levels, click Add sort level.

[!NOTE] Sorting of search results is case sensitive.

[!IMPORTANT] If your result source contains sorting, you should not specify sorting in the Search Results Web Part, because the sorting in the result source overrides the sorting that you specify in the Search Results Web Part.

Select Rank to sort by relevance rank. You can then specify which ranking model to use or specify dynamic ordering rules.
(Optional) Select which ranking model to use for sorting in the Ranking Model list. Under Dynamic ordering, you can specify additional ranking by adding rules that will change the order of results when certain conditions apply. Click Add dynamic ordering rule, and then specify conditional rules.

On the SETTINGS tab, specify the settings that are listed in the following list.

Query Rules: Select whether to use Query Rules or not.

URL Rewriting: Select whether the URL rewrite to the item details page should continue to be relative for each catalog item, as defined when you set up the catalog connection. This option is only meaningful for sites that use managed navigation and have connected to a catalog that uses anonymous access for the catalog pages. If you select Don't rewrite URLs, the URLs for catalog items point directly to the library item of the connected catalog.

Loading Behavior: Select when the search results returned by the Search Results Web Part appear on the web page. The default option is Async option: Issue query from the browser. Queries will be issued from the end-user's browser after the complete page is received (asynchronous). If you select the synchronous option, Sync option: Issue query from the server, queries are issued from the server, and the search results are included in the page response that is sent back from SharePoint (synchronous). Synchronous loading makes search vulnerable to cross-site request forgery attacks, so you should choose this option only after carefully considering whether this vulnerability can be exploited.

On the TEST tab, you can preview the query that is sent by the Search Results Web Part.

Query text: Shows the final query that will be run by the Search Results Web Part. It is based on the original query template where dynamic variables are substituted with current values. Other changes to the query may be made as part of a query rule. Click Show more to display additional information.
Query template: Shows the content of the query template that is applied to the query.

Refined by: Shows the refiners applied to the query as defined on the REFINERS tab.

Grouped by: Shows the managed property on which search results should be grouped as defined on the REFINERS tab.

Applied query rules: Shows which query rules are applied to the query.

The Query template variables section shows the query variables that will be applied to the query, and the values of the variables that apply to the current page. You can type other values to test the effect they will have on the query. Click the Test Query button to preview the search results. You can also test how the query works for different user segment terms. Click Add user segment term to add terms to be added to the query, and then click the Test query button to preview the search results.

In the Web Part tool pane, in the Display Templates section, the default selection is Use result types to display items. This selection applies different display templates according to the result type of the search result. For example, if the result type of a search result is a PDF file, the display template PDF Item is applied. If the result type of a search result is an image, the Picture Item display template is applied. To apply one display template to all result types of the search results, select Use a single template to display items, and then select the display template that you want to apply.

In the Web Part tool pane, in the Settings section, in the Results Settings, to further specify how search results should be shown, change the values in the following fields:

Number of results per page: The number of search results to be displayed per page.
Show ranked results: Clear the check box if you want to show only promoted blocks (such as promoted results or personal favorites) or result controls (such as result counts) instead of the ranked results.

Show promoted results: Clear the check box if you do not want to show search results that you have promoted by using Query rules.

Show "Did you mean?": Clear the check box if you do not want to show query spelling corrections as Did you mean suggestions. For more information about query spelling corrections, see Manage query spelling correction in SharePoint Server.

Show personal favorites: Clear the check box if you do not want to show personal favorites.

Show View Duplicates link: Select the check box if you want to show a View Duplicates link.

Show link to search center: Select the check box if you want to show a link to the Search Center.

In the Web Part tool pane, in the Settings section, in the Results control settings section, to specify how search results should be shown, change the values in the following fields:

Show advanced link: Clear the check box if you don't want to show a link to the Advanced Search page in the Web Part.

Show result count: Clear the check box if you don't want to show the number of results found in the Web Part.

Show language dropdown: Clear the check box if you don't want to show the language drop-down in the Web Part.

Show sort dropdown: Select the check box if you want to show the sort drop-down in the Web Part.

Show paging: Clear the check box if you don't want to show paging in the Web Part.

Show preferences link: Clear the check box if you don't want to show a link to the preferences page in the Web Part.

Show AlertMe link: Clear the check box if you don't want to show a link to the Alert Me page in the Web Part. For more information about search alerts, see Enable search alerts in SharePoint Server.
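The Group results option described on the REFINERS tab collapses several variants of an item under a single result that shares one managed-property value. Conceptually this is a group-by operation; the sketch below uses hypothetical result dictionaries, not the real search result object model:

```python
from collections import defaultdict

def group_results(results, group_property):
    """Group result items by the value of one managed property."""
    groups = defaultdict(list)
    for item in results:
        groups[item.get(group_property)].append(item)
    return dict(groups)
```

With results grouped this way, a rendering layer can show one entry per group and let the user expand the remaining variants.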
To disable stemming in a Search Results Web Part Stemming means that nouns and adjectives in a query are expanded to different possible inflections. For example, if a person enters the English word "foot" in a query, it is automatically expanded to {"feet"}. Similarly, the word "overview" is expanded to {"overviews"}. To disable stemming in a Search Results Web Part Verify that the user account that performs this procedure is a member of the Designers SharePoint group on the Enterprise Search Center site. On the search results page, click the Settings menu, and then click Edit page. In the Search Results Web Part, click the Search Results Web Part Menu arrow, click Export..., and then save the Web Part to your computer. Open the Web Part in a text editor — for example, Notepad. Change the value for EnableStemming to false, and then save the file with a new name — for example, Search_Results_NoStemming.webpart. On the search results page, in the Main Zone, click Add a Web Part. In the Categories section, click the Upload a Web Part arrow. In the Upload a Web Part section, click Browse to find the Web Part file that you have edited, and then click Upload. To add the customized Search Results Web Part to the search results page, do the following: Browse to the search results page. Click the Settings menu, and then click Edit Page. In the Web Part Zone where you want to add the Web Part, click Add a Web Part. In the Categories list, select Imported Web Parts. In the Parts list, select the Web Part that you uploaded, and then click Add. To remove the default Search Results Web Part from the search results page, do the following: Browse to the search results page. Click the Settings menu, and then click Edit Page. In the Web Part, click the Search Results Web Part menu arrow, and then click Delete. [!NOTE] Disabling stemming turns it off for all languages except the following: Arabic, Estonian, Finnish, Hebrew, Hungarian, Korean, Latvian, and Slovak. 
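The exported .webpart file edited in the procedure above is XML, so the EnableStemming change can also be scripted instead of made by hand in Notepad. The element layout assumed below (settings stored as <property name="..."> elements) is a simplification of a typical v3 .webpart export, so verify it against your own exported file:

```python
import xml.etree.ElementTree as ET

def disable_stemming(webpart_xml):
    """Return the .webpart XML with every EnableStemming property set to False.

    Matches the property element regardless of any XML namespace prefix.
    """
    root = ET.fromstring(webpart_xml)
    for element in root.iter():
        if element.tag.endswith("property") and element.get("name") == "EnableStemming":
            element.text = "False"
    return ET.tostring(root, encoding="unicode")
```

You would run this against the exported file's contents, save the output under a new name such as Search_Results_NoStemming.webpart, and upload it as described above.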
See also Query variables in SharePoint Server Configure result sources for search in SharePoint Server Plan to transform queries and order results in SharePoint Server Blog series: How to change the way search results are displayed in SharePoint Server 2013
OfficeDocs-SharePoint/SharePoint/SharePointServer/search/configure-properties-of-the-search-results-web-part.md
title: "Delete items from the search index or from search results in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 372018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: IT_Sharepoint_Server_Top ms.assetid: 48d39f84-9698-4249-b7e0-b885c462622e description: "Learn how to remove an item from the search index or SharePoint Server search results by removing the URL." Delete items from the search index or from search results in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] If you want to remove the metadata of an item from the search index or from the search results, you remove the URL of that item. To remove a URL from the search index, use the Remove the Item from the Index option that is available through the crawl log. To remove a URL from search results, use the Search Result Removal feature that allows for bulk URL removal. This can provide a more efficient method if many search results should be removed. [!NOTE] If your SharePoint environment is hybrid and uses cloud hybrid search, you index your on-premises content in your search index in Office 365. See Learn about cloud hybrid search for SharePoint for guidance on deleting the metadata of an on-premises item and deleting on-premises search results from the search index in Office 365. For SharePoint Server 2019, removing the URL of an item affects both the classic and modern search experiences. Remove an item from the search index To remove an item from the search index Verify that the user account that is performing this procedure is an administrator for the Search service application. On the SharePoint Server Central Administration home page, in the Application Management section, click Manage service applications. On the Manage Search Applications page, click the Search service application. 
On the Search Administration page, in the Diagnostics section, click Crawl Log. On the Crawl Log page, click URL View. Do one of the following: If you know the URL of the item that you want to remove, type the URL in the box. If you do not know the URL of the item that you want to remove, search for it by using the filters Content Source, Status or Message. Click Search. Find and point to the URL of the item that you want to remove, click the arrow and then click Remove the item from the Index. In the confirmation dialog that appears, click OK to confirm that you want to remove the item from the index. Verification: the text Removed from the search index by Admin appears under the URL in the crawl log. Remove an item from the search results To remove an item from the search results Verify that the user account that is performing this procedure is an administrator for the Search service application. On the SharePoint Server Central Administration home page, in the Application Management section, click Manage service applications. On the Manage Search Applications page, click the Search service application. On the Search Administration page, in the Queries and Results section, click Search Result Removal. On the Exclude URLs From Search Results page, in the URLs to remove box, type the URLs of the items that you want to remove from the search results. Click Remove Now.
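The URLs to remove box accepts many URLs at once, so it helps to tidy a bulk list before pasting it in. The normalization below (trimming whitespace, dropping a trailing slash, lowercasing the scheme and host, de-duplicating) is my own illustration of list cleanup, not SharePoint's matching logic:

```python
def clean_removal_list(urls):
    """Deduplicate and tidy a list of URLs for bulk search result removal."""
    seen, cleaned = set(), []
    for url in urls:
        url = url.strip().rstrip("/")
        if not url:
            continue  # skip blank lines
        scheme, sep, rest = url.partition("://")
        if sep:
            # Lowercase only scheme and host; the path may be case-sensitive.
            host, slash, path = rest.partition("/")
            url = scheme.lower() + sep + host.lower() + slash + path
        if url not in seen:
            seen.add(url)
            cleaned.append(url)
    return cleaned
```

The cleaned list can then be pasted into the Exclude URLs From Search Results page in one step.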
OfficeDocs-SharePoint/SharePoint/SharePointServer/search/delete-items-from-the-search-index-or-from-search-results.md
title: "How to display values from custom managed properties in search results - option 2 in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 372018 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.assetid: e2e13d19-21ca-44ef-bd49-0f6120137186 description: "Learn a second option for displaying values from custom managed properties in SharePoint Server." How to display values from custom managed properties in classic search results - option 2 in SharePoint Server [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] In How to display values from custom managed properties in search results - option 1 in SharePoint Server we showed a simple method to add a custom icon and values from two custom-managed properties to your classic search results. In this article, we'll look at a fuller method for changing the way classic search results are displayed that includes if statements and hit highlighting. In this article, you'll learn: Strategy for killing three birds with one stone - search results version How to display values from custom managed properties with hit highlighting, and get automatically improved relevancy Strategy for killing three birds with one stone - search results version First, let's state what we want to achieve: Display values from two custom managed properties. Apply hit highlighting to the two custom managed properties. Get automatically improved relevancy for our classic search results. Before we look at details about how to achieve these goals, let's look at the strategy we want to follow. If this gets a bit complex, please try to hang in there. Hopefully it will be clear by the end. 
First, remember how we can think about hit highlighting:

- The managed properties that are listed in the **Hit-highlighted properties (JSON)** section of the Search Results Web Part and the "magical summary" property are passed to the *HitHighlightedProperties* property.
- All values of the *HitHighlightedProperties* property are passed to the *HitHighlightedSummary* property.
- A truncated version of the values in *HitHighlightedSummary* appears in the Search Results Web Part with three dots at the end.

Also remember that each item display template contains a reference to the *Item_CommonItem_Body* display template, and that this template contains an *onclick* method that will result in automatically improved relevance based on the user's click behavior.

So our strategy is this: create variables in the item display template that will be passed on and rendered by the *Item_CommonItem_Body* display template. Specifically, that means that we have to do the following:

1. Add the custom managed properties that we want to display in our classic search results to the **Hit-highlighted properties** in the Search Results Web Part.
2. Add the custom managed properties to an item display template.
3. In the item display template, create a variable that will be used by the *HitHighlightedSummary* property to display our two custom managed properties with hit highlighting.
4. In the item display template, leave the reference `_#= ctx.RenderBody(ctx) =#_` so that the *Item_CommonItem_Body* display template will render the search result. This makes sure that we get automatically improved relevancy.

OK, now let's take it step by step, with examples of how we did this for our Search Center scenario.

## How to display values from custom managed properties with hit highlighting, and get automatically improved relevancy

First, you have to find the managed property names that correspond to the custom site columns that you want to use.
We looked at how to do this in How to display values from custom managed properties in search results - option 1 in SharePoint Server.

Next, you have to do some configuration on the Search Results Web Part. Here are the steps:

1. On the **Search results** page, select the **Settings** menu, and then select **Edit Page**.
2. In the Search Results Web Part, select **Web Part Menu** > **Edit Web Part**.
3. In the Web Part tool pane, select to expand the **Display Templates** section, and then select **Use a single template to display items**. This lets you change the **Hit-highlighted properties (JSON)** field.
4. In the **Hit-highlighted properties (JSON)** field, use the format `"<ManagedPropertyName>"` to add the custom managed properties that you want to add hit highlighting to. In our Search Center scenario, we wanted to apply hit highlighting to the *ContentSummaryOWSMTXT* and the *owstaxIdTechnicalSubject* managed properties.
5. Select **Apply** to save the changes. The **Display Templates** section closes.
6. To reopen the section, select **Display Templates**, and select **Use result types to display items**.
7. Select **OK** and save the page.

Next, you have to add the custom managed properties to an item display template. Here's what you should do:

1. Open the item display template that belongs to the result type for which you want to customize search results. In our Search Center scenario, this was *TechNet content*.
2. In the item display template, in the *ManagedPropertyMapping* tag, use the syntax `'<Current item property name>':'<Managed property name>'` to add the custom managed properties that you want to display. In our Search Center scenario, we wanted the values from the managed properties *ContentSummaryOWSMTXT* and *owstaxIdTechnicalSubject* to appear in the search result. To make the file easier to maintain, we named the current item properties the same as the managed properties.

Next, you have to create variables in the item display template that will be used and rendered by the *Item_CommonItem_Body* display template.
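As a sketch of what the **Hit-highlighted properties (JSON)** field described above contains (the default entries in your web part may differ; the two custom properties are the ones from our scenario), the field is a JSON array of managed property names:

```json
[
  "Title",
  "Path",
  "Author",
  "ContentSummaryOWSMTXT",
  "owstaxIdTechnicalSubject"
]
```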
Here's what you should do:

1. Because you have no guarantee that the values of your custom properties will contain any of the entered query words (that is, hit highlighting won't be used), you have to create variables that guarantee that the value of your custom properties will be displayed regardless of hit highlighting. The following screenshots show how we created two such variables for our custom properties *ContentSummaryOWSMTXT* and *owstaxIdTechnicalSubject*. In addition, we added a similar variable for the *Title* property. If you don't add this, the search results won't be rendered.
2. The last step that you have to do in the item display template is to create a variable that will override the *HitHighlightedSummary* property used to display the values.
3. Save the item display template.

> [!NOTE]
> You don't have to do the next step if you are using SharePoint in Microsoft 365.

4. Go to **Site settings** > **Search Result Types**. A **Property Sync** alert appears. This alert appears because we added managed properties to an item display template. To update the result types with the newly added managed properties, choose **Update**.

> [!IMPORTANT]
> If you don't do this update, the newly added managed properties won't display in your search results.

After we made these changes, when users entered a query in the Search Center, the search result included:

- A custom icon
- The value of *Title* with hit highlighting
- The value of *ContentSummaryOWSMTXT* with hit highlighting
- The value of *owstaxIdTechnicalSubject* (the query words didn't match the property value, but because of the variable that we created earlier, the value appears)
- A link to the item in the list

We wanted to make one little change to how the value for *owstaxIdTechnicalSubject* appears. We wanted to give users a bit more context as to what this value represents. Therefore, we decided to add the text "Technical Subject:" before the value.
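The display template work described above can be sketched as follows. This is only an illustration: the markup is abbreviated, the variable names are our own, and your template's surrounding HTML will differ.

```html
<!-- Abbreviated item display template; only the parts discussed above are shown. -->
<mso:ManagedPropertyMapping msdt:dt="string">
  'Title':'Title','Path':'Path',
  'ContentSummaryOWSMTXT':'ContentSummaryOWSMTXT',
  'owstaxIdTechnicalSubject':'owstaxIdTechnicalSubject'
</mso:ManagedPropertyMapping>

<!--#_
// Read the property values so they display even when hit highlighting
// finds no query words in them (illustrative variable names).
var title   = $getItemValue(ctx, "Title");
var summary = $getItemValue(ctx, "ContentSummaryOWSMTXT");
var subject = $getItemValue(ctx, "owstaxIdTechnicalSubject");

// Override the property that Item_CommonItem_Body renders as the summary.
ctx.CurrentItem.HitHighlightedSummary = summary + " " + subject;
_#-->

<!-- Leave this reference so Item_CommonItem_Body renders the result,
     which also gives us automatically improved relevancy. -->
_#= ctx.RenderBody(ctx) =#_
```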
Also, because this value isn't always present for all list items, we decided it should only display when a value is present. To do this, we made a change to the variable that overrides the *HitHighlightedSummary* property. Note that we gave the text "Technical Subject:" a slightly different color. With this addition, the final search result is displayed as follows:

In How to create a new result type in SharePoint Server, we had decided we wanted six different result types. After creating the *TechNet content* result type and display template, it was easy to copy this work over to the other five result types. And here's the result:

So now that we have changed the way classic search results are displayed, the next step is to change the values that are displayed in the hover panel.

## Next article in this series

How to display values from custom managed properties in the hover panel in SharePoint Server
---
title: "Manage the search schema in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 3/8/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.custom: admindeeplinkSPO
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: 81890ff0-e2f9-4752-8e8e-2e8502c76311
description: "Learn how to view, add, edit, map, and delete crawled properties, crawled property categories and managed properties in the search schema."
---

# Manage the search schema in SharePoint Server

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md]]

The search schema in SharePoint Server determines how content is collected in and retrieved from the search index in SharePoint Server. Crawled properties are metadata that is extracted from content during crawls. Metadata can be structured content (such as the title or the author from a Word document), or unstructured content (such as a detected language or extracted keywords). You decide which crawled metadata to index by mapping the crawled property to a managed property. Users can only search on managed properties. You can map multiple crawled properties to a single managed property or map a single crawled property to multiple managed properties.

> [!NOTE]
> The search schema applies to both the classic and the modern search experiences, except for the following settings, which don't apply to modern search:
>
> - **Refinable**. Modern search has built-in refiners.
> - **Sortable**. Not supported in modern search.
> - **Custom entity extraction**. Modern search has built-in refiners.
> - **Company name extraction**. Not supported in modern search.

## Before you begin

Before you begin this operation, review the following prerequisites:

- Create a Search service application.
- Add one or more content sources and run a full crawl.
## To view crawled properties and managed properties

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Managed Properties** page, you see an overview of all the managed properties, the settings on the managed properties, and the crawled properties they are mapped to.
6. To view crawled properties, click **Crawled Properties**.
7. To view crawled property categories, click **Categories**.

## To add a managed property

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Managed Properties** page, click **New Managed Property**.
6. On the **New Managed Property** page, in the **Property name** box in the **Name and description** section, enter the name of the new managed property. You can also enter a description.
7. In the **Type** section, select one of the following options for the property:
   - Text
   - Integer
   - Decimal
   - Date and Time
   - Yes/No
   - Double precision float
   - Binary
8. In the **Main characteristics** section, select one or several of the following:
   - Searchable
   - Advanced searchable settings (optional, if Searchable is selected)
   - Queryable
   - Retrievable
   - Allow multiple values
   - Refinable
   - Sortable
   - Alias
   - Token normalization
   - Complete matching
   - Language neutral tokenization
   - Finer query tokenization

   > [!IMPORTANT]
   > If you want to be able to use this managed property as a refiner, you must select both **Refinable** and **Queryable**.

9. In the **Mappings to crawled properties** section, click **Add a mapping**.
10. On the **Crawled property selection** page, select a crawled property to map to the managed property, and then click **OK**. Repeat this step to map more crawled properties.
11. On the **New Managed Property** page, in the **Mappings to crawled properties** section, specify whether you want to include:
    - All content from all crawled properties mapped to this managed property
    - Content from the first crawled property that contains a value and, optionally, in which order
12. In the **Company name extraction** section, you can optionally select the check box to enable company name extraction.
13. In the **Custom entity extraction** section, you can optionally select the check box to enable custom entity extraction. See Create and deploy custom entity extractors in SharePoint Server for the procedures.
14. Click **OK**.

You have to perform a full crawl of the content source or sources that contain this new managed property to include it in the search index. If the new managed property is in a SharePoint Server library or list, you have to reindex that library or list. For more information, see Overview of the search schema in SharePoint Server.

## To edit a managed property

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Managed Properties** page, find the managed property that you want to edit, or enter its name in the **Filter** box.
6. Point to the managed property that you want to edit, click the arrow, and then click **Edit/Map property**.
7. On the **Edit Managed Property** page, edit the settings, and then click **OK**.

Some changes to managed property settings require a full crawl to take effect. See the table "Search schema changes that require content to be reindexed" for an overview of which changes require you to reindex the content.

## To delete a managed property

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Managed Properties** page, find the managed property that you want to delete, or enter its name in the **Filter** box.
6. Point to the managed property that you want to delete, click the arrow, and then click **Delete**.
7. Click **OK**.

If you delete a managed property:

- Users can no longer run queries using this property.
- A query rule that uses this property no longer works.
- A custom search application or web part that uses this property no longer works.
- To delete this property from the search index, you'll have to perform a full crawl. If the deleted property was in a SharePoint Server library or list, you'll have to reindex that library or list.

## To map a crawled property to a managed property

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Crawled Properties** page, find the crawled property that you want to map to a managed property, or enter its name in the **Filters** box.
6. Point to the crawled property that you want to map, click the arrow, and then click **Edit/Map property**.
7. On the **Edit Crawled Property** page, in the **Mappings to managed properties** section, click **Add a Mapping**.
8. On the **Managed property selection** page, select one managed property to map to the crawled property, and then click **OK**. Repeat this step to map more managed properties to this crawled property.
9. In the **Include in full-text index** section, check the box if you want to include the content of this crawled property in the full-text index.
10. On the **Edit Crawled Property** page, click **OK**.

You have to perform a full crawl of the content source that includes the crawled property that you've mapped to a managed property for the new mapping to take effect. If the new mapping is for a SharePoint Server library or list, you have to reindex that library or list.

## To view or edit crawled property categories

1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
2. In Central Administration, in the **Application Management** section, click **Manage service applications**.
3. Click the Search service application.
4. On the **Search Administration** page, in the Quick Launch, under **Queries and Results**, click **Search Schema**.
5. On the **Categories** page, find the crawled property category that you want to view or edit.
6. To view which crawled properties belong to a category, and which managed properties they are mapped to, click the crawled property category on the **Categories** page.
7. To edit a category, point to the crawled property category that you want to edit, click the arrow, and then click **Edit category**.

> [!CAUTION]
> If you edit a crawled property category, your changes apply to all of the crawled properties within the category. Changing a crawled property category can influence performance and how items are saved in the search index. You also have to reindex the content.

## Add a managed property using tenant administration or site collection administration

Tenant administrators and site collection administrators can create a search schema that is specific to their tenant or site collection.
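The Central Administration procedures above can also be scripted with the Search PowerShell cmdlets on a SharePoint server. This is only a sketch: the property name `MyProperty` and the crawled property `ows_MyColumn` are placeholders for your environment, and the exact parameters you need may differ.

```
# Sketch: create a managed property and map a crawled property to it,
# the scripted equivalent of the "To add a managed property" steps above.
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create the managed property (Type 1 = Text).
$mp = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa `
        -Name "MyProperty" -Type 1 -Queryable $true -Retrievable $true

# Find the crawled property (placeholder names) and map it.
$cat = Get-SPEnterpriseSearchMetadataCategory -SearchApplication $ssa -Identity "SharePoint"
$cp  = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa `
         -Category $cat -Name "ows_MyColumn"
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa `
  -ManagedProperty $mp -CrawledProperty $cp
```

As with the UI steps, a full crawl (or a list/library reindex) is still needed before the new property appears in the search index.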
For more information about how to manage the search schema for tenants and site collections, see Manage the search schema in SharePoint.

You can create new managed properties for a tenant or a site collection and map crawled properties to them. Alternatively, you can reuse existing, unused managed properties that don't have crawled properties mapped to them, and rename them using an **Alias**. Then, you must map crawled properties to the renamed managed property with the defined alias.

When you create a new managed property in the tenant or site collection administration, there are some limitations. For example, the property can only be of type Text or Yes/No, and it can't be refinable or sortable. If you need a property of a different type, or one that has different characteristics than what is available, follow the steps under "To create a managed property by renaming an existing one."

When you have added a new property to a list or to a library on a SharePoint Server site, or when you have changed properties that are used in a list or library, the content must be recrawled before your changes are reflected in the search index. Because your changes are made in the search schema, and not to the actual site, the crawler will not automatically reindex the list or the library. To make sure that your changes are crawled and reindexed, you can specifically request a reindexing of the list or library. When you do this, the list or library content will be recrawled and reindexed so that you can start using your new managed properties in queries, query rules, and display templates.

See the table "Search schema changes that require content to be reindexed" for an overview of which managed property setting changes require you to reindex the content.

### To create a managed property for a tenant or a site collection

1. Verify that the user account that is performing this procedure is an administrator for the tenant or for the site collection.
2. Go to the **Search Schema** page for the tenant or for a site collection:
   - For the tenant, go to **More features** in the SharePoint admin center, and sign in with an account that has admin permissions in Microsoft 365. Under **Search**, select **Open**, and then select **Manage Search Schema**.
   - For the site collection, on your site, go to **Settings**, click **Site settings**, and then under **Site Collection Administration**, click **Search Schema**.
3. On the **Managed Properties** page, click **New Managed Property**.
4. On the **New Managed Property** page, in the **Property name** box in the **Name and description** section, enter the name of the new managed property. You can also enter a description.
5. In the **Type** section, select one of the following options for the property:
   - Text
   - Yes/No
6. In the **Main characteristics** section, select one or several of the available options.
7. In the **Mappings to crawled properties** section, click **Add a mapping**.
8. On the **Crawled property selection** page, select a crawled property to map to the managed property, and then click **OK**. Repeat this step to map more crawled properties.
9. On the **New Managed Property** page, in the **Mappings to crawled properties** section, specify whether you want to include:
   - All content from all crawled properties mapped to this managed property
   - Content from the first crawled property that contains a value and, optionally, in which order
10. Click **OK**.

### To create a managed property by renaming an existing one

1. Verify that the user account that is performing this procedure is an administrator for the tenant or for the site collection.
2. Go to the **Search Schema** page for the tenant or for a site collection:
   - For the tenant, go to **More features** in the SharePoint admin center, and sign in with an account that has admin permissions in Microsoft 365. Under **Search**, select **Open**, and then select **Manage Search Schema**.
   - For the site collection, on your site, go to **Settings**, click **Site settings**, and then under **Site Collection Administration**, click **Search Schema**.
3. On the **Managed Properties** page, find an unused managed property. By unused, we mean that the property is not mapped to a crawled property: the **Mapped Crawled Properties** column is empty. See the "Default unused managed properties" table for more details.
4. Point to the managed property, click the arrow, and then click **Edit/Map property**.
5. On the **Edit Managed Property** page, in the **Main characteristics** section, under **Alias**, enter a name in the field.
6. In the **Mappings to crawled properties** section, click **Add a mapping**.
7. On the **Crawled property selection** page, select a crawled property to map to the managed property, and then click **OK**. Repeat this step to map more crawled properties to this managed property.
8. Click **OK**.

### To reindex a list or library

1. Verify that the user account that is performing this procedure is an administrator for the tenant or for the site collection.
2. Browse to the list or library that you want to recrawl, and then do one of the following:
   - If you want to perform a full crawl of a library, click the **LIBRARY** tab, and then, on the ribbon, in the **Settings** group, click **Library Settings**.
   - If you want to perform a full crawl of a list, click the **LIST** tab, and then, on the ribbon, in the **Settings** group, click **List Settings**.
3. On the **Settings** page, in the **General Settings** section, click **Advanced settings**.
4. On the **Advanced Settings** page:
   - If you want to reindex a library, in the **Reindex Library** section, click **Reindex Document Library**.
   - If you want to reindex a list, in the **Reindex List** section, click **Reindex List**.
5. Click **OK**. The full reindex of the list or library will be performed during the next scheduled crawl.

## Default unused managed properties

The following table provides an overview of the default unused managed properties that you can reuse and rename using an **Alias**.
|Managed property type|Count|Managed property characteristics|Managed property name range|
|:---|:---|:---|:---|
|Date|10|Queryable|Date00 to Date09|
|Date|20|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableDate00 to RefinableDate19|
|Date (SharePoint Server 2019)|2|Queryable, Refinable, Sortable, Retrievable|RefinableDateInvariant00 to RefinableDateInvariant01|
|Date (SharePoint Server 2019)|5|Queryable, Refinable, Sortable, Retrievable|RefinableDateSingle00 to RefinableDateSingle04|
|Decimal|10|Queryable|Decimal00 to Decimal09|
|Decimal|10|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableDecimal00 to RefinableDecimal09|
|Double|10|Queryable|Double00 to Double09|
|Double|10|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableDouble00 to RefinableDouble09|
|Integer|50|Queryable|Int00 to Int49|
|Integer|50|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableInt00 to RefinableInt49|
|String (SharePoint Server 2013)|100|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableString00 to RefinableString99|
|String (SharePoint Server 2019)|200|Multivalued, Queryable, Refinable, Sortable, Retrievable|RefinableString00 to RefinableString199|

## How to use an Alias: an example

Say that you want to create a managed property that contains employee numbers, and you want users to be able to search for these by typing "EmployeeID:12345", where "12345" is an example employee number. As this managed property is not of the type Text or Yes/No, you'll follow the steps in "To create a managed property by renaming an existing one" with this input:

1. Choose an unused managed property of the type Integer; see the "Default unused managed properties" table. Use any unused property from Int00 to Int49 if you only want users to be able to query on the employee number, or from RefinableInt00 to RefinableInt49 if you want users to be able to query, refine, sort, and so on, on the employee number.
2. Give the property an **Alias**, in this example *EmployeeID*.
3. Map the *EmployeeID* property to the crawled property that contains the employee numbers.

## Search schema changes that require content to be reindexed

|Managed property setting|Action|Requires a full crawl to reindex|
|:---|:---|:---|
|Mapping a crawled to a managed property|Add/Delete mapping|Yes|
|Token normalization|Enable/Disable|Yes|
|Complete matching|Enable/Disable|Yes|
|Language neutral tokenization|Enable/Disable|Yes|
|Company name extraction|Enable/Disable|Yes|
|Custom entity extraction|Enable/Disable|Yes|
|Searchable|Enable/Disable|Yes|
|Queryable|Enable|Yes|
|Queryable|Disable|No|
|Retrievable|Enable|Yes|
|Retrievable|Disable|No|
|Refinable|Enable (if not already Sortable)|Yes|
|Refinable|Disable|No|
|Sortable|Enable (if not already Refinable)|Yes|
|Sortable|Disable|No|
|Alias|Add/Delete|No|
---
title: "Plan search in SharePoint Server"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 3/6/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection: IT_Sharepoint_Server_Top
ms.assetid: e8c0fccd-8364-4352-8778-c9c46a668b70
description: "Create a well-designed plan to install and configure search in SharePoint Server."
---

# Plan search in SharePoint Server

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md]]

The following articles provide information about the planning you need to do to deploy search in SharePoint Server.

## Articles about planning search

The following articles about planning search in SharePoint Server are available to view online. Writers update articles on a continuing basis as new information becomes available and as users provide feedback.

|Content|Description|
|:---|:---|
|Overview of search architecture in SharePoint Server|Learn about the different search components and databases and their role within the search topology.|
|Plan enterprise search architecture in SharePoint Server 2016|Learn how to plan a small, medium, or large enterprise search architecture.|
|Scale enterprise search in SharePoint Server|Learn which approach to use to scale your enterprise search architecture for performance and availability.|
|Scale search for Internet sites in SharePoint Server|Determine hardware requirements and review considerations to scale out search topologies for Internet sites for performance and availability.|
|Best practices for organizing content for search in SharePoint Server|Learn how to organize SharePoint Server content and metadata to make the content easier to find.|
|Understanding result sources for search in SharePoint Server|Use a result source in SharePoint Server to specify a provider to get search results from, and optionally to narrow a search to a subset of those results.|
|Plan crawling and federation in SharePoint Server|Plan to crawl or federate results from different kinds of content and plan to apply the appropriate settings.|
|Overview of the search schema in SharePoint Server|Learn how the search schema is used to build up the search index. The search schema contains the mapping from crawled properties to managed properties and the settings on the managed properties.|
|Plan to transform queries and order results in SharePoint Server|Learn how to transform user queries to provide more relevant SharePoint Server search results and to improve the way search results are displayed.|
|Overview of search result ranking in SharePoint Server|Learn how SharePoint Server uses ranking models to calculate the relevance rank of search results and how you can influence the order of search results by using query rules, the search schema, and ranking models.|
|Overview of analytics processing in SharePoint Server|Learn how the Analytics Processing Component analyzes content and user actions to improve search relevance.|
---
title: "Enable TLS 1.1 and TLS 1.2 support in SharePoint Server 2016"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 6/29/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
- IT_Sharepoint16
ms.assetid: 559dddb1-95c9-4242-99ca-cf9cf1cbd0c3
description: "This article describes how to enable Transport Layer Security (TLS) protocol versions 1.1 and 1.2 in a SharePoint Server 2016 environment. SharePoint Server 2016 fully supports TLS 1.1 and TLS 1.2."
---

# Enable TLS 1.1 and TLS 1.2 support in SharePoint Server 2016

[!INCLUDE[appliesto-xxx-2016-xxx-xxx-xxx-md]]

To enable TLS protocol versions 1.1 and 1.2 in your SharePoint Server 2016 environment, you need to install updates and change configuration settings in each of the following locations:

- SharePoint servers in your SharePoint farm
- Microsoft SQL Servers in your SharePoint farm
- Client computers used to access your SharePoint sites

> [!IMPORTANT]
> If you do not update each of these locations, you run the risk of systems failing to connect to each other using TLS 1.1 or TLS 1.2. The systems will instead fall back to an older security protocol; and if the older security protocols are disabled, the systems may fail to connect entirely.
>
> Example: SharePoint servers may fail to connect to SQL Server databases, or client computers may fail to connect to your SharePoint sites.

## Summary of the update process

The following image shows the three-step process necessary to enable TLS 1.1 and TLS 1.2 support on your SharePoint servers, SQL Servers, and client computers.

## Step 1: Update SharePoint servers in your SharePoint farm

Follow these steps to update your SharePoint servers.
|Steps for SharePoint Server|Windows Server 2012 R2|Windows Server 2016|
|:---|:---|:---|
|1.1 - Install ODBC Driver 11 for SQL Server update for TLS 1.2 support|Required|Required|
|1.2 - Install SQL Server 2012 Native Client update for TLS 1.2 support|Required|Required|

The following steps are recommended. Although not directly required by SharePoint Server 2016, they may be necessary for other software that integrates with SharePoint Server 2016.

|Steps for SharePoint Server|Windows Server 2012 R2|Windows Server 2016|
|:---|:---|:---|
|1.3 - Install .NET Framework 3.5 update for TLS 1.1 and TLS 1.2 support|Recommended|Recommended|
|1.4 - Enable strong cryptography in .NET Framework 3.5|Recommended|Recommended|

The following step is optional. You may choose to run it based on your organization's security and compliance requirements.

|Steps for SharePoint Server|Windows Server 2012 R2|Windows Server 2016|
|:---|:---|:---|
|1.5 - Disable earlier versions of SSL and TLS in Windows Schannel|Optional|Optional|

### 1.1 - Install ODBC Driver 11 for SQL Server update for TLS 1.2 support

ODBC Driver 11 for SQL Server doesn't support TLS 1.1 or TLS 1.2 by default. You must install the ODBC Driver 11 for SQL Server update for TLS 1.2 support. To install the update, see Microsoft ODBC Driver 11 for SQL Server - Windows.

### 1.2 - Install SQL Server 2012 Native Client update for TLS 1.2 support

SQL Server 2012 Native Client doesn't support TLS 1.1 or TLS 1.2 by default. You must install the SQL Server 2012 Native Client update for TLS 1.2 support. To install the update, see Microsoft SQL Server 2012 Native Client - QFE.

### 1.3 - Install .NET Framework 3.5 update for TLS 1.1 and TLS 1.2 support

.NET Framework 3.5 doesn't support TLS 1.1 or TLS 1.2 by default.

> [!IMPORTANT]
> To add support for TLS 1.1 and TLS 1.2 in Windows Server 2012 R2, you must install a KB update, and then manually configure Windows Registry keys.

SharePoint Server 2016 is built on .NET Framework 4.x and doesn't use .NET Framework 3.5.
However, certain prerequisite components and third-party software that integrates with SharePoint Server 2016 may use .NET Framework 3.5. Microsoft recommends installing and configuring this update to improve compatibility with TLS 1.2. The SystemDefaultTlsVersions registry value defines which security protocol version defaults will be used by .NET Framework 3.5. If the value is set to 0, .NET Framework 3.5 will default to SSL 3.0 or TLS 1.0. If the value is set to 1, .NET Framework 3.5 will inherit its defaults from the Windows Schannel DisabledByDefault registry values. If the value is undefined, it will behave as if the value is set to 0. To enable .NET Framework 3.5 to inherit its security protocol defaults from Windows Schannel For Windows Server 2012 R2 To install the .NET Framework 3.5 SP1 update for Windows Server 2012 R2, see the KB article Support for TLS System Default Versions included in the .NET Framework 3.5 on Windows 8.1 and Windows Server 2012 R2. After the KB update is installed, manually configure the registry keys. For Windows Server 2016 For Windows Server 2016, manually configure the registry keys. To manually configure the registry keys, do the following: From Notepad.exe, create a text file named net35-tls12-enable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SystemDefaultTlsVersions"=dword:00000001 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727] "SystemDefaultTlsVersions"=dword:00000001 Save the net35-tls12-enable.reg file. Double-click the net35-tls12-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 1.4 - Enable strong cryptography in .NET Framework 3.5 The SchUseStrongCrypto registry value restricts the use of encryption algorithms with TLS that are considered weak, such as RC4. 
Microsoft has released an optional security update for .NET Framework 3.5 on Windows Server 2012 R2 that will automatically configure the Windows Registry keys for you. No updates are available for Windows Server 2016. You must manually configure the Windows Registry keys on Windows Server 2016. Windows Server 2012 R2 To enable strong cryptography in .NET Framework 3.5 for Windows Server 2012 R2, see the KB article Description of the security update for the .NET Framework 3.5 on Windows 8.1 and Windows Server 2012 R2: May 13, 2014 Windows Server 2016 To enable strong cryptography in .NET Framework 3.5 for Windows Server 2016, configure the following Windows registry keys: From Notepad.exe, create a text file named net35-strong-crypto-enable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SchUseStrongCrypto"=dword:00000001 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727] "SchUseStrongCrypto"=dword:00000001 Save the net35-strong-crypto-enable.reg file. Double-click the net35-strong-crypto-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 1.5 - Disable earlier versions of SSL and TLS in Windows Schannel SSL and TLS support are enabled or disabled in Windows Schannel by editing the Windows Registry. Each SSL and TLS protocol version can be enabled or disabled independently. You don't need to enable or disable one protocol version to enable or disable another protocol version. [!IMPORTANT] Microsoft recommends disabling SSL 2.0 and SSL 3.0 due to serious security vulnerabilities in those protocol versions. > Customers may also choose to disable TLS 1.0 and TLS 1.1 to ensure that only the newest protocol version is used. However, this may cause compatibility issues with software that doesn't support the newest TLS protocol version. 
Customers should test such a change before performing it in production. The Enabled registry value defines whether the protocol version can be used. If the value is set to 0, the protocol version cannot be used, even if it is enabled by default or if the application explicitly requests that protocol version. If the value is set to 1, the protocol version can be used if enabled by default or if the application explicitly requests that protocol version. If the value is not defined, it will use a default value determined by the operating system. The DisabledByDefault registry value defines whether the protocol version is used by default. This setting only applies when the application doesn't explicitly request the protocol versions to be used. If the value is set to 0, the protocol version will be used by default. If the value is set to 1, the protocol version will not be used by default. If the value is not defined, it will use a default value determined by the operating system. To disable SSL 2.0 support in Windows Schannel From Notepad.exe, create a text file named ssl20-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl20-disable.reg file. Double-click the ssl20-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable SSL 3.0 support in Windows Schannel From Notepad.exe, create a text file named ssl30-disable.reg. Copy, and then paste the following text. 
Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl30-disable.reg file. Double-click the ssl30-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable TLS 1.0 support in Windows Schannel From Notepad.exe, create a text file named tls10-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the tls10-disable.reg file. Double-click the tls10-disable.reg. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable TLS 1.1 support in Windows Schannel From Notepad.exe, create a text file named tls11-disable.reg. Copy, and then paste the following text. 
Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the tls11-disable.reg file. Double-click the tls11-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. Step 2: Update your Microsoft SQL Servers in your SharePoint farm Follow these steps to update the SQL Servers in your SharePoint farm. Steps for your SQL ServersWindows Server 2012 R2Windows Server 2016 :-----:-----:----- 2.1 - Enable TLS 1.1 and TLS 1.2 support in Microsoft SQL Server Required Required The following step is optional. You may choose to run this step based on your organization's security and compliance requirements. 2.2 - Disable earlier versions of SSL and TLS in Windows Schannel Optional Optional 2.1 - Enable TLS 1.1 and TLS 1.2 support in Microsoft SQL Server SQL Server versions earlier than SQL Server 2016 don't support TLS 1.1 or TLS 1.2 by default. To add support for TLS 1.1 and TLS 1.2, you must install updates for SQL Server. To enable TLS 1.1 and TLS 1.2 support in SQL Server, follow the instructions from the KB article TLS 1.2 support for Microsoft SQL Server 2.2 - Disable earlier versions of SSL and TLS in Windows Schannel SSL and TLS support are enabled or disabled in Windows Schannel by editing the Windows Registry. Each SSL and TLS protocol version can be enabled or disabled independently. You don't need to enable or disable one protocol version to enable or disable another protocol version. 
[!IMPORTANT] Microsoft recommends disabling SSL 2.0 and SSL 3.0 due to serious security vulnerabilities in those protocol versions. > Customers may also choose to disable TLS 1.0 and 1.1 to ensure that only the newest protocol version is used. However, this may cause compatibility issues with software that doesn't support the newest TLS protocol version. Customers should test such a change before performing it in production. The Enabled registry value defines whether the protocol version can be used. If the value is set to 0, the protocol version cannot be used, even if it is enabled by default or if the application explicitly requests that protocol version. If the value is set to 1, the protocol version can be used if enabled by default or if the application explicitly requests that protocol version. If the value is not defined, it will use a default value determined by the operating system. The DisabledByDefault registry value defines whether the protocol version is used by default. This setting only applies when the application doesn't explicitly request the protocol versions to be used. If the value is set to 0, the protocol version will be used by default. If the value is set to 1, the protocol version will not be used by default. If the value is not defined, it will use a default value determined by the operating system. To disable SSL 2.0 support in Windows Schannel From Notepad.exe, create a text file named ssl20-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl20-disable.reg file. 
Double-click the ssl20-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable SSL 3.0 support in Windows Schannel From Notepad.exe, create a text file named ssl30-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl30-disable.reg file. Double-click the ssl30-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable TLS 1.0 support in Windows Schannel From Notepad.exe, create a text file named tls10-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the tls10-disable.reg file. Double-click the tls10-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable TLS 1.1 support in Windows Schannel From Notepad.exe, create a text file named tls11-disable.reg. Copy, and then paste the following text. 
Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the tls11-disable.reg file. Double-click the tls11-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. Step 3: Update your client computers used to access your SharePoint sites Follow these steps to update your client computers that access your SharePoint site. Steps for your client computersWindows 7Windows 8.1Windows 10 :-----:-----:-----:----- 3.1 - Enable TLS 1.1 and TLS 1.2 in Windows Schannel Required NA NA 3.2 - Enable TLS 1.1 and TLS 1.2 support in WinHTTP Required NA NA 3.3 - Enable TLS 1.1 and TLS 1.2 support in Internet Explorer Required NA NA 3.4 - Enable strong cryptography in .NET Framework 4.5 or higher Required Required Required 3.5 - Install .NET Framework 3.5 update for TLS 1.1 and TLS 1.2 support Required Required Required The following step is recommended. Although not directly required by SharePoint Server 2016, they provide better security by restricting the use of weak encryption algorithms. 3.6 - Enable strong cryptography in .NET Framework 3.5 Recommended Recommended Recommended The following step is optional. You may choose to run this step based on your organization's security and compliance requirements. 3.7 - Disable earlier versions of SSL and TLS in Windows Schannel Optional Optional Optional 3.1 - Enable TLS 1.1 and TLS 1.2 in Windows Schannel SSL and TLS support are enabled or disabled in Windows Schannel by editing the Windows Registry. 
Each SSL and TLS protocol version can be enabled or disabled independently. You don't need to enable or disable one protocol version to enable or disable another protocol version. The Enabled registry value defines whether the protocol version can be used. If the value is set to 0, the protocol version cannot be used, even if it is enabled by default or if the application explicitly requests that protocol version. If the value is set to 1, the protocol version can be used if enabled by default or if the application explicitly requests that protocol version. If the value is not defined, it will use a default value determined by the operating system. The DisabledByDefault registry value defines whether the protocol version is used by default. This setting only applies when the application doesn't explicitly request the protocol versions to be used. If the value is set to 0, the protocol version will be used by default. If the value is set to 1, the protocol version will not be used by default. If the value is not defined, it will use a default value determined by the operating system. To enable TLS 1.1 support in Windows Schannel From Notepad.exe, create a text file named tls11-enable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client] "DisabledByDefault"=dword:00000000 "Enabled"=dword:00000001 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server] "DisabledByDefault"=dword:00000000 "Enabled"=dword:00000001 Save the tls11-enable.reg file. Double-click the tls11-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 
To enable TLS 1.2 support in Windows Schannel From Notepad.exe, create a text file named tls12-enable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client] "DisabledByDefault"=dword:00000000 "Enabled"=dword:00000001 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server] "DisabledByDefault"=dword:00000000 "Enabled"=dword:00000001 Save the tls12-enable.reg file. Double-click the tls12-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 3.2 - Enable TLS 1.1 and TLS 1.2 support in WinHTTP WinHTTP doesn't inherit its SSL and TLS encryption protocol version defaults from the Windows Schannel DisabledByDefault registry value. WinHTTP uses its own SSL and TLS encryption protocol version defaults, which vary by operating system. To override the defaults, you must install a KB update and configure Windows Registry keys. The WinHTTP DefaultSecureProtocols registry value is a bit field that accepts multiple values by adding them together into a single value. You can use the Windows Calculator program (Calc.exe) in Programmer mode to add the following hexadecimal values as desired. DefaultSecureProtocols valueDescription :-----:----- 0x00000008 Enable SSL 2.0 by default 0x00000020 Enable SSL 3.0 by default 0x00000080 Enable TLS 1.0 by default 0x00000200 Enable TLS 1.1 by default 0x00000800 Enable TLS 1.2 by default For example, you can enable TLS 1.0, TLS 1.1, and TLS 1.2 by default by adding the values 0x00000080, 0x00000200, and 0x00000800 together to form the value 0x00000A80. 
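As a quick check, the bit-field arithmetic above can be reproduced in a few lines of Python. The flag constants come from the DefaultSecureProtocols table in this article; the helper function name is our own, added purely for illustration:

```python
# DefaultSecureProtocols flag values, as listed in the table above.
SSL_2_0 = 0x00000008
SSL_3_0 = 0x00000020
TLS_1_0 = 0x00000080
TLS_1_1 = 0x00000200
TLS_1_2 = 0x00000800

def default_secure_protocols(*flags):
    """OR the chosen protocol flags together into a single DWORD value."""
    value = 0
    for flag in flags:
        value |= flag
    return value

# The article's example: enable TLS 1.0, TLS 1.1, and TLS 1.2 by default.
combined = default_secure_protocols(TLS_1_0, TLS_1_1, TLS_1_2)
print(f"dword:{combined:08X}")  # dword:00000A80
```

Because the flags occupy distinct bits, adding them (as the article suggests with Calculator) and OR-ing them produce the same result.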
To install the WinHTTP KB update, follow the instructions from the KB article Update to enable TLS 1.1 and TLS 1.2 as a default secure protocols in WinHTTP in Windows To enable TLS 1.0, TLS 1.1, and TLS 1.2 by default in WinHTTP From Notepad.exe, create a text file named winhttp-tls10-tls12-enable.reg. Copy, and then paste the following text. For 64-bit operating system Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings\WinHttp] "DefaultSecureProtocols"=dword:00000A80 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Internet Settings\WinHttp] "DefaultSecureProtocols"=dword:00000A80 For 32-bit operating system [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings\WinHttp] "DefaultSecureProtocols"=dword:00000A80 Save the winhttp-tls10-tls12-enable.reg file. Double-click the winhttp-tls10-tls12-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 3.3 - Enable TLS 1.1 and TLS 1.2 support in Internet Explorer Internet Explorer versions earlier than Internet Explorer 11 did not enable TLS 1.1 or TLS 1.2 support by default. Support for TLS 1.1 and TLS 1.2 is enabled by default starting with Internet Explorer 11. To enable TLS 1.1 and TLS 1.2 support in Internet Explorer From Internet Explorer, click Tools > Internet Options > Advanced or click > Internet Options > Advanced. In the Security section, verify that the following check boxes are selected. If not, click the following check boxes: Use TLS 1.1 Use TLS 1.2 Optionally, if you want to disable support for earlier security protocol versions, uncheck the following check boxes: Use SSL 2.0 Use SSL 3.0 Use TLS 1.0 [!NOTE] Disabling TLS 1.0 may cause compatibility issues with sites that don't support newer security protocol versions. Customers should test this change before performing it in production. Click OK. 
3.4 - Enable strong cryptography in .NET Framework 4.5 or higher .NET Framework 4.5 and higher doesn't inherit its SSL and TLS security protocol version defaults from the Windows Schannel DisabledByDefault registry value. Instead, it uses its own SSL and TLS security protocol version defaults. To override the defaults, you must configure Windows Registry keys. The SchUseStrongCrypto registry value changes the .NET Framework 4.5 and higher security protocol version default from SSL 3.0 or TLS 1.0 to TLS 1.0 or TLS 1.1 or TLS 1.2. In addition, it restricts the use of encryption algorithms with TLS that are considered weak such as RC4. Applications compiled for .NET Framework 4.6 or higher will behave as if the SchUseStrongCrypto registry value is set to 1, even if it isn't. To ensure all .NET Framework applications will use strong cryptography, you must configure this Windows Registry value. Microsoft has released an optional security update for .NET Framework 4.5, 4.5.1, and 4.5.2 that will automatically configure the Windows Registry keys for you. No updates are available for .NET Framework 4.6 or higher. You must manually configure the Windows Registry keys on .NET Framework 4.6 or higher. For Windows 7 and Windows Server 2008 R2 To enable strong cryptography in .NET Framework 4.5 and 4.5.1 on Windows 7 and Windows Server 2008 R2, see MS14-026: Vulnerability in the .NET Framework could allow elevation of privilege: May 13, 2014 (formerly published as KB article 2938782, "Description of the security update for the .NET Framework 4.5 and the .NET Framework 4.5.1 on Windows 7 Service Pack 1 and Windows Server 2008 R2 Service Pack 1: May 13, 2014"). To enable strong cryptography in .NET Framework 4.5.2 on Windows 7 and Windows Server 2008 R2, see the KB article Description of the security update for the .NET Framework 4.5.2 on Windows 7 Service Pack 1 and Windows Server 2008 R2 Service Pack 1: May 13, 2014. 
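The SchUseStrongCrypto rule described above — applications compiled for .NET Framework 4.6 or higher behave as if the value is set to 1 even when it isn't — can be sketched as a simplified model. This is only an illustration of the rule as stated in this article, not actual .NET Framework code, and the function name is our own:

```python
def effective_strong_crypto(registry_value, target_framework):
    """Simplified model: does a .NET app end up using strong cryptography?

    registry_value: the SchUseStrongCrypto DWORD (0, 1, or None if unset).
    target_framework: (major, minor) version the application targets.
    """
    if target_framework >= (4, 6):
        return True                 # behaves as if the value is 1
    return registry_value == 1      # 4.5.x apps need the registry value set

# A .NET 4.6 app uses strong crypto even without the registry key:
print(effective_strong_crypto(None, (4, 6)))  # True
# A .NET 4.5 app needs SchUseStrongCrypto=1:
print(effective_strong_crypto(None, (4, 5)))  # False
print(effective_strong_crypto(1, (4, 5)))     # True
```

This is why the article still recommends setting the registry value: it covers applications compiled for 4.5.x that run on a newer runtime.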
For Windows Server 2012 To enable strong cryptography in .NET Framework 4.5, 4.5.1, and 4.5.2 on Windows Server 2012, see the KB article Description of the security update for the .NET Framework 4.5, the .NET Framework 4.5.1, and the .NET Framework 4.5.2 on Windows 8, Windows RT, and Windows Server 2012: May 13, 2014. For Windows 8.1 and Windows Server 2012 R2 To enable strong cryptography in .NET Framework 4.5.1 and 4.5.2 on Windows 8.1 and Windows Server 2012 R2, see the KB article Description of the security update for the .NET Framework 4.5.1 and the .NET Framework 4.5.2 on Windows 8.1, Windows RT 8.1, and Windows Server 2012 R2: May 13, 2014. To enable strong cryptography in .NET Framework 4.6 or higher From Notepad.exe, create a text file named net46-strong-crypto-enable.reg. Copy, and then paste the following text. For 64-bit operating system Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319] "SchUseStrongCrypto"=dword:00000001 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319] "SchUseStrongCrypto"=dword:00000001 For 32-bit operating system [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319] "SchUseStrongCrypto"=dword:00000001 Save the net46-strong-crypto-enable.reg file. Double-click the net46-strong-crypto-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 3.5 - Install .NET Framework 3.5 update for TLS 1.1 and TLS 1.2 support .NET Framework 3.5 doesn't support TLS 1.1 or TLS 1.2 by default. To add support for TLS 1.1 and TLS 1.2, you must install a KB update and configure Windows Registry keys for each of the operating systems listed in this section. The SystemDefaultTlsVersions registry value defines which security protocol version defaults will be used by .NET Framework 3.5. If the value is set to 0, .NET Framework 3.5 will default to SSL 3.0 or TLS 1.0. 
If the value is set to 1, .NET Framework 3.5 will inherit its defaults from the Windows Schannel DisabledByDefault registry values. If the value is undefined, it will behave as if the value is set to 0. To enable .NET Framework 3.5 to inherit its encryption protocol defaults from Windows Schannel For Windows 7 and Windows Server 2008 R2 To install the .NET Framework 3.5.1 update for Windows 7 and Windows Server 2008 R2, see the KB article Support for TLS System Default Versions included in the .NET Framework 3.5.1 on Windows 7 SP1 and Server 2008 R2 SP1 After the KB update is installed, manually configure the registry keys. For Windows Server 2012 To install the .NET Framework 3.5 update for Windows Server 2012, see the KB article Support for TLS System Default Versions included in the .NET Framework 3.5 on Windows Server 2012 After the KB update is installed, manually configure the registry keys. For Windows 8.1 and Windows Server 2012 R2 To install the .NET Framework 3.5 SP1 update for Windows 8.1 and Windows Server 2012 R2, see the KB article Support for TLS System Default Versions included in the .NET Framework 3.5 on Windows 8.1 and Windows Server 2012 R2 After the KB update is installed, manually configure the registry keys. For Windows 10 (Version 1507) This functionality is not available in Windows 10 Version 1507. You must upgrade to Windows 10 Version 1511, and then install the Cumulative Update for Windows 10 Version 1511 and Windows Server 2016 Technical Preview 4: May 10, 2016, or upgrade to Windows 10 Version 1607 or higher. For Windows 10 (Version 1511) To install the Cumulative Update for Windows 10 Version 1511 and Windows Server 2016 Technical Preview 4: May 10, 2016, see Cumulative Update for Windows 10 Version 1511 and Windows Server 2016 Technical Preview 4: May 10, 2016. After the KB update is installed, manually configure the registry keys. Windows 10 (Version 1607) and Windows Server 2016 No update needs to be installed. 
Configure the Windows Registry keys as described below. To manually configure the registry keys, do these steps. From Notepad.exe, create a text file named net35-tls12-enable.reg. Copy, and then paste the following text. For 64-bit operating system Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SystemDefaultTlsVersions"=dword:00000001 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727] "SystemDefaultTlsVersions"=dword:00000001 For 32-bit operating system [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SystemDefaultTlsVersions"=dword:00000001 Save the net35-tls12-enable.reg file. Double-click the net35-tls12-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 3.6 - Enable strong cryptography in .NET Framework 3.5 The SchUseStrongCrypto registry value restricts the use of encryption algorithms with TLS that are considered weak such as RC4. Microsoft has released an optional security update for .NET Framework 3.5 on pre-Windows 10 operating systems that will automatically configure the Windows Registry keys for you. No updates are available for Windows 10. You must manually configure the Windows Registry keys on Windows 10. 
For Windows 7 and Windows Server 2008 R2 To enable strong cryptography in .NET Framework 3.5.1 on Windows 7 and Windows Server 2008 R2, see the KB article Description of the security update for the .NET Framework 3.5.1 on Windows 7 Service Pack 1 and Windows Server 2008 R2 Service Pack 1: May 13, 2014 For Windows Server 2012 To enable strong cryptography in .NET Framework 3.5 on Windows Server 2012, see the KB article Description of the security update for the .NET Framework 3.5 on Windows 8 and Windows Server 2012: May 13, 2014 For Windows 8.1 and Windows Server 2012 R2 To enable strong cryptography in .NET Framework 3.5 on Windows 8.1 and Windows Server 2012 R2 see the KB article Description of the security update for the .NET Framework 3.5 on Windows 8.1 and Windows Server 2012 R2: May 13, 2014 To enable strong cryptography in .NET Framework 3.5 on Windows 10 From Notepad.exe, create a text file named net35-strong-crypto-enable.reg. Copy, and then paste the following text. For 64-bit operating system Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SchUseStrongCrypto"=dword:00000001 [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727] "SchUseStrongCrypto"=dword:00000001 For 32-bit operating system [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727] "SchUseStrongCrypto"=dword:00000001 Save the net35-strong-crypto-enable.reg file. Double-click the net35-strong-crypto-enable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. 3.7 - Disable earlier versions of SSL and TLS in Windows Schannel SSL and TLS support are enabled or disabled in Windows Schannel by editing the Windows Registry. Each SSL and TLS protocol version can be enabled or disabled independently. You don't need to enable or disable one protocol version to enable or disable another protocol version. 
[!IMPORTANT] Microsoft recommends disabling SSL 2.0 and SSL 3.0 due to serious security vulnerabilities in those protocol versions. > Customers may also choose to disable TLS 1.0 and TLS 1.1 to ensure that only the newest protocol version is used. However, this may cause compatibility issues with software that doesn't support the newest TLS protocol version. Customers should test such a change before performing it in production. The Enabled registry value defines whether the protocol version can be used. If the value is set to 0, the protocol version cannot be used, even if it is enabled by default or if the application explicitly requests that protocol version. If the value is set to 1, the protocol version can be used if enabled by default or if the application explicitly requests that protocol version. If the value is not defined, it will use a default value determined by the operating system. The DisabledByDefault registry value defines whether the protocol version is used by default. This setting only applies when the application doesn't explicitly request the protocol versions to be used. If the value is set to 0, the protocol version will be used by default. If the value is set to 1, the protocol version will not be used by default. If the value is not defined, it will use a default value determined by the operating system. To disable SSL 2.0 support in Windows Schannel From Notepad.exe, create a text file named ssl20-disable.reg. Copy, and then paste the following text. 
Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl20-disable.reg file. Double-click the ssl20-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable SSL 3.0 support in Windows Schannel From Notepad.exe, create a text file named ssl30-disable.reg. Copy, and then paste the following text. Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0] [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Client] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server] "DisabledByDefault"=dword:00000001 "Enabled"=dword:00000000 Save the ssl30-disable.reg file. Double-click the ssl30-disable.reg file. Click Yes to update your Windows Registry with these changes. Restart your computer for the change to take effect. To disable TLS 1.0 support in Windows Schannel From Notepad.exe, create a text file named tls10-disable.reg. Copy, and then paste the following text. 
   ```
   Windows Registry Editor Version 5.00

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0]

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client]
   "DisabledByDefault"=dword:00000001
   "Enabled"=dword:00000000

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server]
   "DisabledByDefault"=dword:00000001
   "Enabled"=dword:00000000
   ```

3. Save the tls10-disable.reg file.
4. Double-click the tls10-disable.reg file.
5. Click Yes to update your Windows Registry with these changes.
6. Restart your computer for the change to take effect.

### To disable TLS 1.1 support in Windows Schannel

1. From Notepad.exe, create a text file named tls11-disable.reg.
2. Copy, and then paste the following text.

   ```
   Windows Registry Editor Version 5.00

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1]

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client]
   "DisabledByDefault"=dword:00000001
   "Enabled"=dword:00000000

   [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server]
   "DisabledByDefault"=dword:00000001
   "Enabled"=dword:00000000
   ```

3. Save the tls11-disable.reg file.
4. Double-click the tls11-disable.reg file.
5. Click Yes to update your Windows Registry with these changes.
6. Restart your computer for the change to take effect.
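The four .reg files above follow an identical per-protocol pattern, so administrators who prefer scripting can generate them instead of hand-editing each one. The following is an illustrative sketch only; the function name is ours (not part of any Microsoft tooling), and it merely builds the text of a .reg file matching the blocks above without touching the registry itself:

```powershell
# Build the text of a .reg file that disables a given Schannel protocol version,
# mirroring the SSL 2.0 / SSL 3.0 / TLS 1.0 / TLS 1.1 examples above.
function New-SchannelDisableRegText {
    param([Parameter(Mandatory)][string]$Protocol)  # e.g. 'SSL 2.0', 'TLS 1.0'

    $base  = "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\$Protocol"
    $lines = @('Windows Registry Editor Version 5.00', '', "[$base]")
    foreach ($side in 'Client', 'Server') {
        $lines += ''
        $lines += "[$base\$side]"
        $lines += '"DisabledByDefault"=dword:00000001'
        $lines += '"Enabled"=dword:00000000'
    }
    $lines -join "`r`n"
}

# Example usage: write the SSL 3.0 file, then import it as in the steps above.
New-SchannelDisableRegText 'SSL 3.0' | Set-Content -Path ssl30-disable.reg
```

Importing the generated file (by double-clicking it or running reg.exe) still requires administrative rights and a restart, exactly as in the manual procedures above.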
Enable TLS 1.1 and TLS 1.2 support in SharePoint Server 2016
title: "Plan security hardening for SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 1252017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 763613ac-83f4-424e-99d0-32efd0667bd9 description: "Learn about security hardening for SharePoint Server and database server roles, including specific hardening requirements for ports, protocols, and services."

# Plan security hardening for SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

## Secure server snapshots

In a server farm environment, individual servers have specific roles. Security hardening recommendations for these servers depend on the role each server plays. This article contains secure snapshots for two categories of server roles:

- SharePoint servers
- Database server role

The snapshots are divided into common configuration categories. The characteristics defined for each category represent the optimal hardened state for SharePoint Server. This article does not include hardening guidance for other software in the environment.

In addition to hardening servers for specific roles, it is important to protect the SharePoint farm by placing a firewall between the farm servers and outside requests. The guidance in this article can be used to configure a firewall.

## SharePoint servers

This section identifies hardening characteristics for SharePoint servers. Some of the guidance applies to specific service applications; in these cases, the corresponding characteristics need to be applied only on the servers that are running the services associated with the specified service applications.
**Category: Services listed in the Services MMC snap-in**

Enable the following services:

- ASP.NET State service (if you are using InfoPath Forms Services or Project Server 2016)
- View State service (if you are using InfoPath Forms Services)
- World Wide Web Publishing Service

Ensure that these services are not disabled:

- Claims to Windows Token Service
- SharePoint Administration
- SharePoint Timer Service
- SharePoint Tracing Service
- SharePoint VSS Writer

Ensure that these services are not disabled on the servers that host the corresponding roles:

- AppFabric Caching Service
- SharePoint User Code Host
- SharePoint Search Host Controller
- SharePoint Server Search

**Category: Ports and protocols**

- TCP 80, TCP 443 (SSL)
- Custom ports for search crawling, if configured (such as for crawling a file share or a website on a non-default port)
- Ports used by the search index component — TCP 16500-16519 (intra-farm only)
- Ports required for the AppFabric Caching Service — TCP 22233-22236
- Ports required for Windows Communication Foundation communication — TCP 808
- Ports required for communication between SharePoint servers and service applications (the default is HTTP):
  - HTTP binding: TCP 32843
  - HTTPS binding: TCP 32844
  - net.tcp binding: TCP 32845 (only if a third party has implemented this option for a service application)
- If your computer network environment uses Windows Server 2012, Windows Server 2008, Windows Server 2008 R2, Windows 7, or Windows Vista together with versions of Windows earlier than Windows Server 2012 and Windows Vista, you must enable connectivity over both the following port ranges:
  - High port range: 49152 through 65535
  - Low port range: 1025 through 5000
- Default ports for SQL Server communication — TCP 1433, UDP 1434. If these ports are blocked on the SQL Server computer and databases are installed on a named instance, configure a SQL Server client alias for connecting to the named instance.
- Microsoft SharePoint Foundation User Code Service (for sandbox solutions) — TCP 32846. This port must be open for outbound connections on all Front-end and Front-end with Distributed Cache servers. This port must be open for inbound connections on Front-end and Front-end with Distributed Cache servers where this service is turned on.
- Ensure that ports remain open for Web applications that are accessible to users.
- Block external access to the port that is used for the Central Administration site.
- SMTP for e-mail integration — TCP 25, or a custom TCP port if you've configured outbound e-mail to use a non-default port.

**Category: Registry**

No additional guidance.

**Category: Auditing and logging**

If log files are relocated, ensure that the log file locations are updated to match. Update directory access control lists (ACLs) also.

**Category: Web.config**

Follow these recommendations for each Web.config file that is created after you run Setup:

- Do not allow compilation or scripting of database pages via the PageParserPaths elements.
- Ensure `<SafeMode>` CallStack="false" and AllowPageLevelTrace="false".
- Ensure that the Web Part limits around maximum controls per zone are set low.
- Ensure that the SafeControls list is set to the minimum set of controls needed for your sites.
- Ensure that your Workflow SafeTypes list is set to the minimum level of SafeTypes needed.
- Ensure that customErrors is turned on (`<customErrors mode="On" />`).
- Consider your Web proxy settings as needed (`<system.net>`/`<defaultProxy>`).
- Set the Upload.aspx limit to the highest size you reasonably expect users to upload. Performance can be affected by uploads that exceed 100 MB.

## Database server role

> [!NOTE]
> With the addition of the MinRole feature in SharePoint Server 2016, the concept of roles has changed. For information about roles, see Planning for a MinRole server deployment in SharePoint Server 2016.
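Where the default SQL Server ports called out above (TCP 1433, UDP 1434) need to be restricted on the server that hosts SQL Server, the built-in Windows NetSecurity cmdlets can express the restriction. This is a hedged sketch, not prescribed configuration: the rule display names and addresses are illustrative placeholders, and it must be run elevated.

```powershell
# Illustrative only: scope SQL Server access to the farm servers.
# Windows Firewall blocks unsolicited inbound traffic by default, so a scoped
# Allow rule is usually sufficient; explicit Block rules override Allow rules,
# which is why only UDP 1434 (SQL Server Resolution Service) gets an outright
# block here.
$farmServers = '10.0.0.10', '10.0.0.11'   # placeholder SharePoint server addresses

New-NetFirewallRule -DisplayName 'SQL Server 1433 (farm only)' `
    -Direction Inbound -Protocol TCP -LocalPort 1433 `
    -RemoteAddress $farmServers -Action Allow

New-NetFirewallRule -DisplayName 'Block SQL Browser (UDP 1434)' `
    -Direction Inbound -Protocol UDP -LocalPort 1434 -Action Block
```

Test such rules in a non-production environment first; overly broad blocks can sever farm-to-database communication.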
The primary recommendation for SharePoint Server is to secure inter-farm communication by blocking the default ports used for SQL Server communication and establishing custom ports for this communication instead. For more information about how to configure ports for SQL Server communication, see Blocking the standard SQL Server ports, later in this article.

| Category | Characteristic |
|:-----|:-----|
| Ports | Block UDP 1434. Consider blocking TCP 1433. |

This article does not describe how to secure SQL Server. For more information about how to secure SQL Server, see Securing SQL Server (https://go.microsoft.com/fwlink/p/?LinkId=186828).

## Specific port, protocol, and service guidance

The rest of this article describes in greater detail the specific hardening requirements for SharePoint Server.

In this section:

- Blocking the standard SQL Server ports
- Service application communication
- Connections to external servers
- Service requirements for e-mail integration
- Service requirements for session state
- SharePoint Server Products services
- Web.config file

### Blocking the standard SQL Server ports

The specific ports used to connect to SQL Server are affected by whether databases are installed on a default instance of SQL Server or a named instance of SQL Server. The default instance of SQL Server listens for client requests on TCP 1433. A named instance of SQL Server listens on a randomly assigned port number. Additionally, the port number for a named instance can be reassigned if the instance is restarted (depending on whether the previously assigned port number is available).

By default, client computers that connect to SQL Server first connect by using TCP 1433. If this communication is unsuccessful, the client computers query the SQL Server Resolution Service that is listening on UDP 1434 to determine the port on which the database instance is listening.

The default port-communication behavior of SQL Server introduces several issues that affect server hardening.
First, the ports used by SQL Server are well-publicized, and the SQL Server Resolution Service has been the target of buffer overrun attacks and denial-of-service attacks, including the "Slammer" worm. Even if SQL Server is updated to mitigate security issues in the SQL Server Resolution Service, the well-publicized ports remain a target. Second, if databases are installed on a named instance of SQL Server, the corresponding communication port is randomly assigned and can change. This behavior can potentially prevent server-to-server communication in a hardened environment. The ability to control which TCP ports are open or blocked is essential to securing your environment.

> [!NOTE]
> We recommend using the standard SQL Server ports, but ensure that the firewall is configured to allow communication only with the servers that need access to the SQL Server. Servers that don't need access to the SQL Server should be blocked from connecting to it over TCP port 1433 and UDP port 1434.

There are several methods you can use to block ports. You can block these ports by using a firewall. However, unless you can be sure that there are no other routes into the network segment and that no malicious users have access to the network segment, the recommendation is to block these ports directly on the server that hosts SQL Server. This can be accomplished by using Windows Firewall in Control Panel.

### Configuring SQL Server database instances to listen on a nonstandard port

SQL Server provides the ability to reassign the ports that are used by the default instance and any named instances. In SQL Server, you reassign ports by using SQL Server Configuration Manager.

### Configuring SQL Server client aliases

In a server farm, all front-end Web servers and application servers are SQL Server client computers.
If you block UDP 1434 on the SQL Server computer, or you change the default port for the default instance, you must configure a SQL Server client alias on all servers that connect to the SQL Server computer. In this scenario, the SQL Server client alias specifies the TCP port that the named instance is listening on.

To connect to an instance of SQL Server, you install SQL Server client components on the target computer and then configure the SQL Server client alias by using SQL Server Configuration Manager. To install SQL Server client components, run Setup and select only the following client components to install:

- Connectivity Components
- Management Tools (includes SQL Server Configuration Manager)

For specific hardening steps for blocking the standard SQL Server ports, see Configure SQL Server security for SharePoint Server.

## Service application communication

By default, communication between SharePoint servers and service applications within a farm takes place by using HTTP with a binding to TCP 32843. When you publish a service application, you can select either HTTP or HTTPS with the following bindings:

- HTTP binding: TCP 32843
- HTTPS binding: TCP 32844

Additionally, third parties that develop service applications can implement a third choice:

- net.tcp binding: TCP 32845

You can change the protocol and port binding for each service application. On the Service Applications page in Central Administration, select the service application, and then click Publish. The HTTP/HTTPS/net.tcp bindings can also be viewed and changed by using the Get-SPServiceHostConfig and Set-SPServiceHostConfig Microsoft PowerShell cmdlets.

Communication between service applications and SQL Server takes place over the standard SQL Server ports or the ports that you configure for SQL Server communication.

## Connections to external servers

Several features of SharePoint Server can be configured to access data that resides on server computers outside of the server farm.
If you configure access to data that is located on external server computers, ensure that you enable communication between the appropriate computers. In most cases, the ports, protocols, and services that are used depend on the external resource. For example:

- Connections to file shares use the File and Printer Sharing service.
- Connections to external SQL Server databases use the default or customized ports for SQL Server communication.
- Connections to Oracle databases typically use OLE DB.
- Connections to Web services use both HTTP and HTTPS.

The following table lists features that can be configured to access data that resides on server computers outside the server farm.

| Feature | Description |
|:-----|:-----|
| Content crawling | You can configure crawl rules to crawl data that resides on external resources, including Web sites, file shares, Exchange public folders, and business data applications. When crawling external data sources, the crawl role communicates directly with these external resources. For more information, see Manage crawling in SharePoint Server. |
| Business Data Connectivity connections | Web servers and application servers communicate directly with computers that are configured for Business Data Connectivity connections. |

## Service requirements for e-mail integration

E-mail integration requires the use of two services:

- SMTP service
- Microsoft SharePoint Directory Management service

### SMTP service

E-mail integration requires the use of the Simple Mail Transfer Protocol (SMTP) service on at least one of the front-end Web servers in the server farm. The SMTP service is required for incoming e-mail. For outgoing e-mail, you can either use the SMTP service or route outgoing e-mail through a dedicated e-mail server in your organization, such as a Microsoft Exchange Server computer.

### Microsoft SharePoint Directory Management service

SharePoint Server includes an internal service, the Microsoft SharePoint Directory Management Service, for creating e-mail distribution groups.
When you configure e-mail integration, you have the option to enable the Directory Management Service feature, which lets users create distribution lists. When users create a SharePoint group and they select the option to create a distribution list, the Microsoft SharePoint Directory Management Service creates the corresponding Active Directory distribution list in the Active Directory environment.

In security-hardened environments, the recommendation is to restrict access to the Microsoft SharePoint Directory Management Service by securing the file associated with this service, which is SharePointEmailws.asmx. For example, you might allow access to this file by the server farm account only.

Additionally, this service requires permissions in the Active Directory environment to create Active Directory distribution list objects. The recommendation is to set up a separate organizational unit (OU) in Active Directory for SharePoint Server objects. Only this OU should allow write access to the account that is used by the Microsoft SharePoint Directory Management Service.

## Service requirements for session state

Both Project Server 2016 and InfoPath Forms Services maintain session state. If you are deploying these features or products within your server farm, do not disable the ASP.NET State service. Additionally, if you are deploying InfoPath Forms Services, do not disable the View State service.

## SharePoint Server Products services

Do not disable services that are installed by SharePoint Server (listed in the snapshot previously). If your environment disallows services that run as a local system, you can consider disabling the SharePoint Administration service only if you are aware of the consequences and can work around them. This service is a Win32 service that runs as a local system.
This service is used by the SharePoint Timer service to perform actions that require administrative permissions on the server, such as creating Internet Information Services (IIS) Web sites, deploying code, and stopping and starting services. If you disable this service, you cannot complete deployment-related tasks from the Central Administration site. You must use Microsoft PowerShell to run the Start-SPAdminJob cmdlet (or use the Stsadm.exe command-line tool to run the execadmsvcjobs operation) to complete multiple-server deployments for SharePoint Server and to run other deployment-related tasks.

## Web.config file

The .NET Framework, and ASP.NET in particular, use XML-formatted configuration files to configure applications. The .NET Framework relies on configuration files to define configuration options. The configuration files are text-based XML files. Multiple configuration files can, and typically do, exist on a single system.

System-wide configuration settings for the .NET Framework are defined in the Machine.config file. The Machine.config file is located in the %SystemRoot%\Microsoft.NET\Framework\%VersionNumber%\CONFIG\ folder. The default settings that are contained in the Machine.config file can be modified to affect the behavior of applications that use the .NET Framework on the whole system.

You can change the ASP.NET configuration settings for a single application if you create a Web.config file in the root folder of the application. When you do this, the settings in the Web.config file override the settings in the Machine.config file.

When you extend a Web application by using Central Administration, SharePoint Server automatically creates a Web.config file for the Web application. The Web server and application server snapshot presented earlier in this article lists recommendations for configuring Web.config files.
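For illustration only, a Web.config fragment combining several of the recommended hardening settings from the snapshot might look like the following. The attribute values are examples and exact element placement varies between SharePoint versions; always start from the Web.config that Setup generates rather than authoring one from scratch.

```xml
<configuration>
  <SharePoint>
    <!-- Keep call stacks and page-level tracing off in production. -->
    <SafeMode MaxControls="200" CallStack="false" AllowPageLevelTrace="false">
      <PageParserPaths>
        <!-- Intentionally empty: no compilation or scripting of database pages. -->
      </PageParserPaths>
    </SafeMode>
  </SharePoint>
  <system.web>
    <!-- Show generic error pages to users instead of detailed errors. -->
    <customErrors mode="On" />
  </system.web>
</configuration>
```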
These recommendations are intended to be applied to each Web.config file that is created, including the Web.config file for the Central Administration site. For more information about ASP.NET configuration files and editing a Web.config file, see ASP.NET Configuration (https://go.microsoft.com/fwlink/p/?LinkID=73257).

## See also

**Concepts**

Security for SharePoint Server
title: "Custom branding in Suite Navigation Bar" ms.reviewer: ms.author: v-smandalika author: v-smandalika manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: overview ms.date: 08312023 ms.service: sharepoint-server-itpro ms.localizationpriority: high ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top - Strat_SP_server ms.custom: description: "Learn about the Custom Branding feature, which is one of the newly introduced features in SharePoint Server Subscription Edition Version 23H2."

# Custom branding in Suite Navigation Bar

[!INCLUDEappliesto-xxx-xxx-xxx-SUB-xxx-md]

This article describes the "Custom branding in Suite Navigation Bar" feature, which is one of the new features introduced in the SharePoint Server Subscription Edition Version 23H2 feature update.

> [!NOTE]
> Custom branding in the Suite Navigation Bar was first introduced in SharePoint Server Subscription Edition Version 23H2, but it was initially available only for SharePoint farms in Early release. Starting with SharePoint Server Subscription Edition Version 24H1, it's available regardless of whether your SharePoint farm is in Early release or Standard release.

## Custom branding in the Suite Navigation Bar

The SharePoint Server modern UX provides a powerful yet intuitive user interface that scales from desktop to mobile devices. However, the architecture of the modern UX limited the opportunities for organizations to apply custom branding to the Suite Navigation Bar, which is the global navigation bar that provides access to the App Launcher, contextual settings menu, and user welcome control in SharePoint sites. SharePoint Server Subscription Edition Version 23H2 introduces the ability for organizations to apply custom branding in the Suite Bar to better align with their branding standards.
SharePoint farm administrators can specify and update the following attributes of the Suite Navigation Bar:

- **SuiteNavAllowOverwrite**: Determines whether the Suite Navigation Bar settings of the web application can be overridden at the site-collection level. The default value is false, meaning any attempt to customize the Suite Navigation Bar at the site-collection level will be ignored. When this attribute's value is set to true, the web application-level Suite Navigation Bar settings apply to all site collections, except those collections to which explicit customizations have been made.
- **SuiteNavBrandingText**: Specifies the branding text of the Suite Navigation Bar.
- **SuiteNavBrandingLogoUrl**: Specifies a URL location that points to your logo. Ensure that the logo is from within the web application. The logo can be in the BMP, JPG, JPE, JPEG, PNG, GIF, or SVG format.
- **SuiteNavBrandingLogoTitle**: Specifies the title of your logo.
- **SuiteNavBrandingLogoNavigationUrl**: Specifies the URL to which users will navigate when they select the branding text or the logo.
- **SuiteBarBackground**: Sets a color to use for the background of the Suite Navigation Bar, which appears at the top of every page of your web application. The color value should be in the form AARRGGBB, RRGGBB, or RGB as hex values.
- **SuiteBarText**: Sets a color to use for the text and icons on the Suite Navigation Bar.
- **SuiteNavAccentColor**: Sets a color to use for the background color of buttons on the Suite Navigation Bar when you hover on them.

### Example 1

Enable a web application to allow custom branding by setting the SuiteNavAllowCustom web application-level property to true. This property must be set to true for any of the other properties to take effect.
```powershell
$webapp = Get-SPWebApplication http://spwfe
$webapp.SuiteNavAllowCustom = $true
$webapp.Update()
```

Set all the options, as shown in the following command-syntax example:

```powershell
$webapp.SuiteNavBrandingText = "Suite Bar Branding"
$webapp.SuiteNavBrandingLogoUrl = "http://spwfe/Photos/IMG_5004-1-scaled.jpg"
$webapp.SuiteNavBrandingLogoTitle = "Logo Branding"
$webapp.SuiteNavBrandingLogoNavigationUrl = "https://www.microsoft.com"
$webapp.SuiteBarBackground = 'eed5b7'
$webapp.SuiteNavAccentColor = '7fffd4'
$webapp.SuiteBarText = '000000'
$webapp.Update()
```

### Example 2

Allow the custom branding by running the following command-syntax:

```powershell
$webapp = Get-SPWebApplication http://spwfe
$webapp.SuiteNavAllowCustom = $true
$webapp.Update()
```

Set all the options by running the following command-syntax:

```powershell
$webapp.SuiteNavBrandingText = "Contoso Bass Adventures"
$webapp.SuiteNavBrandingLogoUrl = "http://spwfe/Photos/bass-illustration.svg"
$webapp.SuiteNavBrandingLogoTitle = "Contoso Logo"
$webapp.SuiteNavBrandingLogoNavigationUrl = "https://www.contoso.com"
$webapp.SuiteBarBackground = '999966'
$webapp.SuiteNavAccentColor = '006600'
$webapp.SuiteBarText = '000000'
$webapp.Update()
```

:::image type="content" source="../media/apply-custom-branding.png" alt-text="Screenshot that shows the site page after the custom branding feature has been applied." lightbox="../media/apply-custom-branding.png":::
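The color-valued properties above accept hex strings in AARRGGBB, RRGGBB, or RGB form. A small validation helper can catch typos before you call Update(); this is our own illustrative function, not part of the SharePoint API:

```powershell
# Returns $true when $Color is a hex string in AARRGGBB, RRGGBB, or RGB form —
# the formats accepted by SuiteBarBackground, SuiteBarText, and SuiteNavAccentColor.
function Test-SuiteBarColor {
    param([Parameter(Mandatory)][string]$Color)
    $Color -match '^(?:[0-9A-Fa-f]{8}|[0-9A-Fa-f]{6}|[0-9A-Fa-f]{3})$'
}

Test-SuiteBarColor 'eed5b7'   # $true  (RRGGBB)
Test-SuiteBarColor '7fffd4'   # $true  (RRGGBB)
Test-SuiteBarColor 'xyz123'   # $false (not valid hex)
```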
title: "Permissions planning for sites and content in SharePoint Server" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 8242017 audience: ITPro f1.keywords: - NOCSH ms.topic: conceptual ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 85a1866e-2743-4f98-a1ac-9ea61905c6d4 description: "Learn about how to plan permissions for sites and site content for SharePoint Server."

# Permissions planning for sites and content in SharePoint Server

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

Some sites in an enterprise probably contain content that should not be available to all users. For example, proprietary technical information should be accessible only on a need-to-know basis. An intranet portal for employee benefits should be available only to full-time employees, whereas the home page of an Internet Web site is accessible by anonymous clients.

Permissions control access to sites and site content. You can manage permissions by using SharePoint groups, which control membership. Fine-grained permissions also help to secure content at the item and document level.

Learn about default permission levels and user permissions
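As a hedged illustration of fine-grained permissions, the following server object model sketch breaks permission inheritance on a subsite and grants a single user the built-in Read permission level. It must run in the SharePoint Management Shell on a farm server; the URL and account name are placeholders, not values from this article.

```powershell
# Illustrative sketch: break inheritance on a subsite and grant Read to one user.
# Get-SPWeb and SPRoleAssignment are part of the SharePoint server object model.
$web = Get-SPWeb 'http://portal.contoso.com/benefits'   # placeholder URL
if (-not $web.HasUniqueRoleAssignments) {
    # Copy the inherited assignments, then manage them independently.
    $web.BreakRoleInheritance($true)
}
$user = $web.EnsureUser('CONTOSO\fulltime.user')        # placeholder account
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
$assignment.RoleDefinitionBindings.Add($web.RoleDefinitions["Read"])
$web.RoleAssignments.Add($assignment)
$web.Update()
$web.Dispose()
```

In practice, prefer granting permissions to SharePoint groups rather than individual users, as the paragraph above recommends; per-user fine-grained permissions are best reserved for item- and document-level exceptions.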
SharePoint Server SharePoint Updates Accessibility guidelines What's new SharePoint Server Subscription Edition New and improved features in SharePoint Server Subscription Edition New and improved features in SharePoint Server Subscription Edition Version 22H2 New and improved features in SharePoint Server Subscription Edition Version 23H1 New and improved features in SharePoint Server Subscription Edition Version 23H2 New and improved features in SharePoint Server Subscription Edition Version 24H1 What's deprecated or removed from SharePoint Server Subscription Edition SharePoint Server 2019 New and improved features in SharePoint Server 2019 What's deprecated or removed from SharePoint Server 2019 SharePoint Server 2016 New and improved features in SharePoint Server 2016 New features in Feature Pack 1 New feature in Feature Pack 2 What's deprecated or removed from SharePoint Server 2016 Getting started Security for SharePoint Server Improved ASP.NET view state security and key management Plan for administrative and service accounts Automatic password change planning Authentication overview Plan user authentication Plan server-to-server authentication Server-to-server authentication and user profiles Kerberos authentication planning Plan for app authentication in SharePoint Server Create claims-based web applications in SharePoint Server Create web applications that use classic mode authentication in SharePoint Server Implement SAML based authentication in SharePoint Server OpenID Connect 1.0 authentication Set up OIDC authentication in SharePoint Server with Microsoft Entra ID Set up OIDC authentication in SharePoint Server with Active Directory Federation Services (AD FS) Migration of Windows claims authentication to SAML based claims authentication in SharePoint Server Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocol support in SharePoint Server SSL Certificate Management Operations Create new certificates Import certificates Assign 
certificates to web applications Replace a certificate assignment Remove certificates View certificates Export certificates Outgoing SMTP support for client certificate authentication Renew certificates Rename certificate friendly names Move certificates between certificate stores View certificate default settings Set certificate default settings Certificates administrative action logging SSL certificate management in central administration New health analyzer rules for SSL certificates Private key management for SSL certificates Enable TLS and SSL support in SharePoint 2013 Enable TLS 1.1 and TLS 1.2 support in SharePoint Server 2016 Enable TLS 1.1 and TLS 1.2 support in SharePoint Server 2019 TLS in SharePoint Server Subscription Edition Strong Transport Layer Security (TLS) Encryption Transport Layer Security (TLS) 1.3 Support Security hardening Configure AMSI integration with SharePoint Server Plan for least-privileged administration Configure SQL Server security for SharePoint environments Federal Information Processing Standard security standards Install IP support in SharePoint Server Installation Overview for SharePoint Server Subscription Edition System requirements for SharePoint Server Subscription Edition Hardware and Topology Requirements for SharePoint Server Subscription Edition Software Requirements for Database Servers for SharePoint Server Subscription Edition Software Requirements for SharePoint Servers for SharePoint Server Subscription Edition Browser support planning Installing SharePoint Server Subscription Edition on one server Installing SharePoint Server Subscription Edition on Windows Server Core Install or uninstall language packs for SharePoint Servers Subscription Edition Uninstall SharePoint Server Subscription Edition Repair SharePoint Server Subscription Edition Install for SharePoint Server 2019 System requirements for SharePoint Servers 2016 and 2019 Hardware and software requirements Browser support planning Install SharePoint 
OfficeDocs-SharePoint/SharePoint/SharePointServer/spstoc/toc.md/0
[SharePoint Server](../sharepoint-server.yml)
OfficeDocs-SharePoint/SharePoint/SharePointServer/spstoc/toc.md
OfficeDocs-SharePoint
title: "Content databases contain orphaned items (SharePoint Server)" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 8/31/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: troubleshooting ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 636d25e9-be42-4b66-a354-9b9af570f907 description: "Learn how to resolve the SharePoint Health Analyzer rule: Content databases contain orphaned items, for SharePoint Server." Content databases contain orphaned items (SharePoint Server) [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Rule Name: Content databases contain orphaned items. Summary: The SharePoint Health Analyzer has detected sites in a content database that are not referenced in the configuration database. These sites may not be accessible. Cause: A restore operation that was not completed can result in sites in a content database that are not referenced in the SharePoint configuration database. Resolution: Remove orphaned items from the content database Verify that the user account that is performing this procedure is a member of the Farm Administrators group. On the SharePoint Central Administration website, click Monitoring, and then in the Health Analyzer section, click Review problems and solutions. On the Review problems and solutions page, click the alert for the failing rule, and then click Fix Now. Keep the dialog open so you can run the rule again to confirm the resolution. [!NOTE] The Fix Now feature removes all orphans from the content database. After following the steps above, in the Review problems and solutions dialog for the alert, click Re-analyze Now to confirm the resolution. If the problem is resolved, the rule is not flagged as a failing rule on the Review problems and solutions page.
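The Fix Now check can also be run from the SharePoint Management Shell. The sketch below is an assumption-laden example, not part of the original article: it uses the object model's `SPContentDatabase.Repair` method, and the database name is a placeholder you would substitute with your own.

```powershell
# Run from the SharePoint Management Shell on a farm server.
# "WSS_Content" is a placeholder content database name - substitute your own.
$db = Get-SPContentDatabase -Identity "WSS_Content"

# Repair($false) only reports orphaned items (as an XML summary) without
# deleting anything; Repair($true) deletes them, which is what the
# Fix Now button in Central Administration does.
$report = $db.Repair($false)
Write-Output $report
```

Running the report-only form first lets you review which sites are orphaned before committing to the destructive cleanup.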
OfficeDocs-SharePoint/SharePoint/SharePointServer/technical-reference/content-databases-contain-orphaned-items.md/0
Content databases contain orphaned items (SharePoint Server)
OfficeDocs-SharePoint/SharePoint/SharePointServer/technical-reference/content-databases-contain-orphaned-items.md
OfficeDocs-SharePoint
ms.date: 10/23/2018 title: "Default timer jobs in SharePoint Server 2019" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: reference ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top description: "Learn about the default timer jobs in SharePoint Server 2019." Default timer jobs in SharePoint Server 2019 [!INCLUDEappliesto-xxx-xxx-2019-xxx-xxx-md] Default timer jobs The following table lists the default timer jobs for SharePoint Server 2019. Timer job | Description | Default schedule --- Access Web App Export to SharePoint List Drives the process of exporting data from Access Web App to SharePoint List. 1 Minute Activities Auto Cleanup Deletes activities which are older than the number of days value that is specified. Weekly Analytics Event Store Retention Weekly Analytics Timer Job for \ Periodically schedules analytics for Search Service Application. 10 Minutes App installation queue processor 1 Minute App installation Service Installs and uninstalls Apps. 5 Minutes App State Update Retrieves and applies updated information on apps from the SharePoint Store, including the availability of updates and disable information. Hourly Application Addresses Refresh job Synchronizes connection information for remote service applications. 
15 Minutes Application Server Administration Service Timer Job Synchronizes farm-wide settings related to the Search and SSO services to each server in the farm. 1 Minute Application Server Timer Job Synchronizes farm-wide settings related to the Search and SSO services to each server in the farm. 1 Minute Async Feature Activation Job Timer Job that activates features asynchronously. 1 Minute Audit Log Trimming Trims audit log entries from site collections. Monthly Autohosted app instance counter Counts the number of autohosted app instances per site subscription. Weekly Bulk Operation Detection Job This job detects bulk operations on content so that the user and admin can be notified. Disabled Bulk workflow task processing Work item to batch process workflow tasks. Daily CEIP Data Collection Records datapoints on the local machine. Daily Cell Storage Data Cleanup Timer Job Deletes temporary Cell Storage data and frees up SQL Server disk space. 15 Minutes Change Log The Change Log records many different types of changes made to SharePoint sites. This timer job is used to periodically delete old entries from the log. Daily Compliance Dar Processing This job processes data at rest compliance tasks. 10 Minutes Compliance Dar Processing Multiplexer This job processes data compliance tasks in parallel. 10 Minutes Compliance High Priority Policy Processing This job processes high priority data at rest compliance tasks. 15 Minutes Compliance High Priority Policy Processing Multiplexer This job processes high priority compliance tasks in parallel. 15 Minutes Compliance Policy Processing This job processes compliance policies as defined in Policy Center and invokes appropriate actions on items. Daily Config Collection Cache Refresh Checks for configuration database collection cache inconsistencies and refreshes the cache file on all servers. 
Hourly Config Collection Full Cache Refresh Performs a full refresh of the configuration database collection cache file on all servers. Daily Content database upgrade session cleanup job Clean up old content database upgrade sessions. Weekly Content Organizer Processing Processes documents in the drop off library which match organizing rules. Daily Content Type Subscriber Retrieves content types packages from the hub and applies them to the local content type gallery. Hourly Crawl Log Cleanup for \ Periodically cleans up the crawl log tables to remove stale log information. Daily Database Performance Metric Provider 1 Minute Database Wait Statistics Periodically gather database wait statistics. Hourly Dead Site Delete Disabled Deferred access control list update job Applies updates to access control lists (ACLs) resulting from broad security changes. 1 Minute Delete Job History Deletes old entries from the timer job history. Daily Delete Upgrade Evaluation Site Collections job Deletes upgrade evaluation site collections which are past their expiry date and sends notifications to the ones that are near expiry date. Daily Diagnostic Data Provider: App Usage Periodically gathers App Statistics. Disabled Diagnostic Data Provider: Event Log Collects Windows Event Log entries. Disabled Diagnostic Data Provider: IO Intensive SQL Queries Collects a SQL trace of IO intensive SQL queries. Disabled Diagnostic Data Provider: Per-database IO Collects IOs for each database file. Disabled Diagnostic Data Provider: Performance Counters – Database Servers Disabled Diagnostic Data Provider: Performance Counters – Web Front Ends Collects Performance Monitor Counters data on web front ends. Disabled Diagnostic Data Provider: Site Size Collects size for each site collection. Disabled Diagnostic Data Provider: SQL Blocking Queries Collects data associated with blocked SQL queries. \ Diagnostic Data Provider: SQL Blocking Reports Captures the text of any queries that cause SQL blocking. 
Disabled Diagnostic Data Provider: SQL Deadlocks Captures the call graphs of SQL deadlocks. Disabled Diagnostic Data Provider: SQL DMV Collects SQL Dynamic Management Views (DMV) data. Disabled Diagnostic Data Provider: SQL Memory DMV Collects SQL Dynamic Management Views (DMV) data. Disabled Diagnostic Data Provider: Trace Log Collects Trace Log entries. Disabled Disk Over Quota Warning Sends out disk over quota warning e-mail notifications. Daily Document and List Static Data Fixed Sample job1 Timer job to collect fixed sampled static documents and lists information. Daily Document Full Crawl Blob Compression Processing Disabled Document ID assignment job Work item that assigns Document ID to all items in the site collection. Daily Document ID enable/disable job Work item that propagates content type changes across all sites when the Document ID feature is reconfigured. Daily Document Set fields synchronization job Synchronizes metadata from the document set to the items inside. 15 Minutes Document Set template update job Propagates changes made to the document set template to the existing items. Hourly Dump site information Daily Dump web information Daily eDiscovery In-Place Hold Processing The in-place hold timer job initiates and releases the holds of SharePoint web sites. Hourly Enterprise Metadata site data update Updates all Site Collections after a language pack addition or an Enterprise Metadata Service application restore. Hourly Expiration policy This job processes items that are due for a retention action, such as deleting items past their expiration date. Weekly Extension Map Refresh Checks for changes in the Extension Map data. 10 Minutes File Post Processor This job processes the files asynchronously after the file has been saved. The processing includes extraction of the file-specific metadata and generation of default thumbnails. 
1 Minute Fix Site Storage Metrics Hourly Gradual column index management job Builds or removes column indexes in large lists. 5 Minutes Gradual Site Delete Hourly Health Analysis job (Daily, Central Administration, All Servers) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Central Administration, Any Server) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Machine Translation Service, All Servers) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Machine Translation Service, Any Server) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Microsoft SharePoint Foundation Timer, All Servers) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Microsoft SharePoint Foundation Timer, Any Server) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Microsoft SharePoint Foundation Web Application, All Servers) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Microsoft SharePoint Foundation Web Application, Any Server) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, User Profile Service, Any Server) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Daily, Word Automation Services, All Servers) Runs SharePoint Health Analyzer jobs. Daily Health Analysis job (Hourly, Distributed Cache, All Servers) Runs SharePoint Health Analyzer jobs. Hourly Health Analysis job (Hourly, Microsoft SharePoint Foundation Timer, All Servers) Runs SharePoint Health Analyzer jobs. Hourly Health Analysis job (Hourly, Microsoft SharePoint Foundation Timer, Any Server) Runs SharePoint Health Analyzer jobs. Hourly Health Analysis job (Hourly, Security Token Service, All Servers) Runs SharePoint Health Analyzer jobs. Hourly Health Analysis job (Hourly, User Profile Service, Any Server) Runs SharePoint Health Analyzer jobs. 
Hourly Health Analysis job (Hourly, Word Automation Services, Any Server) Runs SharePoint Health Analyzer jobs. Hourly Health Analysis job (Monthly, Microsoft SharePoint Foundation Timer, Any Server) Runs SharePoint Health Analyzer jobs. Monthly Health Analysis job (Weekly, Central Administration, All Servers) Runs SharePoint Health Analyzer jobs. Weekly Health Analysis job (Weekly, Microsoft SharePoint Foundation Timer, All servers) Runs SharePoint Health Analyzer jobs. Weekly Health Analysis job (Weekly, Microsoft SharePoint Foundation Timer, Any Server) Runs SharePoint Health Analyzer jobs. Weekly Health Analysis job (Weekly, Microsoft SharePoint Foundation Web Application, All Servers) Runs SharePoint Health Analyzer jobs. Weekly Health Analysis job (Weekly, User Profile Service, Any Server) Runs SharePoint Health Analyzer jobs. Weekly Health Statistics Updating 1 Minute High Write Volume Sites Document Changed Anti-virus Processing Disabled Hold Processing and Reporting This job generates reports about items on hold and removes items from holds that are pending release. Daily Hybrid Auditing ODL Error Log Upload Job Upload event IDs and timestamps from ODL error logs to Microsoft to monitor the health and stability of the SharePoint Hybrid Auditing Feature. This job only uploads data if you have turned on the SharePoint Hybrid Auditing feature in the Hybrid Configuration Wizard – if Hybrid Auditing is not configured this timer job will do nothing. Disabled Identity column maintenance job Checks and reseeds identity column values. Weekly Immediate Alerts Sends out immediate and scheduled alerts. 1 Minute Indexing Schedule Manager on \ 5 Minutes Information management policy This job performs background processing for information policies, such as calculating updated expiration dates for items with a new policy. Weekly Internal App State Update Retrieves and applies updated information on apps from App Catalogs. 
Hourly Large list automatic column index management job Automatically manages list column indices for large lists. Daily License Renewal Renews all licenses of the apps from the SharePoint Store. Hourly Licensing Synchronizer Job Synchronizes licensing information in the configuration database. Hourly Limited Permissions Cleanup Job This job removes unnecessary limited permission role assignments from items, libraries and sites. Daily Machine Translation Service – Language Support Timer Job Updates the languages available to the Machine Translation Service. Weekly Machine Translation Service – Machine Translation Service Timer Job Initiates translation of documents which have been submitted to the Machine Translation Service. 15 Minutes Machine Translation Service – Remove Job History Timer Job Removes the history for expired jobs from the Machine Translation Service. Weekly Microservice work item Timer Job Timer job that processes microservice work items in the work item queue. 1 Minute Microsoft SharePoint Foundation Usage Data Import Imports usage log files into the logging database. 5 Minutes Microsoft SharePoint Foundation Usage Data Maintenance Performs maintenance in the logging database. Hourly Microsoft SharePoint Foundation Usage Data Processing Processes and/or aggregates usage data in the logging database. Disabled Migration Job 1 Minute My Site Cleanup Job Handles the deletion of user profiles and My Sites of those users. Daily My Site Host Automatic Upgrade A timer job for automatically upgrading the My Site Host. Daily My Site Instantiation Interactive Request Queue A timer job queue for interactive (web initiated) My Site instantiation requests. 1 Minute My Site Instantiation Non-Interactive Request Queue A timer job queue for non-interactive (Office-client initiated) My Site instantiation requests. 1 Minute My Site Second Instantiation Interactive Request Queue A second timer job queue for interactive (web initiated) My Site instantiation requests. 
1 Minute My Sites Automatic Upgrade A timer job for automatically upgrading the My Sites. Daily Notification Timer Job \ The Notification Job is used to query and update the notification list and send out pending scheduling notifications. Daily Over Quota Notification Requests Queue A timer job queue for site over quota email notification requests. Hourly Password Management Sends email and logs events for expiring passwords and password changes. Makes sure managed passwords are changed before they expire. Daily Performance Metric Provider The diagnostic data provider that collects the perf metrics data. 1 Minute Persisted Navigation Term Set Synchronization The timer job that synchronizes the persisted copy of navigation term sets. Hourly Prepare query suggestions Prepares candidate queries for query suggestion and performs pre-computations for result block ranking. Daily Product Version job Checks the install state of the machine and puts that data into the database. Daily Project Server: Active Directory Sync job for Project Server Service Application Synchronizes Active Directory with Project Web App enterprise resource pools and security groups. Daily Project Server: Alerts and Reminders job for Project Server Service Application Sends the alerts and reminders set up by Project Web App users. Daily Project Server: Backup and restore job for Project Server Service Application Backs up and restores Project Web App data to and from the archive store, using the schedule set by the Project Server administrator. Daily Project Server: Database Maintenance job for Project Server Service Application This timer job performs routine maintenance on the Project Server database including defragmenting the indexes and updating the database usage. Daily Project Server: Language Installation job for Project Server Service Application Completes Project Web App Language Pack installation in the database, and ensures deployment of localized Report Center reports. 
Daily Project Server: Monitor Scheduled Cube Jobs job for Project Server Service Application Updates data analysis cubes as scheduled in Project Web App. Hourly Project Server: Permission Sync State Cleanup job for Project Server Service Application This timer job purges older sync states to maintain the performance of user sync. Daily Project Server: Product Feedback job for Project Server Service Application Collects statistical data on the usage, reliability and performance of Project Server features and sends this information to Microsoft to be used to improve the product in future releases. Daily Project Server: Projects Cleanup job for Project Server Service Application This timer job cleans up any fragments of project data that are orphaned or redundant. Daily Project Server: Queue Auto Heal job for Project Server Service Application This timer job tries to automatically heal stuck Project Server queue jobs - when the queue job is stuck at Waiting For Processing state or Processing state due to internal error. 30 Minutes Project Server: Queue Maintenance job for Project Server Service Application This timer job purges older Project Server queue jobs to maintain the performance of the Project Server queue. Daily Project Server: Resource Capacity Refresh job for Project Server Service Application This timer job refreshes the resource capacity information in Project Web App reporting. Daily Project Server: Synchronization of Project Web App permissions to SharePoint Server permissions job for Project Server Service Application Synchronizes Project permissions to the SharePoint Server project sites. Users who can view or change projects in Project Web App will be granted permissions to the SharePoint Server sites for those projects. You can change these permissions from the PWA Settings page. 
Daily Project Server: Synchronization of SharePoint Server permissions to Project Web App permissions job for Project Server Service Application This timer job synchronizes SharePoint Server permissions to Project Web App. 1 Minute Project Server: Synchronize Exchange OOF Calendar job for Project Server Service Application Synchronizes out-of-office time for users who have selected this option. The Microsoft Exchange calendar of each user is synchronized with their Project Web App resource calendar. Daily Project Server: Task List Synchronizer for SharePoint Tasks List Projects job for Project Server Service Application This timer job updates Project Server with the latest changes from connected SharePoint Server Project Task Lists. 5 Minutes Project Server: Workflow Maintenance job for Project Server Service Application Maintains the health of Project Server workflows. It resolves issues between Enterprise Project Templates and workflows, updates the status of workflows and terminates completed workflows. Daily Query Classification Dictionary Update for \ Periodically updates dictionary used for query classification. 30 Minutes Query Classification Dictionary Update for \ Periodically updates dictionary used for query classification. 30 Minutes Query Classification Dictionary Update for \ Periodically updates dictionary used for query classification. 30 Minutes Query Logging Updates query and click logs by inserting new entries and deleting old ones. 15 Minutes Query Suggestions Updates search dictionaries for query suggestions. Daily Rebalance crawl store partitions for \ Timer job that rebalances crawl store partitions. 1 Minute Recycle Bin Daily Repair Orphan Site Collections Daily Scheduled Approval The Approval Job is used to approve pages on a schedule. 1 Minute Scheduled Unpublish The Unpublish job is used to unpublish pages according to the set schedule. 
1 Minute Search and Process This job performs bulk actions on a set of search results, such as adding all items in an eDiscovery query to a specified hold. Daily Search Change Log Generator The timer job that generates appropriate change logs when SharePoint items change. This is required for search to function properly. 5 Minutes Search Custom Dictionaries Update Updates the custom dictionaries used for Search. These include custom dictionaries for company extraction and for query spelling correction. 10 Minutes Search Engine Sitemap job The search engine sitemap job is used to generate search engine sitemaps and update robots.txt. Daily Search Health Monitoring – Trace Events 1 Minute Second Async Feature Activation Job Second Timer Job that Activates features Asynchronously. 1 Minute SharePoint BI Maintenance This timer job periodically deletes temporary dashboard objects and user-persistent filter values from the database . The longevity of these values can be set on the PerformancePoint Services Settings page. Hourly SharePoint Server CEIP Data Collection Daily Site Lookup Refresh Checks the sitemap data for site lookup changes. 10 Minutes Site Master Invalidation Checks the site masters in content DB for any feature or site definition changes. If required re-creates the site master. Hourly Site Policy and Exchange Site Mailbox Policy Update Updates Exchange Site Mailboxes with the site policy of the associated SharePoint site. Daily Software Quality Metrics reporting for Search Collections and reports Software Quality Metrics reporting for Search. Weekly Solution Daily Resource Usage Update Marks the daily boundary for sandboxed solution resource quota monitoring. Daily Solution Resource Usage Log Processing Aggregates resource usage data from sandboxed solution execution. 
3 Minutes Solution Resource Usage Update Records resource usage data from sandboxed solution execution, and sends email to owners of site collections that are exceeding their allotted resource quota. 5 Minutes Spelling Customizations Upgrade Upgrades user spelling customizations from the previous SharePoint version to this version. This job will run on schedule until it succeeds with the upgrade and then be set to disabled. If there are no spelling customizations to upgrade, it will be set to disabled after the first run. Disabled Spelling Dictionary Update Updates the dynamic dictionary that is used to correct the spelling of queries with changes in the indexed content. Note that this is a time-consuming operation that should not be executed more than once a day. Daily State Service Delete Expired Sessions Deletes expired data stored in the State Service databases. Hourly Storage Metrics Processing Processes storage metrics changes for site collections. 5 Minutes SyncDefaultComplianceTags Sync List's DefaultComplianceTag to its items. 5 Minutes Taxonomy Update Scheduler Updates Site Collections with the latest term changes made to the Enterprise Metadata Service. Hourly Thicket Feature Enabled State Recalculation Job Determines whether to disable the thicket feature in a site collection based on an analysis of its contents. Daily Thicket Repair Job This job repairs hidden orphan thicket supporting files by downloading and re-uploading them. Daily Timer Service Recycle Daily Topology State Cleanup for \ Periodically cleans up the Topology State tables to remove old inactive topologies. Daily Translation Export Job Definition Exports page and list content to XLIFF for human translation or machine translation via the Machine Translation Service. 15 Minutes Translation Import Job Definition Imports translated page and list content from XLIFF to correct location in a site collection. 
15 Minutes Unified Policy File Sync Job This job synchronizes unified policy components such as custom sensitive types from the master store to SharePoint Server. 5 Minutes Unified Policy File Sync Urgent Job This job handles urgent requests to sync unified policy components such as custom sensitive types from the master store to SharePoint Server. 15 Minutes Unified Policy OnPrem Sync Job This job synchronizes unified policy from the master policy store for SharePoint Server. Hourly Unified Policy Sync Status Update Job This job uploads workload policy sync status to the master policy store. 5 Minutes Upgrade site collections job Upgrades site collections in a content database. Daily Upgrade site collections job Upgrades site collections in a content database. 10 Minutes Upgrade site collections job Upgrades site collections in a content database. Hourly Upgrade Work Item Job Processes deferred work items following an upgrade. Daily Upload App Analytics Job Uploads aggregated app usage data to Microsoft. Microsoft uses this data to improve the quality of apps in the marketplace. If you have multiple content farms connecting to the same search server, activate this feature only on one farm. Daily Usage Analytics Timer Job for \ Periodically schedules processing of the Usage Analytics analysis. 10 Minutes UPA - User PointPublishing Processing Job Executes User PointPublishing personal site collection operations. 1 Minute UPA – Social Rating Synchronization Job Timer job to synchronize rating values between Social database and Content database. Hourly UPA – User change import timer job Imports user property changes into UPA database. 5 Minutes UPA – Feed Cache Repopulation Job Handles the repopulation of feed cache. 5 Minutes UPA – User Profile to SharePoint Full Synchronization Synchronizes user information from the user profile application to SharePoint users and synchronizes site memberships from SharePoint to the user profile application. 
Hourly UPA – User Profile to SharePoint Language And Region Synchronization Synchronizes language and region information from the user profile application to SharePoint users. 15 Minutes UPA – Feed Cache Full Repopulation Job Handles the full repopulation of feed cache. 5 Minutes UPA – User Profile to SharePoint Quick Synchronization Synchronizes user information from the user profile application to SharePoint users recently added to a site. 5 Minutes UPA – User Profile Change Cleanup Job Cleans up data which is 14 days old from User Profile Change Log. Daily UPA – Background Operations Processing Job Executes background operations for the User Profile Application. 5 Minutes UPA – User Profile ActiveDirectory Import Job Imports objects from Active Directory into Profile Database. 5 Minutes UPA – Activity Feed Job Pre-computes activities to be shown in users' activity feeds. 10 Minutes UPA – User Profile Change Job Processes changes to User Profiles. Hourly UPA – Audience Compilation Job Computes memberships of defined audiences. Weekly UPA – User Profile Language Synchronization Job Looks for new language pack installations and makes sure that strings related to user profile service are localized properly. Hourly UPA – My Site Suggestions Email Job Sends out emails with suggestions for keywords and people to follow to people who don't update their profile often, prompting them to update their profiles. Monthly UPA – Activity Feed Cleanup Job Cleans up pre-computed activities used in activity feeds which are older than 14 days. This job does not affect the User Profile Change Log. Daily UPA – Updates Profile Memberships and Relationships Job Updates group membership changes and Profile relationships from Active Directory into Profile Database. 5 Minutes UPA – Profile Attribute Synch Job Syncs attributes from Active Directory into Profile Database. 10 Minutes UPA – Social Data Maintenance Job Aggregates social tags and ratings and cleans the social data change log. 
Hourly Variations Create Hierarchies Job Definition Creates a complete variations hierarchy by spawning all sites and pages from the source site hierarchy for all Variation labels. Hourly Variations Propagate List Items Job Definition Propagates list items to variant sites. 15 Minutes Variations Propagate Page Job Definition Updates or creates peer pages in variant sites. 15 Minutes Variations Propagate Sites and Lists Timer Job Creates variant sites when the Variations Automatic Creation setting is enabled. 30 Minutes Video Query Rule Provisioner Provisions video query rule for a site when the Search Service Application becomes available. Daily Word Automation Services – Remove Job History Timer Job Removes the history for expired jobs from the Word Automation Services. Weekly Word Automation Services Timer Job 15 Minutes Workflow Processes workflow events. 5 Minutes Workflow auto Cleanup Deletes tasks and workflow instances which have been marked complete longer than the expiration specified in the workflow associa... Daily Workflow Failover Processes events for workflows that have failed and are marked to be retried. 15 Minutes 1 This timer job is not needed by SharePoint Server 2019 and is removed by the security update for SharePoint Server 2019: March 9, 2021 (KB4493230). See also Other Resources Default timer jobs in SharePoint Server 2016 Default timer jobs in SharePoint 2013
Default timer jobs in SharePoint Server 2019
title: "More cache hosts are running in this deployment than are registered with SharePoint (SharePoint Server)" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars ms.date: 1252017 audience: ITPro f1.keywords: - NOCSH ms.topic: troubleshooting ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 5e7be5ff-5216-406b-9bfe-f9ff1f9651aa description: "Learn how to resolve the SharePoint Health Analyzer rule: More Cache hosts are running in this deployment than are registered with SharePoint, for SharePoint Server." More cache hosts are running in this deployment than are registered with SharePoint (SharePoint Server) [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Rule Name: More Cache hosts are running in this deployment than are registered with SharePoint. Summary: Some cache hosts are running but not registered with SharePoint Server. Cause: SharePoint Server fails to identify some cache hosts. Resolution: Log on to the cache host that is not registered with SharePoint Server, and then manually stop the AppFabric Caching Service. Identify the cache hosts that are not registered with SharePoint Server. To do this, in the SharePoint Central Administration website, in the Monitoring section, click Review problems and solutions, and then find the name of the server in the Failing Servers list. If there are multiple failing servers in a server farm, you must repeat the following steps on each failing server. Verify that the user account that is performing this procedure is a member of the Administrators group on the local computer. In Server Manager, click Tools, and then select Services. In the Services list, double-click AppFabric Caching Service. In the AppFabric Caching Service Properties (Local Computer) dialog, click Stop. 
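The graceful stop can also be scripted instead of using the Services console. This is a minimal sketch, assuming it is run in the SharePoint Management Shell on the affected cache host with farm administrator rights:

```powershell
# Run in the SharePoint Management Shell on the unregistered cache host.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Gracefully stop the Distributed Cache instance on this server so that
# cached data is moved to other cache hosts before shutdown.
Stop-SPDistributedCacheServiceInstance -Graceful

# Alternatively, stop the underlying Windows service directly
# (equivalent to clicking Stop in the Services console).
Stop-Service -Name AppFabricCachingService
```

On a live farm, the graceful stop is preferable because it drains the cache to the remaining hosts before the service shuts down.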
See also Concepts Manage the Distributed Cache service in SharePoint Server Plan for feeds and the Distributed Cache service in SharePoint Server Other Resources Planning and using the Distributed Cache service
ms.date: 03132018 title: "Search in SharePoint Server knowledge articles" ms.reviewer: ms.author: serdars author: SerdarSoysal manager: serdars audience: ITPro f1.keywords: - NOCSH ms.topic: troubleshooting ms.service: sharepoint-server-itpro ms.localizationpriority: medium ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 3fdf9bcc-11c0-48af-b8c3-327902f53d73 description: "Learn how to resolve alerts about the search index, content processing, query processing, and other search issues in the SharePoint Server management pack for System Center Operations Manager (SCOM)." Search in SharePoint Server knowledge articles [!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md] Learn how to resolve alerts about the search index, content processing, query processing, and other search issues in the SharePoint Server 2019, 2016, and 2013 management pack for System Center Operations Manager (SCOM). The articles in this section are knowledge articles for search services in SharePoint Server. Typically, these articles appear when you click a link in an alert in the Operations Manager console. You can use these articles to help you troubleshoot and resolve problems in search. 
Download and install:

- System Center Management Pack for SharePoint Server 2019
- System Center Monitoring Pack for SharePoint Server 2016
- System Center Monitoring Pack for SharePoint Server 2013
- System Center Monitoring Pack for SharePoint Foundation 2013

Use the following to resolve alerts about the search issues:

- Analytics analysis: failed to start - search analytics
- Analytics analysis: failed to start warning - search analytics
- Content Processing: Fallback word breaker did not load
- Content Processing: Query classification dictionary exceeds size limit
- Content Processing: Spelling dictionary exceeds size limit
- Content Processing: Search Custom Dictionaries Update
- Content Processing: Spelling Dictionary Update
- Content Processing: Gatherer Content Processing connector
- Content Processing: flow failed to start
- Content Processing: Query classification dictionary close to size limit
- Content Processing: Spelling dictionary close to size limit
- Crawler: Search Gatherer Host Unavailable
- DocParsing: No More Parser Server Workers
- DocParsing: Parser Server Worker Failed to Restart
- Index: Lost Generations
- Index: Missing partition
- Index: Indexing Blocked
- Index: Journal IO Exception Read
- Index: Journal IO Exception Write
- Index Lookup: Schema service availability query processing
- Index Lookup: Missing partition
- Query Processing: Query classification dictionary update
- Query Processing: Search Service Application Availability
- Query Processing: Component Availability - Query Processing
- Query processing: flow failed to start
- Query Processing: Fallback word breaker did not load
- Query Processing: Query Component Get Configuration
- Query Processing: Query Normalization Schema Service Availability
- Query Processing: Query Parsing Schema Service Availability
- Query Processing: QueryParsing Scope Cache Availability
- Query Service: Service availability query processing
- Query Service: Start Service Availability - Query Processing
- Query Service: Unable to stop query processing
- Query URL Mapping: Alternate URL Mapping Service Availability - Query Processing
- Schema Reader: Schema Service Availability - Query Processing
- Search Admin Platform Services: Repository Initialization Failed
- Search Admin Platform Services: Repository Installation Failed
- Search Admin Platform Services: Repository Replication
- Search Analytics: analysis run state Search Analytics
- Search analytics: Timer job cannot resolve Analytics Processing Engine (APE)
- Search Analytics: Timer job cannot resolve Link database
- Search Analytics: analysis run state search analytics
- Search Gatherer: Disk Full Crawler
- Search Usage Analytics: Analysis configuration failed
- Search Usage Analytics: Analysis failed to start
- Search Usage Analytics: Feeding failure
- Search Usage Analytics: Reporting API write failure
- Search Usage Analytics: Store not available
- Search Usage Analytics: Usage analytics APE not available
- Usage table exceeded max bytes limit
- Services Host Controller

Analytics analysis: failed to start - search analytics Alert Name: Analytics analysis: failed to start - search analytics Summary: At a preconfigured time interval (once every 24 hours) the timer job will try to start a run of the search analytics analysis. This alert is triggered when the timer job fails to start the analysis. Multiple problems can prevent the analysis from being started. If the timer job fails to start the analysis, the timer job will stop and retry later (in 10 minutes). Cause Analytics processing component is not available. Network outage. Server issues. Analysis configuration database not available. Resolution Make sure these areas are functioning: The analytics processing component is functioning properly. There are no networking issues. There are no server issues. The analysis reporting database is available. 
Analytics analysis: failed to start warning - search analytics Alert Name: Analytics analysis: failed to start warning - search analytics Summary: If a search analytics analysis is still running when the timer job is trying to start a new run of the same analysis, this new start has to be postponed. The system will try to restart the search analytics analysis as soon as the timer job detects that the running analysis has stopped. Cause One or more of the following might be the cause: Previous analysis run has not completed. Heavy load on servers or network leads to longer analysis run time. Resolution Make sure these areas are functioning: The currently running analysis is making progress. If not, investigate why there is no progress. There might be a networking or server issue. Resolve these. Make sure that all databases are available (Link database, analytics reporting database, etc.). You might have to perform a manual stop of the analysis if it is running without any progress. Content Processing: Fallback word breaker did not load Alert Name: Content Processing: Fallback word breaker did not load Summary: The word breaker service could not load the fallback word breaker for the indicated word breaker language. Cause Installation error or another transient error that might be resolved by a content processing component restart or rebooting the server. In an on-premises setting, it could also indicate an error caused by changing the word breaker configuration. Resolution Restart the content processing component or restart the server. If a word breaker customization was tried, the logs could suggest more information to indicate a cause and resolution. Content Processing: Query classification dictionary exceeds size limit Alert Name: Content Processing: Query classification dictionary exceeds size limit Summary: The dictionary has grown too large and is exceeding the size limit. 
The current dictionary will still be used and no dictionary updates will occur until dictionary compilation succeeds again. The system will retry until the dictionary compiles again. But a reduction in data from the Term Store may be required. Cause The dictionary used for query classification rules has grown too large and is now exceeding the size limit. Resolution Check the term sets used for query classification rules. Entries may need reducing to make the dictionary compile again. The recurring retries may otherwise create a high load on the content processing component. Content Processing: Spelling dictionary exceeds size limit Alert Name: Content Processing: Spelling dictionary exceeds size limit Summary: The query spelling correction dictionary has grown too large and is now exceeding the size limit. The current dictionary will still be used and no dictionary updates will occur until dictionary compilation succeeds again. The system will retry until the dictionary compiles again. But a reduction in data in the search index or a reconfiguration of the query spelling components may be required. Cause The dynamic query spelling correction dictionary has grown too large and is exceeding the size limit. Resolution Check the size of the index. This error indicates that the index is very large and might need splitting soon. Also, consider lowering the number of dictionary terms allowed per organization by using the cmdlet Set-SPEnterpriseSearchQuerySpellingCorrection. The recurring retries may otherwise create a high load on the content processing component. Content Processing: Search Custom Dictionaries Update Alert Name: Content Processing: Search Custom Dictionaries Update Summary: The timer job "Search Custom Dictionaries Update" fails. Cause One or more of the following might be the cause: The Search service application or the term store is paused or not available. 
An error occurred during the deployment of the custom entity extraction dictionary or while establishing a connection. Resolution Check the status of the Search service application and the term store, restart if it is necessary. Check whether the custom entity extraction dictionary is deployed, and make sure that connection is present. Restart timer job if necessary. Content Processing: Spelling Dictionary Update Alert Name: Content Processing: Spelling Dictionary Update Summary: The timer job "Spelling Dictionary Update" fails. Cause One or more of the following might be the cause: The Search service application or the term store is paused or not available. An error occurs during deployment or connection establishment. Resolution Check the status of the Search service application and the term store, restart if it is necessary. Make sure that connection is present. Restart timer job if it is necessary. Content Processing: Gatherer Content Processing connector Alert Name: Content Processing: Gatherer Content Processing connector Summary: The crawler could not connect to content processing. Cause This may be caused by intermittent loss of network connectivity, or the content processing component is not available. Resolution If the problem persists, check network connectivity between the crawler and content Processing. Also, check the content processing component, and restart if it is necessary. Content Processing: flow failed to start Alert Name: Content Processing: flow failed to start Summary: The indicated processing flow is not started as expected. Cause The required services are not present, or the search index may be unavailable. Resolution Check that the required search components and databases are running and restart these if necessary. 
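For the alerts above that ask you to verify that the required search components are running, the SharePoint Management Shell can list them in one step. A minimal sketch, assuming a single Search service application and farm administrator rights:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the Search service application and list the state of each
# search component (admin, crawl, content processing, index, and so on).
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchStatus -SearchApplication $ssa |
    Select-Object Name, State
```

Components reported as Degraded or Failed are the place to start when an alert says a flow or component did not start.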
Content Processing: Query classification dictionary close to size limit Alert Name: Content Processing: Query classification dictionary close to size limit Summary: The dictionary has grown close to its size limit (at or beyond 80 percent of its size limit). Cause The number of terms in the corresponding dictionary used for query classification rules across all organizations is growing towards the limit. Resolution Check the term sets used for query classification. Content Processing: Spelling dictionary close to size limit Alert Name: Content Processing: Spelling dictionary close to size limit Summary: The query spelling correction dictionary has grown close to its size limit (at or beyond 80 percent of its size limit). Cause The number of terms relevant for the dynamic query spelling correction dictionary across all organizations is growing towards the limit. Resolution Check the number of indexed documents. If the number of indexed documents is close to the system's limit, this warning can be discarded. Crawler: Search Gatherer Host Unavailable Alert Name: Crawler: Search Gatherer Host Unavailable Summary: As part of a search crawl, the servers that contain the crawl components communicate with content servers to retrieve the items for the crawl. Network communication problems between crawl server and content server will block this crawl. When SharePoint is crawling content from a specific content server and network communication problems exist, SharePoint might display the following symptoms: The crawl of the specific content server does not progress and seems to stall. The crawl logs show no new crawled documents for the specific content server. Cause Network issues might be preventing the communication. Check that the content server is online and can connect to the server that hosts the crawl component. Resolution Check the status of the content server. 
If the content server is online and serves content to the crawl account, check for network connection issues. DocParsing: No More Parser Server Workers Alert Name: DocParsing: No More Parser Server Workers Summary: All parser server workers cannot restart. Therefore the pool has no more workers available. At this point, only format handlers that are not configured to run in a sandbox can run. Cause One or more of the following might be the cause: Broken pipe between the parser server and the content processing component. Excessive resource usage causing a time-out to obtain resources. Permission failure to any of the required directories. External change in directory permission of either the parser process or the temporary directory that it writes to. Misplaced or corrupted configuration file. Resolution Check CPU and memory usage of the content processing component. Check appropriate permissions to parserserver.exe and the related directories. Check the configuration file. DocParsing: Parser Server Worker Failed to Restart Alert Name: DocParsing: Parser Server Worker Failed to Restart Summary: A parser server worker cannot restart. Therefore it is being removed from the pool. The pool still has the indicated number of workers available. This occurs when a worker inside the parser server pool could not restart. When this happens, the number of available workers in the pool is decreased, which reduces throughput. Cause One or more of the following might be the cause: Broken pipe between the parser server and the content processing component Excessive resource usage causing a time-out to obtain resources. Permission failure to any of the required directories; External change in directory permission of either the parser process or the temporary directory that it writes to. Misplaced or corrupted configuration file. Resolution Check CPU and memory usage of the content processing component. Check appropriate permissions to parserserver.exe and the related directories. 
Check the configuration file. Index: Lost Generations Alert Name: Index: Lost Generations Summary: Data acknowledged as indexed is permanently lost. Potentially lost documents on the index system indicated, between the given generations. Cause This may be caused by the number of failures exceeding the fault tolerance. Resolution A full re-crawl may be required. Index: Missing partition Alert Name: Index: Missing partition Summary: Incomplete result set because of one or more missing partitions on the indicated index system. This may require resending of information if the related search components remain unavailable. That is, it continues to fail. Cause One or more of the following might be the cause: Missing injection from index component on the query processing component. Lost network connectivity. Index component fails. Resolution Lookup Service will restart automatically. Check index component. Check communication issues. Index: Indexing Blocked Alert Name: Index: Indexing Blocked Summary: Information flow from content processor to index is blocked; waiting for a checkpoint (time indicated). This is limiting index speed; indexing queues remain full for longer than expected. Cause One or more of the following might be the cause: Indexer not receiving enough resources. Information coming in too quickly. Resolution Investigate resource usage on the index component, and possibly adjust the search topology to lower the feeding rate. Index: Journal IO Exception Read Alert Name: Index: Journal IO Exception Read Summary: Unable to read Journal file. This indicates a potential problem with recovery or replication. Indexing is stopped. Cause Journal file is stored on disk. External cause to indexer may be: Lock on file. Changes to file. Physical disk problem. Resolution Release lock on file. Revert changes. Fix disk. This issue may require deleting the Journal and refeeding. 
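Several of the index alerts above ask you to check the index component. One way to drill in is to pull its health report from the SharePoint Management Shell. A sketch; `"IndexComponent1"` is a placeholder name, so substitute a component name reported by `Get-SPEnterpriseSearchStatus` for your topology:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa = Get-SPEnterpriseSearchServiceApplication

# "IndexComponent1" is a placeholder; use the component names reported
# by Get-SPEnterpriseSearchStatus for your own search topology.
Get-SPEnterpriseSearchStatus -SearchApplication $ssa `
    -HealthReport -Component "IndexComponent1" |
    Select-Object Name, Message, Level
```

Warning- or error-level entries in the report usually point at the disk, lock, or connectivity condition the alert describes.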
Index: Journal IO Exception Write Alert Name: Index: Journal IO Exception Write Summary: Unable to write to Journal file. This indicates a potential problem with recovery or replication. Indexing is stopped. Cause Journal file is stored on disk. External cause to indexer may be: Lock on file. Changes to file. Physical disk problems. Resolution Release lock on file. Revert changes. Fix disk. This issue may require deleting the Journal and refeeding. Index Lookup: Schema service availability query processing Alert Name: Index Lookup: Schema service availability query processing Summary: Index lookup could not work because the schema service was not available. Cause The indexer or the database, or both are unavailable. Resolution Check, and restart, the Schema service or OM, if it is necessary. Index Lookup: Missing partition Alert Name: Index Lookup: Missing partition Summary: One or more partitions are missing on the index system indicated. This means that at least one content processing component has no connection to one primary index component. Therefore the system cannot feed for that particular connection. Cause Possible causes include the following: Index component down. Missing injection from index component on the content processing component. Lost network connectivity. Resolution The service will restart automatically; check index component; check communication issues, restart if it is necessary. Query Processing: Query classification dictionary update Alert Name: Query Processing: Query classification dictionary update Summary: The timer job Query Classification Dictionary fails. Cause One or more of the following might be the cause: The Search service application is paused or not available. An error occurred during deployment or connection establishment. Resolution Check the status of the Search service application, and restart if it is necessary. Make sure that connection is present. Restart timer job if it is necessary. 
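Several of the resolutions above end with "restart the timer job if it is necessary." A hedged sketch of doing that from the SharePoint Management Shell; the wildcard filter is an assumption, because the exact display names of the dictionary update jobs vary by Search service application:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the dictionary-related timer jobs; filter with a wildcard first
# and confirm the names before running anything.
Get-SPTimerJob | Where-Object { $_.Name -like "*dictionary*" } |
    Select-Object Name, LastRunTime

# Run a job immediately instead of waiting for its schedule.
Get-SPTimerJob | Where-Object { $_.Name -like "*dictionary*" } |
    ForEach-Object { Start-SPTimerJob $_ }
```

The same `Get-SPTimerJob | Start-SPTimerJob` pattern applies to the other timer jobs named in this article.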
Query Processing: Search Service Application Availability Alert Name: Query Processing: Search Service Application Availability Summary: Query parsing could not work because the Search service application with the indicated ID is not available. The Search service application is down. Therefore service is interrupted. Cause The Search service application is not available. Resolution Check, and restart, the Search service application if it is necessary. Query Processing: Component Availability - Query Processing Alert Name: Query Processing: Component Availability - Query Processing Summary: The query processing component is stopped and is unavailable. Cause The query processing component is stopped under either expected or unexpected conditions. Resolution If the issue persists, this indicates system issues. Check and restart the query processing component, if it is necessary. Query processing: flow failed to start Alert Name: Query processing: flow failed to start Summary: The indicated processing flow is not started as expected. Cause The required services are not present, or the index may be unavailable. Resolution Check that the required search components and databases are running and restart these if necessary. Query Processing: Fallback word breaker did not load Alert Name: Query Processing: Fallback word breaker did not load Summary: The word breaker service could not load the fallback word breaker, for the indicated word breaker language. Cause Installation error or another transient error that might be resolved by a component restart or rebooting the server. In an on-premises setting, it could also indicate an error caused by changing the word breaker configuration. Resolution Restart the query processing component or restart the server. If a word breaker customization was tried, the logs could suggest more information to indicate a cause and resolution. 
Query Processing: Query Component Get Configuration Alert Name: Query Processing: Query Component Get Configuration Summary: Unable to get the configuration for the query processing component. So, the query service cannot start and is currently not working. Cause Corrupt or missing configuration. Resolution Check settings for the query processing component. Query Processing: Query Normalization Schema Service Availability Alert Name: Query Processing: Query Normalization Schema Service Availability Summary: Query normalization could not work for the particular query because the schema service was not available. Query processing fails for particular queries. Cause The indexer is unavailable, or the search administration database is down, or both. Resolution Check and restart the schema service or OM. Query Processing: Query Parsing Schema Service Availability Alert Name: Query Processing: Query Parsing Schema Service Availability Summary: Query parsing could not work for the particular query because the schema service related to the indicated Search service application was not available. Query parsing fails for particular queries. Cause The indexer is unavailable, or the search administration database is down, or both. The query processing component has restarted and cannot access any index component. Check index components. Indexers are available, but cannot provide the schema to query processing. Resolution Check and restart the schema service or OM. Query Processing: QueryParsing Scope Cache Availability Alert Name: Query Processing: QueryParsing Scope Cache Availability Summary: Query parsing could not work for the particular query because the query rules for the indicated Search service application weren't available. Cause Query rules are not available. Resolution Check the query rules. 
Query Service: Service availability query processing Alert Name: Query Service: Service availability query processing Summary: The query service that has the indicated service URI has stopped. Cause Query processing component is stopped under either expected or unexpected conditions. That leads to the shutdown of the WCF service it exposes. Resolution If the issue persists, check and restart the query processing component. Query Service: Start Service Availability - Query Processing Alert Name: Query Service: Start Service Availability - Query Processing Summary: Unable to start the query service that has the indicated service URI because the query service is not working correctly. Cause Potential network communication service issues. For example, the WCF port is in use. Resolution Restart the query processing component, the query service or WCF services if the WCF port is in use and has to be released. Query Service: Unable to stop query processing Alert Name: Query Service: Unable to stop query processing Summary: Unable to stop the query service that has the indicated service URI. Cause Potential network communication service issues. For example, the WCF port is in use. Resolution Restart the query processing component, the query Service or WCF services if the WCF port is in use and has to be released. Query URL Mapping: Alternate URL Mapping Service Availability - Query Processing Alert Name: Query URL Mapping: Alternate URL Mapping Service Availability - Query Processing Summary: The alternate URL mapping did not work because the mapping service was not available. Query or results processing fails. Cause The Alternate Access Mapping OM is unavailable. Resolution Check, and restart, the Alternate Access Mapping service or OM if it is necessary. 
Schema Reader: Schema Service Availability - Query Processing Alert Name: Schema Reader: Schema Service Availability - Query Processing Summary: The schema reader cannot work for the particular query because the schema service is not available. Query processing fails for particular queries. Cause The Indexer is unavailable, or the search administration database is down, or both. Resolution Check and restart the schema service or OM if it is necessary. Search Admin Platform Services: Repository Initialization Failed Alert Name: Search Admin Platform Services: Repository Initialization Failed Summary: The PackageManager service in each component will read the Default-manifest.txt on component startup to find the current status of the repository quickly. While running, the PackageManager in each component obtains updates about installs and uninstalls from the host controller. In this case, the repository could not be initialized. This means that there was a failure when the repository was initializing on host controller startup. For example, during a primary failover or during a repository reset. Cause The cause could be an ACL issue or a corrupted repository manifest file. If it is a corrupted repository with a good manifest file, initialization will try to rebuild the repository again. However, the same event is also sent when rebuild fails. Resolution Check for potential ACL issues. Follow this procedure for fixing the repository on secondary (not primary) hosts: Stop the host controller. Delete the directory C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Applications\Search\Repository. Start the host controller. This will cause the host controller to create an empty repository with a GUID that does not match the primary. So, the secondary will immediately start to replicate the contents of the primary. 
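The manual repository rebuild described above can be scripted on a secondary host. A sketch, assuming the Windows service name `SPSearchHostController` for the Search Host Controller and the default repository path quoted in the resolution:

```powershell
# Run on a SECONDARY (not primary) host, as described in the resolution.
# "SPSearchHostController" is assumed to be the service name of the
# SharePoint Search Host Controller on this server.
Stop-Service -Name SPSearchHostController

# Delete the local repository; on restart the host controller creates an
# empty repository and replicates the contents from the primary.
Remove-Item -Recurse -Force `
    "C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Applications\Search\Repository"

Start-Service -Name SPSearchHostController
```

Do not run this on the primary host; the rebuild relies on the primary's repository remaining intact to replicate from.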
Search Admin Platform Services: Repository Installation Failed Alert Name: Search Admin Platform Services: Repository Installation Failed Summary: Repository installation fails for the file indicated. When dictionary installation fails for the primary, users should be aware that linguistic features such as query suggestions might get outdated. Cause The file may already be installed in the repository, or the related timer job may have failed. Resolution Check the repository. Make sure that a dictionary compilation timer job has completed successfully after the event. You can manually trigger the timer job from Central Administration or by using Microsoft PowerShell in order to re-install. Search Admin Platform Services: Repository Replication Alert Name: Search Admin Platform Services: Repository Replication Summary: This situation occurs in secondary repositories. The repository in the local host controller has found that it has a lower repository version than the primary host controller (or lower version than other secondary host controllers if the primary is down). When this happens, the local repository attempts to obtain the latest version from the primary or a secondary based on how it is triggered. This indicates that something went wrong during the replication. Cause Connection to the remote repository which is being replicated can be lost due to a network issue or due to a remote repository (host controller) issue during replication. The local repository can be corrupted or have an ACL issue. Resolution If the problem persists, manual intervention might be required to correct it based on the cause (see above). Apart from the obvious network and ACL resolutions, the remote host controller can be restarted. If the repository is corrupted, usually it is automatically rebuilt. Search Analytics: analysis run state Search Analytics Alert Name: Search Analytics: analysis run state Search Analytics Summary: An analysis can encounter different problems. 
Hence the system cannot guarantee a successful run of search analytics analysis. The timer job, following the analysis failure model, will try to re-schedule a new analysis run so that it does not have to wait until the daily schedule is triggered. Symptoms: The monitor is triggered when the timer job detects an unsuccessful analysis run. Cause One or more of the following might be the cause: Analytics processing component(s) or some other farm components are not available. Databases (Link database, analytics reporting database, etc.) are not running or reachable. Network outage. Server issues. Resolution Make sure these areas are functioning: The analytics processing component(s) and other farm components are functioning correctly. There are no networking issues. There are no server issues. The analytics databases are available and that the user can read or write to the databases. Search analytics: Timer job cannot resolve Analytics Processing Engine (APE) Alert Name: Search analytics: Timer job cannot resolve Analytics Processing Engine (APE) Summary: After the timer job has successfully found the Search service application, it will try to resolve the Analysis Processing Engine in the analytics processing component. This component is needed to be able to read or write the analysis configuration and to start, stop, suspend, or resume the analysis. Symptoms: If the system cannot locate or communicate with the Analysis Processing Engine, it will be unable to start a new run of the analysis. Cause One or more of the following might be the cause: Unable to connect to Search service application. Unable to connect to the search administration component. Unable to connect to System Manager. Unable to connect to the Analysis Processing Engine. Resolution Ensure that the following components are up and running: Search service application. Search administration component. System Manager. Analysis Processing component. 
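To work through the resolution checklist above, a quick availability check from the SharePoint Management Shell might look like this; the `Admin*` and `Analytics*` name filters are assumptions about how the components are named in your topology, so widen or adjust them as needed:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Confirm the Search service application itself is online...
Get-SPEnterpriseSearchServiceApplication |
    Select-Object Name, Status

# ...and that the admin and analytics processing components are Active.
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchStatus -SearchApplication $ssa |
    Where-Object { $_.Name -like "Admin*" -or $_.Name -like "Analytics*" } |
    Select-Object Name, State
```

If any of these components is missing or not Active, resolve that before investigating the timer job itself.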
## Search Analytics: Timer job cannot resolve Link database

**Alert Name:** Search Analytics: Timer job cannot resolve Link database

**Summary:** The timer job checks that all Link database partitions are available before it tries to start the search analytics analysis.

**Symptoms:** If the Link database partitions are unavailable when it is time to start a new analysis run, the start attempt is aborted. New Link database checks, followed by a new start attempt, are performed in the next timer job polls (every 10 minutes).

### Cause

One or more of the following might be the cause:

- Unable to resolve the Search service application.
- Link database connection timed out.
- Link database connection is refused.
- Link database is not available.

### Resolution

Verify that the Link database is available and reachable. Also verify that the user has access to read from the Link database.

## Search Analytics: analysis run state search analytics

**Alert Name:** Search Analytics: analysis run state search analytics

**Summary:** The search analytics analysis is started by its timer job. The analysis runs until all its tasks are completed, and the analysis state is then set to Stopped. If some tasks cannot be completed, the analysis run is stopped (after some retries) and the state is set to Failed. "Stopped before completion" indicates that the analysis has reached the Stopped state, but all analysis tasks have NOT completed. Some of the work done by the analysis might already have been sent to the index, but to make sure nothing is lost, all of the work is processed again during the next analysis run.

### Cause

The analysis is stopped for one of the following reasons:

- A manual stop of a running search analytics analysis was performed.
- System maintenance tasks require stopping a running search analytics analysis.

### Resolution

Check the state of the search analytics analysis and any active system maintenance tasks.
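The Link database availability checks described for these search analytics alerts can also be run from the SharePoint Management Shell. The sketch below is a hedged example: it assumes the SharePoint PowerShell snap-in is available on a farm server and that the farm has a single Search service application.

```powershell
# Load the SharePoint snap-in when running from a plain PowerShell session.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Resolve the Search service application, then list its Link databases
# so you can confirm each partition is provisioned and reachable.
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchLinksDatabase -SearchApplication $ssa |
    Select-Object Name
```

If the cmdlet returns an error or an empty list, verify database availability in SQL Server before the next timer job poll retries the analysis start.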
## Search Gatherer: Disk Full Crawler

**Alert Name:** Search Gatherer: Disk Full Crawler

**Summary:** As the crawler crawls content, it creates files in a temporary location, which can grow over time. The disk that holds this temporary location is running out of space. SharePoint experiences the following symptoms when this happens:

- The crawl does not progress and seems to stall.
- The crawl logs show no new crawled documents.

### Cause

The disk on which the crawler creates temporary files is running out of space.

### Resolution

Free up space on the disk on which the crawler creates the temporary files. To free up disk space:

1. Use Disk Cleanup to delete temporary files on the drive where the temporary index files are stored. Use the procedure following this one to determine the location of the index files.

   > [!NOTE]
   > If the temporary files are located on a drive other than the operating system drive (drive C), you must restart the search service to test the performance of the crawler after you delete the temporary files.

2. After Disk Cleanup is complete, test the performance of the crawler. If the crawler is not crawling content, delete unnecessary files and folders on the selected drive.

3. If you cannot clear any disk space, restart the search service. Restarting the search service re-creates the Temp directory for crawled files.

   1. In a Command Prompt window, run the command `net stop osearch15` to stop the search service.
   2. Run the command `net start osearch15` to restart the search service.

## Search Usage Analytics: Analysis configuration failed

**Alert Name:** Search Usage Analytics: Analysis configuration failed

**Summary:** If the system cannot get or set the analysis configuration, it does not schedule a new run of the analysis. The timer job will retry later. The analysis might also run with the wrong event configuration:

- Wrong recommendations weight.
- Wrong event rate weight.

When the deletion scopes are fetched, any errors from the API will result in an event.
The analysis can still run without this information. However, the quality or correctness of the result will deteriorate over time.

### Cause

One or more of the following might be the cause:

- The analytics processing component is not available, or is throwing exceptions.
- The analytics reporting database is down.
- Network issues causing API calls to fail, or the API returning time-outs or errors.
- Errors in configuration, or the wrong event configuration being returned from the API.
- No configuration data is present in the system.

### Resolution

Make sure these areas are functioning:

- There is a Search service application in the system.
- The analytics processing component is functioning correctly, and the component where the API is running is up.
- There are no networking issues.
- The configuration is correct.
- The search analytics analysis has run within the last 30 days.

## Search Usage Analytics: Analysis failed to start

**Alert Name:** Search Usage Analytics: Analysis failed to start

**Summary:** The usage analytics analysis cannot start, or it fails during execution.

### Cause

One or more of the following might be the cause:

- External dependencies, such as configuration, the analytics processing component, Search service application dependencies, or networking issues.
- Failures during execution in components that usage analytics depends on, such as the analytics processing component, or other networking issues.
- Something in the events being processed is causing the analysis to fail.

### Resolution

Verify other System Center Operations Manager monitors and see whether any parent errors are present. Then verify the following:

- The Search service application is up.
- The analytics processing and Link databases are running and in a healthy state.
- There is no problem with the analytics processing component.
- There is no problem with disk space.
- There is no network connectivity issue.
- The usage analytics timer job is running.
- The search analytics timer job is running.
- The status of the search analytics analysis is healthy.

## Search Usage Analytics: Feeding failure

**Alert Name:** Search Usage Analytics: Feeding failure

**Summary:** The usage analytics feeding required for the analysis was unsuccessful. This means that the recommendations and statistics provided by usage analytics will be old or missing, and any recent analysis changes will not be visible.

### Cause

One or more of the following might be the cause:

- The indexer is returning errors.
- The content processing component is not working.
- The component that the indexer is running on could be down.

### Resolution

Verify that the index system is functioning correctly. Check for network errors. Make sure there are no errors on the analytics processing component.

## Search Usage Analytics: Reporting API write failure

**Alert Name:** Search Usage Analytics: Reporting API write failure

**Summary:** The information written to the reporting API or database is used if the cache files must be rebuilt. When usage analytics cannot write to the reporting API, the recommendations and data aren't updated. Counts presented to the user are therefore out of date, and old or wrong data will also be presented the next time the cache is rebuilt. If the Reporting API cannot be read, no cache files can be created and the analysis will not run; running the analysis in this state would in effect overwrite the current data.

**Impact:** There will be no usage analytics analysis run. Recommendations will not be updated for items in the system.

### Cause

One or more of the following might be the cause:

- The Reporting API is returning an error saying the database is down.
- The Reporting API is responding that the wrong data is being sent (nothing is stored), or is returning empty (wrong) data.
- The Reporting API cannot be reached (networking issues).
- The cache dumper is malfunctioning.

### Resolution

Verify that there are no database problems in the system, and make sure that the analytics reporting database is functioning. Verify that the analytics processing component is functioning, which means that analyses can run. Check for any networking issues.

## Search Usage Analytics: Store not available

**Alert Name:** Search Usage Analytics: Store not available

**Summary:** Usage analytics cannot communicate with the event store. The usage analysis will not be started, and usage analytics data in the analytics reporting database and the search index will be stale. During the first step of the usage analysis, the system fetches the event data from the event store. This is the raw input to the analysis, and it contains the actions of the users in the system. Without this data, the analysis cannot be run.

### Cause

One or more of the following might be the cause:

- Network outage, so that the event store cannot be reached.
- The Event Store process is down.
- The event store is returning errors in response to requests.
- The event store is returning no data because the components registering events are not functioning correctly.

### Resolution

Ensure that the event store is up, running, and functioning correctly. Check for networking errors. Make sure the databases in the system are running.

## Search Usage Analytics: Usage analytics APE not available

**Alert Name:** Search Usage Analytics: Usage analytics APE not available

**Summary:** The system fails to retrieve the Search service application, which means that no new analysis can be run and recommendations cannot be updated.
The system may also fail to retrieve the Analysis Engine Service, so that no new analysis can be started.

### Cause

Inability to connect to the Search service application may be caused by one or more of the following:

- Network issues.
- The Search service application has not been created.
- User permissions mismatch.

Inability to connect to the Analysis Engine Service can be caused by one or more of the following:

- Unable to connect to the Search service application.
- Unable to connect to the search administration component.
- Unable to connect to System Manager.
- Unable to connect to the analytics processing component.
- Network problems.
- User rights or WCF errors.

### Resolution

Check the following areas:

- A Search service application exists.
- There are no network problems on the host.
- There are no user permission or WCF errors.

Ensure that the following components are up and running:

- Search service application.
- Search administration component.
- System Manager.
- Analysis processing component.

## Usage table exceeded max bytes limit

**Alert Name:** Usage table exceeded max bytes limit

**Summary:** This alert reports that a usage table exceeds its quota. Usage data is stored in various tables, and each table has its own quota. Once a table is full, new data cannot be inserted.

### Cause

Too much usage data is generated in one day.

### Resolution

There are two ways to resolve this issue:

- Extend the table quota defined in the "configurations" table.
- Disable the usage provider to stop generating data.
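If you choose the second option and stop generating data for an over-quota usage table, the usage provider (usage definition) can be disabled from PowerShell. This is a hedged sketch: the definition name `Page Requests` is only a placeholder example — list the definitions first and substitute the provider that feeds the full table.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# List all usage definitions and whether they are currently enabled.
Get-SPUsageDefinition | Select-Object Name, Enabled

# Disable the provider that is generating too much data ("Page Requests"
# is a placeholder; use the name reported by the previous command).
Set-SPUsageDefinition -Identity "Page Requests" -Enable:$false
```

Re-enable the provider with `-Enable:$true` once the quota has been extended or the underlying data volume has been addressed.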
## Services Host Controller

**Alert Name:** Services Host Controller

**Summary:** The host controller service is currently unavailable.

### Cause

This may indicate an issue internal to the host controller.

### Resolution

If the problem is persistent or recurring, check the host controller and restart it if necessary.

## See also

### Concepts

Plan for monitoring in SharePoint Server

### Other resources

System Center Monitoring Pack for SharePoint Foundation

System Center Monitoring Pack for SharePoint Server 2013

System Center Monitoring Pack for SharePoint Server 2016
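When the host controller alert above persists, the service can be inspected and restarted from PowerShell on the affected server. This is a hedged sketch: `SPSearchHostController` is the usual Windows service name for the SharePoint Search Host Controller, but verify it with the first command before restarting.

```powershell
# Inspect the Search Host Controller service on this server.
Get-Service -Name SPSearchHostController

# Restart it if it is stopped or unresponsive.
Restart-Service -Name SPSearchHostController
```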
---
ms.date: 3/13/2018
title: "The Machine Translation Service is not running when it should be running (SharePoint Server)"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: troubleshooting
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: f140e0ad-07e4-42f8-a198-54d800355698
description: "Learn how to resolve the SharePoint Health Analyzer rule: The Machine Translation Service is not running when it should be running, for SharePoint Server."
---

# The Machine Translation Service is not running when it should be running (SharePoint Server)

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

**Rule Name:** The Machine Translation Service is not running when it should be running.

**Summary:** The Machine Translation Service batch mode uses a timer job to pull translation items from the Machine Translation Service database and then assign those translation items to individual application servers. If the timer job doesn't run, items can't be translated.

**Cause:** The Machine Translation Service timer job isn't enabled.

**Resolution: Enable the Machine Translation Service timer job.**

1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
2. In Central Administration, click Monitoring.
3. On the Job Definitions page, in the list of timer jobs, click Machine Translation Service Timer Job.
4. On the Edit Timer Job page, in the Recurring Schedule section, specify when you want the timer job to run, and then click Enable. The default is every 15 minutes.

## See also

### Concepts

Default timer jobs in SharePoint Server 2019

### Other resources

Default timer jobs in SharePoint Server 2016

Default timer jobs in SharePoint 2013
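The resolution above can also be performed from the SharePoint Management Shell. This is a hedged sketch: the display-name filter is an assumption, since the exact job name can vary by version — confirm it with the first commands before enabling.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the Machine Translation Service timer job by display name.
$job = Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Machine Translation*" }
$job | Select-Object DisplayName, IsDisabled, Schedule

# Enable the timer job if it is disabled.
$job | Enable-SPTimerJob
```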
---
title: "The Visio Graphics Service has a minimum cache age setting that may cause a security issue (SharePoint Server)"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 1252017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: troubleshooting
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: 4fcd074b-32b1-49b3-9910-5bb174894603
description: "Learn how to resolve the SharePoint Health Analyzer rule: The Visio Graphics Service has a minimum cache age setting that may cause a security issue, for SharePoint Server."
---

# The Visio Graphics Service has a minimum cache age setting that may cause a security issue (SharePoint Server)

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

**Rule Name:** The Visio Graphics Service has a minimum cache age setting that may cause a security issue.

**Summary:** Setting Minimum Cache Age to 0 minutes may leave the Visio Graphics Service open to a denial of service (DoS) attack. A value of 0 for this setting might lead to a large processor and network load on the Visio Graphics Service and SharePoint Server, decreasing the expected performance of both. However, increasing this value means that users will not see their data-connected diagrams refresh as frequently.

**Cause:** The Minimum Cache Age setting was set to 0 minutes.

**Resolution: Increase the value of the Minimum Cache Age setting.**

1. Verify that the user account that is performing this procedure is an administrator of the Visio Graphics Service service application.
2. In Central Administration, on the Home page, in the Application Management section, click Manage service applications.
3. On the Service Applications page, click the Visio Graphics service application.
4. On the Manage the Visio Graphics Service page, click Global Settings.
5. Ensure that the settings have the values that are listed in the following table. If they do not, type the value in the corresponding text box and click OK.

|Setting|Value|
|:-----|:-----|
|Maximum Web Drawing Size|\<= 25 (Megabytes)|
|Minimum Cache Age|>= 5 (Minutes)|
|Maximum Cache Age|\<= 60 (Minutes)|
|Maximum Recalc Duration|\<= 60 (Seconds)|
|Maximum Cache Size|>= 5120 (Megabytes)|
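The same settings can also be applied from PowerShell with `Set-SPVisioPerformance`. The sketch below is hedged: the parameter names are from the SharePoint Visio Services cmdlets as best understood, and the service application name `Visio Graphics Service` is a placeholder — verify both in your farm before running.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Apply the recommended Visio Graphics Service performance settings.
# "Visio Graphics Service" is a placeholder service application name.
Set-SPVisioPerformance -VisioServiceApplication "Visio Graphics Service" `
    -MaxDiagramSize 25 `
    -MinDiagramCacheAge 5 `
    -MaxDiagramCacheAge 60 `
    -MaxRecalcDuration 60 `
    -MaxCacheSize 5120
```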
---
title: "Web.config files are not identical on all machines in the farm (SharePoint Server)"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 8/31/2017
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: troubleshooting
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: c2f1d2a8-dc6a-471c-b373-be420b460306
description: "Learn how to resolve the SharePoint Health Analyzer rule: Web.config files are not identical on all machines in the farm, for SharePoint Server."
---

# Web.config files are not identical on all machines in the farm (SharePoint Server)

[!INCLUDEappliesto-2013-2016-2019-SUB-xxx-md]

**Rule Name:** Web.config files are not identical on all machines in the farm.

**Summary:** If you have multiple front-end web servers in the farm and have made manual changes to the Web.config files, you can experience a problem where a front-end web server cannot read session state that was saved by another server in the farm.

**Cause:** The Web.config files on the front-end web servers in the farm are not identical.

**Resolution: Ensure that the Web.config files are identical on all front-end web servers in the farm.**

1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
2. Identify the server on which this event occurs. On the SharePoint Central Administration website, in the Monitoring section, click Review problems and solutions, and then find the name of the server in the Failing Servers column. If there are multiple failing servers in a server farm, you must repeat the following steps on each failing server.
3. Verify that the user account that is performing the following steps is a member of the Administrators group on the local computer that you identified in the previous step.
4. Log on to the server on which this event occurs. Typically the Web.config file is stored at C:\inetpub\wwwroot\wss\VirtualDirectories\Port_Number.
5. Note the modified date of the Web.config file.
6. Repeat the previous steps on the other failing servers.
7. Compare these Web.config files and decide which one is correct. To view the content of a Web.config file, do the following:
   1. In Server Manager, click Tools, and then click Internet Information Services (IIS) Manager.
   2. In the Internet Information Services management console, in the Connections pane, expand the tree view of the server name, expand Sites, and then click the site for which you want to view the settings of the Web.config file.
   3. On the site Home page, switch to the Features View, and then in the Management section, double-click Configuration Editor.
   4. In the Section list, select a section to view the settings of the Web.config file.
8. Delete the incorrect Web.config file on each failing server, and then copy and paste the correct Web.config file.

By default, the Repair Automatically option is enabled for this rule. You can restore the default setting for this rule by doing the following:

**Restore the default setting**

1. In Central Administration, click Monitoring.
2. On the Monitoring page, in the Health Analyzer section, click Review rule definitions.
3. On the Health Analyzer Rule Definitions - All Rules page, in the Category: Configuration section, click the name of the rule.
4. In the Health Analyzer Rule Definitions dialog, click Edit Item.
5. Select the Repair Automatically check box, and then click Save.
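A quick way to spot the differing Web.config files across front-end web servers is to hash them. The sketch below is hedged: the server names and virtual directory path are placeholders, and it assumes the remote administrative shares (`\\server\c$`) are reachable from where you run it.

```powershell
# Placeholder server names and port; adjust for your farm.
$servers = "WFE1", "WFE2", "WFE3"
$relativePath = 'c$\inetpub\wwwroot\wss\VirtualDirectories\80\web.config'

# Hash each server's Web.config; differing hashes mean differing files.
foreach ($server in $servers) {
    $path = "\\$server\$relativePath"
    $hash = (Get-FileHash -Path $path -Algorithm SHA256).Hash
    Write-Host "$server : $hash"
}
```

Servers that report a different hash from the majority are the ones whose Web.config likely needs to be replaced.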
---
title: "Create the SharePoint Server 2016 farm for a database attach upgrade"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 12/30/2016
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: 99684f0b-f617-4c10-8420-d4d0adea3687
description: "Create and configure a SharePoint Server 2016 farm so that you can upgrade databases from SharePoint 2013."
---

# Create the SharePoint Server 2016 farm for a database attach upgrade

[!INCLUDEappliesto-xxx-2016-xxx-xxx-xxx-md]

When you upgrade from SharePoint Server 2013 to SharePoint Server 2016, you must use a database attach upgrade, which means that you upgrade only the content for your environment and not the configuration settings. Before you can upgrade the content, you must configure a new server or server farm by using SharePoint Server 2016. This article lists the items that you have to configure when you create that new environment.

**Phase 1 of the upgrade process: Create the SharePoint Server 2016 farm**

This is the first phase in the process to upgrade SharePoint Server 2013 data and sites to SharePoint Server 2016. The process includes the following phases, which must be completed in order:

1. Create the SharePoint Server 2016 farm for a database attach upgrade (this phase)
2. Copy databases to the new farm for upgrade to SharePoint Server 2016
3. Upgrade service applications to SharePoint Server 2016
4. Upgrade content databases to SharePoint Server 2016

For an overview of the whole process, see Overview of the upgrade process to SharePoint Server 2016.

## Before you begin

Before you create the SharePoint Server 2016 farm, review the following information and take any recommended actions.
- Make sure that the hardware and software that you are using meet the requirements in Hardware and software requirements for SharePoint Server 2016.
- Make sure that you have appropriately planned your logical and physical architecture to support the features and functionality that you want in the SharePoint Server 2016 farm.
- Make sure that you have planned for sufficient performance and capacity for the SharePoint Server 2016 farm.
- Ensure that you are prepared to set up the required accounts with the appropriate permissions. For detailed information, see Initial deployment administrative and service accounts in SharePoint Server.

## Collect information and settings

> [!IMPORTANT]
> This section explains how to configure service applications, except for the Business Data Connectivity service application, which applies to SharePoint Server 2016.

Before you start to upgrade, you must collect information and settings about your existing environment. You have to know what is in your SharePoint Server 2013 environment before you can start to build your SharePoint Server 2016 environment. Gather information such as the following:

- Alternate access mappings
- Authentication providers and authentication modes that are being used
- Quota templates
- Managed paths
- Self-service site management settings
- Incoming and outgoing e-mail settings
- Customizations

You also have to turn off or remove services or components in the SharePoint Server 2013 with Service Pack 1 (SP1) environment that could cause errors in the upgrade process. The following services or components should be removed or stopped before you back up your databases:

- **PowerPoint Broadcast Sites** Office Online Server has become a separate server product that can serve multiple SharePoint farms for viewing and editing documents. Because of this change, PowerPoint Broadcast sites cannot be upgraded to SharePoint Server 2016.
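Some of the settings listed above can be inventoried from the SharePoint 2013 farm with PowerShell before you build the new environment. This is a hedged sketch using standard SharePoint cmdlets; run it on a SharePoint 2013 farm server.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Alternate access mappings.
Get-SPAlternateURL | Select-Object IncomingUrl, Zone, PublicUrl

# Web applications and their managed paths.
foreach ($webApp in Get-SPWebApplication) {
    Write-Host "Web application:" $webApp.Url
    Get-SPManagedPath -WebApplication $webApp | Select-Object Name, Type
}
```

Capture the output to a file so that you can reapply the same settings in the SharePoint Server 2016 farm.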
## Record the passphrase for the Secure Store service application

The Secure Store service application uses a passphrase to encrypt information. You have to know this passphrase so that you can use it in the new environment. Otherwise, you will not have access to the information in the Secure Store. If you do not know the passphrase, you can refresh the key, and then back up the Secure Store database. For more information, see Work with encryption keys in Configure the Secure Store Service in SharePoint 2013.

## Install SharePoint Server 2016 in a new environment

Before you can upgrade your databases, you must use SharePoint Server 2016 to configure a new server or server farm. The first step in creating your new environment is to install SharePoint Server 2016 and configure your new server or server farm. You must do the following:

1. Run the Microsoft SharePoint Products Preparation Tool to install all required software.
2. Run Setup to install the product.
3. Install all language packs that you want in your environment.

   > [!NOTE]
   > For more information about how to install available language packs, see Install or uninstall language packs for SharePoint Server 2016.

4. Run the SharePoint Products Configuration Wizard to configure your server or servers.

> [!IMPORTANT]
> Some service applications can be upgraded by using a service application database upgrade. If you want to upgrade these service applications by upgrading the service application databases, do not use the Farm Configuration Wizard to configure these service applications when you set up your new farm.

For step-by-step instructions for these tasks, see Install SharePoint Server 2016.

## Configure service applications

You must create the service applications on your new farm before you upgrade your content databases. Some service applications can be upgraded from SharePoint Server 2013 to SharePoint Server 2016.
The steps in Install SharePoint Server 2016 describe how to use the Farm Configuration Wizard to enable all service applications. However, you should not use the Farm Configuration Wizard to enable the service applications that you want to upgrade. The following service applications can be upgraded by performing a service database upgrade:

- Business Data Connectivity service
- Managed Metadata service
- PerformancePoint services
- Search
- Secure Store service
- User Profile service

For an overview of how to upgrade these service applications, see Services upgrade overview for SharePoint Server 2016. For the specific steps to upgrade these service application databases, see Upgrade service applications to SharePoint Server 2016.

## Configure farm settings

The next step in creating the new environment is to apply general farm settings. You must manually reapply configuration settings from your SharePoint Server 2013 farm, such as the following:

- Incoming and outgoing e-mail settings
- All farm-level security and permission settings, such as adding user or group accounts to the Farm Administrators group
- Blocked file types

And you must configure all new farm-level settings that you want to use, such as the following:

- Usage and health data collection
- Diagnostic logging
- Settings and schedules for timer jobs

> [!IMPORTANT]
> If you had disabled the Workflow Auto Cleanup timer job in your SharePoint Server 2013 environment, make sure that you also disable this timer job in your new environment. If this timer job is enabled in the new environment and disabled in the SharePoint Server 2013 environment, you might lose workflow associations when you upgrade.

In a standard installation, the next step would be to create web applications. However, for an upgrade, you create web applications later in the process, after you upgrade the service application databases. For more information, see Create web applications.
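If you need to disable the Workflow Auto Cleanup timer job in the new farm to match the old one, this can be done from PowerShell. This is a hedged sketch: the display-name filter is an assumption, so confirm the matching jobs with the first command before disabling them.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Confirm which timer jobs match "Workflow Auto Cleanup".
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Workflow Auto Cleanup*" } |
    Select-Object Name, DisplayName, IsDisabled

# Disable each matching job instance (one per web application).
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Workflow Auto Cleanup*" } |
    Disable-SPTimerJob
```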
This is the first phase in the process to upgrade SharePoint Server 2013 data and sites to SharePoint Server 2016.

**Next phase:** Copy databases to the new farm for upgrade to SharePoint Server 2016

For an overview of the whole process, see Overview of the upgrade process to SharePoint Server 2016.
---
title: "Software updates overview for SharePoint Server 2013"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
ms.date: 2/22/2018
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: article
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
ms.assetid: 476d6a49-7263-4460-8e4c-28102fec1442
description: "Learn how to deploy a software update to a SharePoint farm."
---

# Software updates overview for SharePoint Server 2013

[!INCLUDEappliesto-2013-xxx-xxx-xxx-xxx-md]

Administrators update SharePoint 2013 to deploy or update assemblies that provide functionality and to upgrade databases. A successful update follows a methodical approach that minimizes interruptions in service. Review the information in this article to learn about the process before you begin the update process.

## Before you begin

Before you begin the software update process, review the following information about permissions, hardware requirements, and software requirements:

- Account permissions and security settings in SharePoint 2013
- Hardware and software requirements for SharePoint 2013

## Terminology

To understand how to implement software updates in SharePoint 2013, it is important to understand the terminology for the core components.

|Term|Definition|Comment|
|:-----|:-----|:-----|
|Cumulative Update (CU)|A CU is a rollup update that contains all previous critical on-demand hotfixes to date.|Additionally, a CU contains fixes for issues that meet the hotfix acceptance criteria. These criteria may include the availability of a workaround, the effect on the customer, the reproducibility of the problem, the complexity of the code that must be changed, or other reasons.|
|patch|A compiled, executable installer file that contains updates to one or more products.|Examples of packages are the executable (.exe) files that you download to install a service pack, cumulative update (CU), or hotfix. Packages are also known as MSI files.|
|software update|A software update is any update, update rollup, service pack, feature pack, critical update, security update, or hotfix that is used to improve or to fix a software product that is released by Microsoft Corporation.||
|upgrade|The process by which you change an environment to use a newer version of software. You can upgrade to a minor release, such as an update or patch, or to a major release. An upgrade to a minor release is called a build-to-build upgrade. An upgrade to a major release is called a version-to-version upgrade.|In SharePoint 2013, you can use either in-place or database-attach methods for build-to-build upgrades. For version-to-version upgrades, only database-attach is supported. For more information about version-to-version upgrade, see Overview of the upgrade process from SharePoint 2010 to SharePoint 2013. For an overview of the steps for in-place and database-attach upgrade for build-to-build upgrades, see Install a software update (SharePoint 2013).|

For a complete list of terminology about software updates, see Description of the standard terminology that is used to describe Microsoft software updates.

## Features

SharePoint 2013 has features that facilitate the end-to-end software update experience. Some of these features are as follows:

- Backward compatibility between an updated services farm and a non-updated content farm.
- Full support for automatic updates that use Windows Server Update Services (WSUS), Windows Update, and Microsoft Update.

  > [!NOTE]
  > An automatic update copies the binary files to the farm servers, but you must complete the software update by running the upgrade on the servers.

- Administrators can use the SharePoint Central Administration website or Microsoft PowerShell to monitor the status of an update.

## Intended audience and scope

Information in this article is for all IT professionals who maintain SharePoint 2013.
However, specific instructions to install a software update are intended for IT professionals who have to deploy software updates on a farm of servers that host SharePoint 2013.

Information in this article applies to the following products:

- SharePoint 2013
- SharePoint 2013 language pack
- Microsoft Filter Pack

> [!NOTE]
> The process that installs software updates in stand-alone environments of SharePoint 2013 is simpler than the process for a server farm and does not require all the steps that a server farm requires.

## Software update process

The process that deploys updates in a SharePoint 2013 environment has two phases: patching and build-to-build upgrade. Each phase has specific steps and results. It is possible to postpone the build-to-build upgrade phase.

> [!CAUTION]
> Although we try to ensure the highest level of backward compatibility, the longer you run in such a state, the greater the chance of encountering farm behavioral issues.

### Patch phase

The patch phase has two steps: the patch deployment step and the binaries deployment step. During the patch deployment step, new binary files are copied to the server running SharePoint 2013. Services that use files that the patch has to replace are temporarily stopped. Stopping services reduces the requirement to restart the server to replace files that are being used. However, in some instances you have to restart the server.

The second step in the patch phase is the binaries deployment step. In this step, the installer copies support dynamic link library (.dll) files to the appropriate directories on the server that is running SharePoint 2013. This step ensures that all the web applications are running the correct version of the binary files and will function correctly after the update is installed. The patch phase is complete after the binaries deployment step.
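After the binaries are deployed, the build-to-build upgrade phase described next is typically completed by running the SharePoint Products Configuration Wizard from the command line. As a hedged sketch (this is the standard PSConfig invocation for a build-to-build upgrade, but verify the options against your farm's documentation before running it):

```powershell
# Run on each farm server after the patch binaries are installed.
# 'b2b' requests a build-to-build upgrade; -wait keeps the console
# attached until the configuration task finishes.
PSConfig.exe -cmd upgrade -inplace b2b -wait
```

Running this on one server starts the database schema upgrade; the remaining servers then run the same command to bring their local configuration up to the new build.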
The next and final phase to deploy software updates is the build-to-build upgrade phase. This phase modifies database schemas, updates objects in the farm, and updates site collections.

### Build-to-build upgrade phase

After you finish the patch phase, you must complete the update installation by starting the build-to-build upgrade phase. The build-to-build upgrade phase is task intensive and, therefore, takes the most time to finish. The first action is to upgrade all the SharePoint processes that are running. After you upgrade the processes, the databases are crawled and upgraded. After you complete a farm upgrade on one server, you have to complete the process on all other servers to maintain compatibility.

## Software update strategy

The update strategy that you select is based primarily on one of the following factors:

- The amount of downtime that is acceptable to install the update.
- The additional staff and computing resources that are available to reduce downtime.

As you determine your update strategy, consider how the strategy enables you to manage and control the update. In terms of downtime reduction, the following options, ordered from most to least downtime, are available:

- Install the update and do not postpone the upgrade phase.
- Install the update and postpone the upgrade phase.

## Software update deployment cycle

The cycle that is used for upgrading SharePoint 2013 farms and servers also applies to deploying software updates, which are a subset of an upgrade phase. We recommend that you use the update cycle in the following illustration as a guide to deploy software updates.

### Learn

During this phase of the cycle, you learn about requirements to install the update. This information also affects new servers that you want to update and then add to the farm.

#### Requirements and prerequisites

First, ensure that the system can be provisioned as a farm server. For more information, see Hardware and software requirements for SharePoint 2013.
Ensure that any server that you plan to update is running the same version of the operating system as the other farm servers. This includes updates, service packs, and security hotfixes.

#### Update strategy

Determine the strategy that you want to use to update the farm. Depending on your requirements, you can use one of the following strategies:

- In-place
- Database-attach

For more information about the update strategy to use, see Install a software update (SharePoint 2013).

#### Downtime reduction

Research and assess the options that are available to reduce downtime. First, check for missing dependencies, which may extend the amount of downtime. Identify all the dependencies for the update and either address these dependencies before you start to deploy the update, or factor the additional time into your schedule. Consider using read-only content databases and doing parallel upgrades to reduce downtime.

#### Common issues

Identify and address common issues such as missing or out-of-date dependencies and lack of space on the servers where you will install the update.

### Prepare

To prepare for the software update, document the environment and plan an update strategy to ensure that the update will go as planned in the expected downtime window.

#### Document the environment

You document the environment to determine what is unique in your farm. You can use several techniques to gather information about your farm, such as manual inspection, comparisons by using WinDiff, and Microsoft PowerShell commands. Document, as appropriate, the following elements of the environment:

- Farm topology and site hierarchy
- Language packs and filter packs that are installed
- Customizations that could be affected by the update

#### Manage customizations

Customizations are typically one of the top issues during a farm upgrade or software update. Identify your farm customizations and determine whether they might be affected by the update.
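One way to start the customization inventory described above is with the farm cmdlets. This is a hedged sketch; the property selections are illustrative choices, not prescribed by this article:

```powershell
# Run from the SharePoint 2013 Management Shell.
# List deployed farm solutions (.wsp packages) and where they are deployed.
Get-SPSolution | Select-Object Name, Deployed, DeployedServers

# List installed features, which can point to customizations
# that the update might affect.
Get-SPFeature | Sort-Object DisplayName | Select-Object DisplayName, Scope
```

Comparing this output between the test farm and the production farm (for example, with WinDiff, as the article suggests) helps confirm that nothing unique to production is missed.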
If in doubt, err on the side of caution and determine how you will manage the customizations. You must ensure that customizations will work after the software update. You can use the Stsadm **ExportIPFSAdminObjects** command to collect and export InfoPath administrator-deployed forms only.

#### Plan the update strategy

During the Learn phase of the update cycle, you should have determined an update strategy and the required downtime minimization. In addition to determining hardware, space, and software requirements, you must include the following in your update strategy:

- The update sequence for the farm servers
- The order of operations
- The downtime limits and how you plan to reduce downtime
- A rollback process if there is a major problem

> [!TIP]
> Clean up the farm environment before you deploy the update. The benefits of a cleanup are improved update installation performance and the elimination of potential issues during and after the software update. For more information, see Clean up an environment before an upgrade to SharePoint 2013.

The two final requirements for the update strategy are a communication plan and an update schedule. It is important to communicate with site owners and users about what to expect during an upgrade. An administrator should inform users about downtime and the risk that the upgrade may take longer than expected or that some sites may need some rework after upgrade. For more information, see Create a communication plan for the upgrade to SharePoint 2013.

Create a benchmark schedule for update operations that contains the start times of operations that are related to the update deployment. At a minimum, the plan should include the following operations:

- Back up the farm.
- Start the update of the farm servers.
- Start the upgrade of the farm databases.
- Stop the upgrade and resume operations in the non-upgraded farm.
- Resume the upgrade, if it is required.
- Verify that the environment is completely working, either as the original version if you rolled back or the new version if you completed the upgrade.

#### Make farm items ready for updates

Ensure that farm items are ready for the update. Farm items are ready if they are backed up, documented, or updated to ensure that the update can be installed. Verify that the following aspects of a farm are ready for updates:

- Solutions
- Features
- Site definitions
- Web Parts

### Test

The rigor, thoroughness, and detail of your tests determine the success or failure of the software update deployment. In a production environment, there are no safe shortcuts, and there are consequences from insufficient testing. For more information, see Use a trial upgrade to SharePoint 2013 to find potential issues.

#### Build a test farm

Build a test farm that represents the production environment. We recommend that you use a copy of the production data to determine potential problem areas and to monitor overall system performance during the upgrade. The key indicator is the length of time it takes from the beginning to the end of the deployment process, including backup and validation. You can incorporate this information in the update schedule. If possible, use hardware in the test environment that has performance capabilities equivalent to the production servers.

> [!TIP]
> Consider using a test farm in a virtual environment. After you finish the tests, you can shut down the virtual farm and use it later for future updates.

#### Evaluate techniques

A test farm also enables you to evaluate the techniques that you plan to use to update the production environment. In addition to testing and assessing your downtime reduction strategy, you can refine update monitoring. This is especially important in the areas of validating and troubleshooting the software update.

### Implement

The update strategy that you use determines whether you have to build a new farm or deploy the update on the current farm servers.
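The benchmark schedule above begins with a farm backup. A minimal sketch follows; the backup share path is a hypothetical example, not from this article:

```powershell
# Run from the SharePoint 2013 Management Shell before starting the update.
# \\backupserver\spbackups is an example UNC path; use a share
# with enough free space for a full farm backup.
Backup-SPFarm -Directory \\backupserver\spbackups -BackupMethod Full
```

Recording how long this backup takes in the test farm feeds directly into the benchmark schedule for the production deployment.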
#### Build or update farms

Whether you build a new farm or do an in-place update, the most important farm elements to consider are as follows:

- Content
- Services
- Service applications

#### Deploy customizations

Use solutions whenever possible so that you can deploy individual files or components.

#### Reduce downtime

Reduce downtime by using techniques such as read-only databases and update parallelism. For more information, see the "How to minimize downtime during upgrade" section in Determine strategy for upgrade to SharePoint 2013.

#### Monitor progress

The refined techniques that you use to monitor the software update in the test environment apply when you deploy the update in the production environment. Use the Upgrade and Migration page in Central Administration to monitor available status indicators. This feature enables live monitoring and provides a single location to view the patch status for all farm servers. Additionally, you can use the Upgrade and Migration page to view the update status for individual servers and the status and type of farm databases. Finally, when you use Central Administration to monitor updates, you can identify farm servers that you must update.

The following tables describe the status information that is available in Central Administration.

|Status value|Description|Hyperlink|
|:-----|:-----|:-----|
|No action required|Farm server does not currently require any action to be taken by the administrator.|No hyperlink|
|Installation required|Farm server is missing an .msi file that is set to mandatory for all farm servers or has a patch level below the individual farm-wide effective patch version.|Hyperlink to the Patch Deployment State page|
|Upgrade in progress|Farm server is currently undergoing an upgrade operation.|Hyperlink to the Upgrade Status page|
|Upgrade available|Farm server is running in backward-compatibility mode.|Hyperlink to the Upgrade and Migration page|
|Upgrade required|Farm server is outside the backward-compatibility mode range with one or more databases.|Hyperlink to the Upgrade and Migration page|
|Upgrade blocked|If an upgrade is available and any farm server requires installation, the remaining servers that do not require installation are set to this status unless they are currently undergoing an upgrade.|Hyperlink to the Patch Deployment State page|
|Installed|Indicates that no action is required.|Not applicable|
|MissingRequired|Displayed if a product is required on each server, or if a patch for a specific .msi file is located on one server but not on the server for which this status is shown.|Not applicable|
|MissingOptional|Displayed if a product is not required on each server.|Not applicable|
|Superseded|Displayed if an update is no longer required on a server because a newer patch supersedes it.|Not applicable|

Log files and PowerShell commands are other tools to monitor the update process.

> [!IMPORTANT]
> Remember to monitor the length of time that the update is taking. Compare current update processes against the benchmark schedule to determine whether the update will meet the downtime window. If not, communicate this information to users of the farm.

### Validate

You can start to validate the success of the update during the implementation phase and continue validation after the update is implemented.

#### Logged event failures

Review the event logs to discover issues that occurred during the deployment. Resolve these issues and then resume or restart the update as appropriate. For more information about event log files, see Configure diagnostic logging in SharePoint Server.

#### User interface or experience issues

Any user interface or user experience issues will surface on site pages. These issues mainly occur during a version-to-version upgrade.
Look for the following issues:

- Unghosted files, that is, ASP.NET (.aspx) pages that a user has modified within the site collection, which now behave differently than expected or have rendering issues caused by recent upgrades of the files on the server.
- User interface version mismatch
- HTML and XHTML compliance

Other issues may include missing templates, user identifiers, and content issues such as large lists.

#### Data issues

Data issues result from the condition of the farm databases and can include all or some of the following:

- Connectivity issues to data sources
- Database corruption
- Orphaned items
- Hidden column data

In some cases you can troubleshoot minor issues and then resume or restart the update. Be prepared to roll back the update if you cannot resolve issues.
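The validation steps above can also be driven from PowerShell. This is a hedged sketch using standard SharePoint 2013 cmdlets; the time window is an arbitrary example:

```powershell
# Run from the SharePoint 2013 Management Shell after the upgrade.
# List content databases that still need a build-to-build upgrade.
Get-SPContentDatabase | Select-Object Name, NeedsUpgrade

# Review recent errors in the SharePoint trace logs, for example
# from the last two hours of the upgrade window.
Get-SPLogEvent -MinimumLevel Error -StartTime (Get-Date).AddHours(-2)
```

A database reporting `NeedsUpgrade : True` after the run indicates the build-to-build upgrade phase did not complete for that database and should be resumed.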
---
title: "Upgrade service applications to SharePoint Server Subscription Edition"
ms.reviewer: 
ms.author: serdars
author: jitinmathew
manager: serdars
ms.date: 07/09/2021
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: landing-page
ms.service: sharepoint-server-itpro
ms.localizationpriority: medium
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
- SP2019
ms.assetid: 6de4e8e0-5d27-4b1b-a87f-bebd8b9d6e77
description: "Upgrade service applications (Business Connectivity Services, Managed Metadata, Secure Store, and Search) to SharePoint Server Subscription Edition."
---

# Upgrade service applications to SharePoint Server Subscription Edition

[!INCLUDE[appliesto-xxx-xxx-xxx-SUB-xxx-md]]

When you upgrade from SharePoint Server 2019 or SharePoint Server 2016 to SharePoint Server Subscription Edition, you must use a database-attach upgrade, which means that you upgrade only the content for your environment and not the configuration settings. After you have configured the SharePoint Server Subscription Edition environment and copied the content and service application databases, you can upgrade the service applications to SharePoint Server Subscription Edition. This article contains the steps that you take to upgrade the service applications.

## Phase 3 of the upgrade process: Upgrade service applications

This is the third phase in the process to upgrade SharePoint Server 2019 and SharePoint Server 2016 data and sites to SharePoint Server Subscription Edition. The process includes the following phases, which must be completed in order:

1. Create the SharePoint Server Subscription Edition farm for a database attach upgrade
2. Copy databases to the new farm for upgrade to SharePoint Server Subscription Edition
3. Upgrade service applications to SharePoint Server Subscription Edition (this phase)
4. Upgrade content databases to SharePoint Server Subscription Edition

For an overview of the whole process, see Overview of the upgrade process to SharePoint Server Subscription Edition.

## Before you begin

Before you upgrade the service applications, review the following information and take any recommended actions.

- Ensure that the account that you use to perform the steps in this article is a member of the Farm administrators group in the Central Administration website.
- Decide which service application pool to use for the upgraded service applications. The procedures below use the default application pool for service applications, which is "SharePoint Web Services Default". You can view a list of available service application pools by using the Get-SPServiceApplicationPool cmdlet in PowerShell, or you can create a service application pool by using the New-SPServiceApplicationPool cmdlet. For more information, see Get-SPServiceApplicationPool and New-SPServiceApplicationPool.

> [!TIP]
> Throughout this article, variables (such as $applicationPool, $sss, $upa, and so on) are used in the PowerShell cmdlets to save time and effort. You do not have to use these variables if you prefer not to. However, if you do not use these variables, you must use IDs for the service applications and service application proxies when you specify the Identity parameters. Each procedure has information about the variables used, or the alternate cmdlets to use to look up any IDs that are required.
>
> Also, many procedures in this article include a step to set the $applicationPool variable.
If you are performing all of these procedures in the same session of PowerShell, and you want to use the same application pool for all service applications, you do not have to repeat this step in each procedure. Instead, you can set this variable once at the beginning and use it throughout the procedures in this article.

## About upgrading the service application databases

To upgrade a service application database, you create a new service application and provide the name of the existing database to use for the new service application. As the service application is created, the database is upgraded. This process has several steps.

> [!NOTE]
> Word Automation Services and Machine Translation Services can't be upgraded. A new service instance will need to be created.

> [!IMPORTANT]
> The following steps only apply to the Custom server role type. For more information on server role types, see Planning for a MinRole server deployment in SharePoint Server 2016 and SharePoint Server 2019.

1. **Start the service instances.** The first step is to start service instances for the four service applications that you can upgrade: the Business Data Connectivity service, Managed Metadata Web Service, Secure Store service, and Search service. Most of these service instances can be started from Central Administration. However, the SharePoint Server Search service instance must be started by using PowerShell.

2. **Create the service applications and upgrade the databases.** After you have started the service instances, the next step is to create the service applications and upgrade the databases. You must use PowerShell to restore the service application databases.

3. **Create proxies for the service applications.** After you have upgraded the service application databases, you create the proxies for the service applications and add them to the default proxy group.
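The create-with-existing-database pattern in step 2 looks the same for each service application type. This is a hedged sketch using hypothetical names ('Example Service' and 'Example_Upgrade_DB' are placeholders, not names from this article):

```powershell
# Sketch only: creating a service application and pointing it at an
# existing (copied) database upgrades that database during provisioning.
$applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services Default'
$app = New-SPMetadataServiceApplication -Name 'Example Service' `
    -ApplicationPool $applicationPool -DatabaseName 'Example_Upgrade_DB'
```

The sections that follow apply this same pattern with the cmdlet specific to each service application.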
You must create proxies for the following service applications:

   - Managed Metadata service application
   - Search service application
   - Secure Store service application

   The Business Data Connectivity service application automatically creates a proxy and assigns it to the default proxy group when you create the service application.

4. **Verify that the proxies are in the default group.**

The following sections provide procedures to complete these steps.

## Start the service instances

The following procedures start the service instances.

### To start service application instances from Central Administration

1. Start SharePoint 2019 Central Administration.
2. In SharePoint 2019 Central Administration, on the Application Management page, in the Service Applications section, click **Manage Services on Server**.
3. Next to the Business Data Connectivity service, click **Start**.
4. Next to the Managed Metadata Web Service, click **Start**.
5. Next to the Secure Store Service, click **Start**.

The Search service instance must be started by using PowerShell because you cannot start it from Central Administration unless a Search service application already exists.

> [!TIP]
> When using MinRoles, **Start** may not be available because it is managed by the farm. When the associated service application has been created, it automatically starts the service instance.

### To start the Search service instance by using PowerShell

1. Verify that you have the following memberships:

   - **securityadmin** fixed server role on the SQL Server instance.
   - **db_owner** fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server Subscription Edition cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.
2. Start the SharePoint Subscription Edition Management Shell.

3. To start the Search service instance, at the PowerShell command prompt, type the following commands and press ENTER after each one:

   ```powershell
   # Stores the identity for the Search service instance on this server as a variable
   $SearchInst = Get-SPEnterpriseSearchServiceInstance
   # Starts the service instance
   Start-SPServiceInstance $SearchInst
   ```

   For more information, see Get-SPEnterpriseSearchServiceInstance and Start-SPServiceInstance.

## Upgrade the Secure Store service application

To upgrade the Secure Store service application, you create the new service application and upgrade the database, create a proxy and add it to the default proxy group, and then restore the passphrase from the previous environment.

### To upgrade the Secure Store service application by using PowerShell

1. Verify that you have the following memberships:

   - **securityadmin** fixed server role on the SQL Server instance.
   - **db_owner** fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

2. Start the SharePoint Subscription Edition Management Shell.

3. To store the application pool that you want to use as a variable for this service application, at the PowerShell command prompt, type the following command:

   ```powershell
   $applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
   ```

   Where:

   - _SharePoint Web Services default_ is the name of the service application pool that will contain the new service applications. This is the default service application pool. You can specify a different service application pool.
This cmdlet sets the service application pool as a variable that you can use again in the cmdlets that follow. If you have multiple application pools and have to use a different application pool for a particular service application, repeat this step in the procedure to create each service application to use the appropriate application pool.

4. To upgrade the Secure Store service application, at the PowerShell command prompt, type the following command:

   ```powershell
   $sss = New-SPSecureStoreServiceApplication -Name 'Secure Store' -ApplicationPool $applicationPool -DatabaseName 'SecureStore_Upgrade_DB' -AuditingEnabled
   ```

   Where:

   - _Secure Store_ is the name that you want to give the new Secure Store service application.
   - _$applicationPool_ is the variable that you set earlier to identify the service application pool to use.

     > [!TIP]
     > If you do not use the variable $applicationPool, then you must specify the name of an existing service application pool in the format '_Application Pool Name_'. To view a list of service application pools, you can run the Get-SPServiceApplicationPool cmdlet.

   - _SecureStore_Upgrade_DB_ is the name of the service application database that you want to upgrade.

   This command sets a variable, $sss, that you use when you create the proxy later. For more information, see [New-SPSecureStoreServiceApplication](/powershell/module/sharepoint-server/New-SPSecureStoreServiceApplication?view=sharepoint-ps&preserve-view=true).

5. Type the following command to create a proxy for the Secure Store service application:

   ```powershell
   $sssp = New-SPSecureStoreServiceApplicationProxy -Name ProxyName -ServiceApplication $sss -DefaultProxyGroup
   ```

   Where:

   - _ProxyName_ is the proxy name that you want to use.
   - _$sss_ is the variable that you set earlier to identify the new Secure Store service application.

     > [!TIP]
     > If you do not use the variable $sss, then you must use an ID to identify the Secure Store service application instead of a name.
To find the ID, you can run the Get-SPServiceApplication cmdlet to return a list of all service application IDs.

   - _DefaultProxyGroup_ adds the Secure Store service application proxy to the default proxy group for the local farm.

   This command sets a variable, $sssp, for the service application proxy that you use when you restore the passphrase. For more information, see [New-SPSecureStoreServiceApplicationProxy](/powershell/module/sharepoint-server/New-SPSecureStoreServiceApplicationProxy?view=sharepoint-ps&preserve-view=true).

6. After you create the Secure Store service application and the proxy, you have to refresh the encryption key. For information about how to refresh the encryption key, see Refresh the Secure Store encryption key.

7. Type the following command to restore the passphrase for the Secure Store service application:

   ```powershell
   Update-SPSecureStoreApplicationServerKey -Passphrase <Passphrase> -ServiceApplicationProxy $sssp
   ```

   Where:

   - _\<Passphrase\>_ is the passphrase for the Secure Store service application from your previous environment.
   - _$sssp_ is a variable that you set earlier to identify the new Secure Store service application proxy.

     > [!TIP]
     > If you do not use the variable $sssp, then you must use an ID to identify the Secure Store service application proxy instead of a name. To find the ID, you can run the Get-SPServiceApplicationProxy cmdlet to return a list of all service application proxy IDs.

   For more information, see [Update-SPSecureStoreApplicationServerKey](/powershell/module/sharepoint-server/Update-SPSecureStoreApplicationServerKey?view=sharepoint-ps&preserve-view=true).

## Upgrade the Business Data Connectivity service application

To upgrade the Business Data Connectivity service application, you create the new service application and upgrade the database. You do not have to create a proxy for the Business Data Connectivity service application.
The Business Data Connectivity service application automatically creates a proxy and assigns it to the default proxy group when you create the service application.

### To upgrade the Business Data Connectivity service application by using PowerShell

1. Verify that you have the following memberships:

   - **securityadmin** fixed server role on the SQL Server instance.
   - **db_owner** fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server 2019 or SharePoint Server 2016 cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

2. Start the SharePoint Subscription Edition Management Shell.

3. To store the application pool that you want to use as a variable for this service application, at the PowerShell command prompt, type the following command:

   ```powershell
   $applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
   ```

   Where:

   - _SharePoint Web Services default_ is the name of the service application pool that will contain the new service applications.

   This cmdlet sets the service application pool as a variable that you can use again in the cmdlets that follow. If you have multiple application pools and have to use a different application pool for a particular service application, repeat this step in the procedure to create each service application to use the appropriate application pool.
4. To upgrade the Business Data Connectivity service application, at the PowerShell command prompt, type the following command:

   ```powershell
   New-SPBusinessDataCatalogServiceApplication -Name 'BDC Service' -ApplicationPool $applicationPool -DatabaseName 'BDC_Service_DB'
   ```

   Where:

   - _BDC Service_ is the name that you want to give the new Business Data Connectivity service application.
   - _$applicationPool_ is the variable that you set earlier to identify the service application pool to use.

     > [!TIP]
     > If you do not use the variable $applicationPool, then you must specify the name of an existing service application pool in the format '_Application Pool Name_'. To view a list of service application pools, you can run the Get-SPServiceApplicationPool cmdlet.

   - _BDC_Service_DB_ is the name of the service application database that you want to upgrade.

   For more information, see [New-SPBusinessDataCatalogServiceApplication](/powershell/module/sharepoint-server/New-SPBusinessDataCatalogServiceApplication?view=sharepoint-ps&preserve-view=true).

## Upgrade the Managed Metadata service application

To upgrade the Managed Metadata service application, you create the new service application and upgrade the database, and then create a proxy and add it to the default proxy group.

### To upgrade the Managed Metadata service application by using PowerShell

1. Verify that you have the following memberships:

   - **securityadmin** fixed server role on the SQL Server instance.
   - **db_owner** fixed database role on all databases that are to be updated.
   - Administrators group on the server on which you are running the PowerShell cmdlets.

   An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server 2019 or SharePoint Server 2016 cmdlets.

   > [!NOTE]
   > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.
2. Start the SharePoint Subscription Edition Management Shell.

3. To store the application pool that you want to use as a variable for this service application, at the PowerShell command prompt, type the following command:

   ```powershell
   $applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
   ```

   Where:

   - _SharePoint Web Services default_ is the name of the service application pool that will contain the new service applications.

   This cmdlet sets the service application pool as a variable that you can use again in the cmdlets that follow. If you have multiple application pools and have to use a different application pool for a particular service application, repeat this step in the procedure to create each service application to use the appropriate application pool.

4. To upgrade the Managed Metadata service application, at the PowerShell command prompt, type the following command:

   ```powershell
   $mms = New-SPMetadataServiceApplication -Name 'Managed Metadata Service Application' -ApplicationPool $applicationPool -DatabaseName 'Managed Metadata Service_DB'
   ```

   Where:

   - _Managed Metadata Service Application_ is the name that you want to give the new Managed Metadata service application.
   - _$applicationPool_ is the variable that you set earlier to identify the service application pool to use.

     > [!TIP]
     > If you do not use the variable $applicationPool, then you must specify the name of an existing service application pool in the format '_Application Pool Name_'. To view a list of service application pools, you can run the Get-SPServiceApplicationPool cmdlet.

   - _Managed Metadata Service_DB_ is the name of the service application database that you want to upgrade.

   This command sets a variable, $mms, that you use when you create the proxy later. For more information, see [New-SPMetadataServiceApplication](/powershell/module/sharepoint-server/New-SPMetadataServiceApplication?view=sharepoint-ps&preserve-view=true).
At the PowerShell command prompt, type the following command to create a proxy for the Managed Metadata service application:

```powershell
New-SPMetadataServiceApplicationProxy -Name '<ProxyName>' -ServiceApplication $mms -DefaultProxyGroup
```

Where:

- _ProxyName_ is the proxy name that you want to use.
- _$mms_ is the variable that you set earlier to identify the new Managed Metadata service application.

  > [!TIP]
  > If you do not use the variable $mms, then you must use an ID to identify the Managed Metadata service application proxy instead of a name. To find the ID, you can run the Get-SPServiceApplication cmdlet to return a list of all service application IDs.

- _DefaultProxyGroup_ adds the Managed Metadata service application proxy to the default proxy group for the local farm.

For more information, see New-SPMetadataServiceApplicationProxy.

## Upgrade the User Profile service application

Upgrade the Managed Metadata service application before you upgrade the User Profile service application.

To upgrade the User Profile service application, you copy the Profile and Social databases in your SharePoint Server 2019 or SharePoint Server 2016 farm to your SharePoint Server Subscription Edition farm and create a new User Profile service application from your SharePoint Server 2019 or SharePoint Server 2016 farm in your SharePoint Server Subscription Edition farm. The restore triggers SharePoint Server Subscription Edition to create a new User Profile service application in the SharePoint Server Subscription Edition farm and point it to the copied User Profile databases. To complete the upgrade of the User Profile service application, you create a proxy and add it to the default proxy group.
### To upgrade the User Profile service application by using PowerShell

Copy the Profile and Social databases in the SharePoint Server 2019 or SharePoint Server 2016 farm to the SharePoint Server Subscription Edition farm by following these steps:

> [!IMPORTANT]
> Perform these steps in the SharePoint Server 2019 or SharePoint Server 2016 environment.

Verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server 2019 or SharePoint Server 2016 cmdlets.

> [!NOTE]
> If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

Start the SharePoint Management Shell.

Set the User Profile databases to read-only. In the second phase of the process to upgrade SharePoint Server 2019 or SharePoint Server 2016 data and sites to SharePoint Server Subscription Edition, you set all the other databases to read-only.

To copy the Profile and Social databases in the SharePoint Server 2019 or SharePoint Server 2016 farm to the SharePoint Server Subscription Edition farm, follow the procedures in Copy databases to the new farm for upgrade to SharePoint Server Subscription Edition.

> [!IMPORTANT]
> Perform the next steps in the SharePoint Server Subscription Edition environment.

Verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server 2019 or SharePoint Server 2016 cmdlets.
> [!NOTE]
> If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

Start the SharePoint Subscription Edition Management Shell.

To store the application pool that you want to use as a variable for this service application, at the Microsoft PowerShell command prompt, type the following command:

```powershell
$applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
```

Where:

- _SharePoint Web Services default_ is the name of the service application pool that will contain the new service applications.

This cmdlet sets the service application pool as a variable that you can use again in the cmdlets that follow. If you have multiple application pools and have to use a different application pool for a particular service application, repeat this step in the procedure to create each service application to use the appropriate application pool.

To restore the User Profile service application and upgrade the Profile and Social databases, at the Microsoft PowerShell command prompt, type the following command:

```powershell
New-SPProfileServiceApplication -Name '<UserProfileApplicationName>' -ApplicationPool $applicationPool -ProfileDBName '<ProfileDBName>' -SocialDBName '<SocialDBName>' -ProfileSyncDBName '<SyncDBName>'
```

Where:

- _UserProfileApplicationName_ is the name of the User Profile service application.
- _$applicationPool_ is the variable that you set to identify the service application pool to use.

  > [!NOTE]
  > If you do not use the variable $applicationPool, then you must specify the name of an existing service application pool in the format '_Application Pool Name_'. To view a list of service application pools, you can run the Get-SPServiceApplicationPool cmdlet.

- _ProfileDBName_ is the name of the Profile database that you want to upgrade.
- _SocialDBName_ is the name of the Social database that you want to upgrade.
- _SyncDBName_ is the name of the new Synchronization database.
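As a worked example, the restore command might be filled in as follows. The service application name and database names below are illustrative placeholders only, not required values; substitute the names used in your own farm.

```powershell
# Illustrative example -- all names here are hypothetical placeholders.
$applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'

New-SPProfileServiceApplication -Name 'User Profile Service Application' `
    -ApplicationPool $applicationPool `
    -ProfileDBName 'UPA_Profile_DB' `
    -SocialDBName 'UPA_Social_DB' `
    -ProfileSyncDBName 'UPA_Sync_DB'
```

Note that the Profile and Social database names must match the copied databases you want to upgrade, while the Synchronization database name is new and is created by the cmdlet.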
Create the User Profile service application proxy and add it to the default proxy group by completing these actions:

Type the following command to get the ID for the User Profile service application and store it as a variable:

```powershell
$sa = Get-SPServiceApplication | ?{$_.TypeName -eq 'User Profile Service Application'}
```

For more information, see Get-SPServiceApplication.

Type the following command to create a proxy for the User Profile service application:

```powershell
New-SPProfileServiceApplicationProxy -Name 'User Profile Service Application Proxy' -ServiceApplication $sa
```

Where:

- _User Profile Service Application Proxy_ is the proxy name that you want to use.
- _$sa_ is the variable that you set earlier to identify the new User Profile service application.

  > [!TIP]
  > If you do not use the variable $sa, then you must use an ID to identify the User Profile service application instead of a name. To find the ID, you can run the Get-SPServiceApplication cmdlet to return a list of all service application IDs.

For more information, see New-SPProfileServiceApplicationProxy.

Type the following command to get the User Profile service application proxy ID for the proxy you just created and set it as the variable $proxy:

```powershell
$proxy = Get-SPServiceApplicationProxy | ?{$_.TypeName -eq 'User Profile Service Application Proxy'}
```

For more information, see Get-SPServiceApplicationProxy.

Type the following command to add the User Profile service application proxy to the default proxy group:

```powershell
Add-SPServiceApplicationProxyGroupMember -member $proxy -identity ""
```

Where:

- _$proxy_ is the variable that you set earlier to identify the ID for the proxy you just created for the User Profile service application.

  > [!TIP]
  > If you do not use the variable $proxy, then you must use an ID to identify the User Profile service application proxy instead of a name. To find the ID, you can run the Get-SPServiceApplicationProxy cmdlet to return a list of all service application proxy IDs.
- You use an empty Identity parameter ("") to add it to the default group.

For more information, see Add-SPServiceApplicationProxyGroupMember.

## Upgrade the Search service application

Upgrade the User Profile service application and the Managed Metadata service application before you upgrade the Search service application.

To upgrade the Search service application, you copy the search administration database in your SharePoint Server 2019 or SharePoint Server 2016 farm to your SharePoint Server Subscription Edition farm and restore the Search service application from your SharePoint Server 2019 or SharePoint Server 2016 farm in your SharePoint Server Subscription Edition farm. The restore triggers SharePoint Server Subscription Edition to create a new Search service application in the SharePoint Server Subscription Edition farm and point it to the copied search administration database. To complete the upgrade of the Search service application, you create a proxy and add it to the default proxy group, and you ensure that the new Links database and the new search topology are configured the same way as in the SharePoint Server 2019 or SharePoint Server 2016 farm.

SharePoint Server Subscription Edition normally creates a new search topology with all the search components and databases when it creates a new Search service application. During a restore of a Search service application, SharePoint Server Subscription Edition creates a new search topology, but upgrades the restored Search Administration database instead of creating a new Search Administration database. The upgraded Search Administration database retains any additions or modifications made to the search schema, result sources, and query rules from the SharePoint Server 2019 or SharePoint Server 2016 farm.

> [!NOTE]
> During this upgrade, search doesn't crawl content in your SharePoint Server 2019 or SharePoint Server 2016 farm.
If freshness of search results is important, save time by familiarizing yourself with these steps before starting the upgrade.

> [!IMPORTANT]
> As the search topology in the SharePoint Server Subscription Edition farm is new, the index is empty. You have to perform a full crawl of the entire indexed corpus after you have upgraded all content sources (the fourth phase in the process to upgrade SharePoint Server 2019 and SharePoint Server 2016 data and sites to SharePoint Server Subscription Edition).

### To upgrade the Search service application by using PowerShell

Copy the search administration database in the SharePoint Server 2019 or SharePoint Server 2016 farm to the SharePoint Server Subscription Edition farm by following these steps:

> [!NOTE]
> You copied all other content and service databases in your SharePoint Server 2019 or SharePoint Server 2016 environment in an earlier step of the process for upgrading to SharePoint Server Subscription Edition. We recommend copying the Search Administration database at this later stage because you have to pause the Search service application in your SharePoint Server 2019 or SharePoint Server 2016 environment while copying the Search Administration database.

> [!IMPORTANT]
> Perform these steps in the SharePoint Server 2019 or SharePoint Server 2016 environment.

Verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server Subscription Edition cmdlets.

> [!NOTE]
> If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

Start the SharePoint 2019 or SharePoint 2016 Management Shell.
Pause the Search service application. At the PowerShell command prompt, type the following command:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication -Identity '<SearchServiceApplicationName>'
Suspend-SPEnterpriseSearchServiceApplication -Identity $ssa
```

Where:

- _SearchServiceApplicationName_ is the name of the Search service application you want to pause.

> [!NOTE]
> While the Search service application is paused, the index in the SharePoint Server 2019 or SharePoint Server 2016 environment isn't updated. This means that during the upgrade to SharePoint Server Subscription Edition, search results might be less fresh.

Set the Search Administration database to read-only. In the second phase of the process to upgrade SharePoint Server 2019 or SharePoint Server 2016 data and sites to SharePoint Server Subscription Edition, you set all the other databases to read-only. Follow the same instructions now for the Search Administration database.

To copy the search administration database in the SharePoint Server 2019 or SharePoint Server 2016 farm to the SharePoint Server Subscription Edition farm, follow the procedures in Copy databases to the new farm for upgrade to SharePoint Server Subscription Edition for the search administration database only.

> [!IMPORTANT]
> Perform the next steps in the SharePoint Server Subscription Edition environment.

Verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server Subscription Edition cmdlets.

> [!NOTE]
> If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

Start the SharePoint Subscription Edition Management Shell.
To store the application pool that you want to use as a variable for this service application, at the PowerShell command prompt, type the following command:

```powershell
$applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
```

Where:

- _SharePoint Web Services default_ is the name of the service application pool that will contain the new service applications.

This cmdlet sets the service application pool as a variable that you can use again in the cmdlets that follow. If you have multiple application pools and have to use a different application pool for a particular service application, repeat this step in the procedure to create each service application to use the appropriate application pool.

To restore the Search service application and upgrade the Search Administration database, at the PowerShell command prompt, type the following command:

```powershell
# Gets the Search service instance and sets a variable to use in the next command
$searchInst = Get-SPEnterpriseSearchServiceInstance -Local

Restore-SPEnterpriseSearchServiceApplication -Name '<SearchServiceApplicationName>' -ApplicationPool $applicationPool -DatabaseName '<SearchServiceApplicationDBName>' -DatabaseServer '<DatabaseServerName>' -AdminSearchServiceInstance $searchInst
```

Where:

- _SearchServiceApplicationName_ is the name of the Search service application.
- _$applicationPool_ is the variable that you set to identify the service application pool to use.

  > [!NOTE]
  > If you do not use the variable $applicationPool, then you must specify the name of an existing service application pool in the format '_Application Pool Name_'. To view a list of service application pools, you can run the Get-SPServiceApplicationPool cmdlet.

- _SearchServiceApplicationDBName_ is the name of the search administration database that you want to upgrade, and that this Search service application will use.
- _$searchInst_ is the variable that you set to identify the new Search service application instance.
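Putting the restore step together, a filled-in invocation might look like the following. The service application name, database name, and database server name here are hypothetical placeholders for illustration; use the names from your own environment.

```powershell
# Illustrative example -- all names here are hypothetical placeholders.
$applicationPool = Get-SPServiceApplicationPool -Identity 'SharePoint Web Services default'
$searchInst = Get-SPEnterpriseSearchServiceInstance -Local

Restore-SPEnterpriseSearchServiceApplication -Name 'Search Service Application' `
    -ApplicationPool $applicationPool `
    -DatabaseName 'Search_Service_Admin_DB' `
    -DatabaseServer 'SQL01' `
    -AdminSearchServiceInstance $searchInst
```

The -DatabaseName value must match the copied search administration database, because the restore upgrades that database in place rather than creating a new one.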
> [!NOTE]
> A Search service application upgrade might fail, for example due to network or SQL Server latency. If an error message appears during the upgrade, do the following:
>
> - Delete the Search Administration database that you were trying to upgrade.
> - Using the backup copy that you made of the Search Administration database, repeat the following procedures in this article for the Search service application only:
>   - Restore a backup copy of the database
>   - Set the databases to read-write
> - Type the command to upgrade the Search service application again at the PowerShell command prompt.

For more information, see Restore-SPEnterpriseSearchServiceApplication.

Create the Search service application proxy and add it to the default proxy group by completing these actions:

Type the following command to get the ID for the Search service application and store it as a variable:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication
```

For more information, see Get-SPEnterpriseSearchServiceApplication.

Type the following command to create a proxy for the Search service application:

```powershell
New-SPEnterpriseSearchServiceApplicationProxy -Name '<ProxyName>' -SearchApplication $ssa
```

Where:

- _ProxyName_ is the proxy name that you want to use.
- _$ssa_ is the variable that you set earlier to identify the new Search service application.

  > [!TIP]
  > If you do not use the variable $ssa, then you must use an ID to identify the Search service application instead of a name. To find the ID, you can run the Get-SPServiceApplication cmdlet to return a list of all service application IDs.

For more information, see New-SPEnterpriseSearchServiceApplicationProxy.
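For example, the proxy creation step above might be filled in as follows; the proxy name is a hypothetical placeholder, not a required value.

```powershell
# Illustrative example -- the proxy name is a hypothetical placeholder.
$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchServiceApplicationProxy -Name 'Search Service Application Proxy' -SearchApplication $ssa
```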
Type the following command to get the Search service application proxy ID for the proxy you just created and set it as the variable $ssap:

```powershell
$ssap = Get-SPEnterpriseSearchServiceApplicationProxy
```

For more information, see [Get-SPEnterpriseSearchServiceApplicationProxy](/powershell/module/sharepoint-server/Get-SPEnterpriseSearchServiceApplicationProxy?view=sharepoint-ps&preserve-view=true).

Type the following command to add the Search service application proxy to the default proxy group:

```powershell
Add-SPServiceApplicationProxyGroupMember -member $ssap -identity ""
```

Where:

- _$ssap_ is the variable that you set earlier to identify the ID for the proxy you just created for the Search service application.

  > [!TIP]
  > If you do not use the variable $ssap, then you must use an ID to identify the Search service application proxy instead of a name. To find the ID, you can run the Get-SPServiceApplicationProxy cmdlet to return a list of all service application proxy IDs.

- You use an empty Identity parameter ("") to add it to the default group.

For more information, see Add-SPServiceApplicationProxyGroupMember.

If the SharePoint Server 2019 or SharePoint Server 2016 farm uses a Links database that is partitioned, partition the Links database in the SharePoint Server Subscription Edition farm the same way. Learn how in Move-SPEnterpriseSearchLinksDatabases.

(Optional) Preserve search relevance settings from the SharePoint Server 2019 or SharePoint Server 2016 farm. Because the upgraded Search service application has a new, empty index, search analytics data from the SharePoint Server 2019 or SharePoint Server 2016 farm cannot be fully retained. Copy the Analytics Reporting database from the SharePoint Server 2019 or SharePoint Server 2016 farm and attach it to the new Search service application in the SharePoint Server Subscription Edition farm:

- In the SharePoint Server 2019 or SharePoint Server 2016 farm, back up the Analytics Reporting database.
- In the SharePoint Server Subscription Edition farm, restore the backed-up database to the new database server.
- In the SharePoint Server Subscription Edition farm, attach the restored database to the new Search service application.

Verify that the search topology on the new SharePoint Server Subscription Edition farm is similar to that of the SharePoint Server 2019 or SharePoint Server 2016 farm. If your requirements for search have changed, now is a good time to scale out the search topology of the new SharePoint Server Subscription Edition farm.

Resume the Search service application in the SharePoint Server environment. At the PowerShell command prompt, type the following command:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication -Identity '<SearchServiceApplicationName>'
$ssa.ForceResume(0x02)
```

Where:

- _SearchServiceApplicationName_ is the name of the Search service application you want to resume.

## Verify that all of the new proxies are in the default proxy group

Use the following procedure to verify that the steps to create the proxies and add them to the default proxy group worked.

### To verify that all of the new proxies are in the default proxy group by using PowerShell

Verify that you have the following memberships:

- **securityadmin** fixed server role on the SQL Server instance.
- **db_owner** fixed database role on all databases that are to be updated.
- Administrators group on the server on which you are running the PowerShell cmdlets.

An administrator can use the Add-SPShellAdmin cmdlet to grant permissions to use SharePoint Server 2019 or SharePoint Server 2016 cmdlets.

> [!NOTE]
> If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see Add-SPShellAdmin.

Start the SharePoint Subscription Edition Management Shell.
At the PowerShell command prompt, type the following commands:

```powershell
$pg = Get-SPServiceApplicationProxyGroup -Identity ""
$pg.Proxies
```

Where:

- _$pg_ is a variable you set to represent the default proxy group.
- You use an empty Identity parameter ("") to specify the default proxy group.

This returns a list of all proxies in the default proxy group, their display names, type names, and IDs.

For more information, see Get-SPServiceApplicationProxyGroup.

Now that the service applications are upgraded, you can start the process to upgrade the content databases. The first step in that process is to create the web applications that are needed for each content database.

This is the third phase in the process to upgrade SharePoint Server 2019 and SharePoint Server 2016 data and sites to SharePoint Server Subscription Edition. For an overview of the whole process, see Overview of the upgrade process to SharePoint Server Subscription Edition.

Next phase: Upgrade content databases to SharePoint Server Subscription Edition

## See also

#### Concepts

Create the SharePoint Server Subscription Edition farm for a database attach upgrade

Copy databases to the new farm for upgrade to SharePoint Server Subscription Edition

Upgrade content databases to SharePoint Server Subscription Edition

Services upgrade overview for SharePoint Server Subscription Edition
---
ms.date: 03/13/2018
title: "New and improved features in SharePoint Server 2016"
ms.reviewer:
ms.author: serdars
author: SerdarSoysal
manager: serdars
audience: ITPro
f1.keywords:
- NOCSH
ms.topic: overview
ms.service: sharepoint-server-itpro
ms.localizationpriority: high
ms.collection:
- IT_Sharepoint_Server
- IT_Sharepoint_Server_Top
- Strat_SP_server
ms.custom:
ms.assetid: e81557fb-5046-4a67-8ec8-fdfda648af68
description: "Learn about the new features and updates to existing features in SharePoint Server 2016."
---

# New and improved features in SharePoint Server 2016

[!INCLUDEappliesto-xxx-2016-xxx-xxx-xxx-md]

Learn about the new features and updates to existing features in SharePoint Server 2016.

For a comparison of SharePoint on-premises features between SharePoint 2013 and SharePoint Server 2016 editions, see SharePoint feature availability across on-premises solutions. For new features in SharePoint Server 2016 for end users, see What's new in SharePoint Server 2016.

## Summary of features

The following table provides a summary of the new features that you can try out in this SharePoint Server 2016 release.

|Feature|Description|More info|
|:-----|:-----|:-----|
|Access Services|New Access features are available when you deploy Access Services in SharePoint Server 2016.|For more info, see Access Services plus Access client and server.|
|Compliance features|New compliance features for SharePoint Server 2016 include the document deletion and in-place hold policies.|For more info, see Compliance features.|
|Customized web parts|The compile time for customized XSLT files used for Content Query, Summary Links, and Table of Contents Web Parts is improved.|NA|
|Document Library accessibility|SharePoint Server 2016 includes new document library accessibility features.|For more info, see Document Library accessibility.|
|Durable links|Resource-based URLs now retain links when documents are renamed or moved in SharePoint.|NA|
|Encrypted Connections|SharePoint Server 2016 supports TLS 1.2 connection encryption by default.|For more info, see Encrypted connections.|
|Fast Site Collection Creation|The Fast Site Collection Creation feature is a rapid method to create site collections and sites in SharePoint.|For more info, see Fast Site Collection Creation.|
|Filenames - expanded support for special characters|SharePoint Server 2016 now supports using some special characters in file names that were blocked in previous versions.|For more info, see File names - expanded support for special characters.|
|Hybrid in SharePoint 2016|Hybrid in SharePoint Server 2016 enables you to integrate your on-premises farm with Microsoft 365 productivity experiences, allowing you to adopt the cloud at your own pace.|For more info, see Hybrid in SharePoint Server 2016.|
|Identify and search for sensitive content|SharePoint Server 2016 now provides the same data loss prevention capabilities as Office 365.|For more info, see Identify and search for sensitive content in both SharePoint Server 2016 and OneDrive documents.|
|Image and video previews|You can now preview images and videos in SharePoint Server 2016 document libraries.|For more info, see Image and video previews.|
|Information Rights Management|SharePoint Server 2016 provides Information Rights Management (IRM) capabilities to secure info by encrypting and securing info in SharePoint libraries and OneDrive.|For more info, see Information Rights Management.|
|Large file support|SharePoint Server 2016 now supports uploading and downloading files larger than 2,047 MB.|For more info, see Large file support.|
|MinRole|MinRole is a new feature in SharePoint Server 2016 that allows a SharePoint farm administrator to define each server's role in a farm topology.|For more info, see MinRole farm topology.|
|Mobile experience|SharePoint Server 2016 offers an improved mobile navigation experience.|For more info, see Mobile experience.|
|New features in November 2016 PU for SharePoint Server 2016 (Feature Pack 1)|The November 2016 Public Update for SharePoint Server 2016 (Feature Pack 1) offers seven new features for SharePoint Server 2016.|For more info, see New features in November 2016 PU for SharePoint Server 2016 (Feature Pack 1).|
|New controls for working with OneDrive|SharePoint Server 2016 provides controls at the top of your personal document folders that make common tasks in OneDrive more accessible.|For more info, see New controls for working with OneDrive.|
|New Recycle Bin in OneDrive and Team sites|SharePoint Server 2016 adds a link for the Recycle Bin in the left navigation area of the OneDrive and Team sites.|NA|
|Open Document Format (ODF)|SharePoint Server 2016 adds support for Open Document Format (ODF) files to use in document library templates.|For more info, see Open Document Format (ODF) available for document libraries.|
|Project Server|New Project Server features are available in SharePoint Server 2016.|For more info, see Project Server 2016.|
|ReFS file system support|SharePoint Server 2016 now supports drives that are formatted with the ReFS file system.|For more info about the ReFS file system, see Resilient File System Overview and Resilient file system.|
|SharePoint business intelligence|SharePoint Server 2016 now supports SQL Server 2016 CTP 3.1 and the Power Pivot add-in and Power View.|For more info about SharePoint business intelligence, see Power Pivot add-in and Power View are now available to use with SharePoint Server 2016.|
|SharePoint Search|SharePoint Search Server Application has significant changes to its deployment.|For more info, see SharePoint Search Service application.|
|Sharing improvements|SharePoint Server 2016 has many new sharing improvements available.|For more info, see Sharing.|
|Site Folders view|SharePoint Server 2016 provides a new Site Folders view that lets you access the document libraries in sites that you're following.|For more info, see Site folders view.|
|Sites page pinning|This new feature helps you see and follow sites.|For more info, see Sites page pinning.|
|SMTP Connection Encryption|SharePoint Server 2016 supports sending email to SMTP servers that use STARTTLS connection encryption.|For more info, see SMTP connection encryption.|
|SMTP ports (non-default)|SharePoint Server 2016 adds support for SMTP servers that use TCP ports other than the default port (25).|For more info, see Use SMTP ports other than the default (25).|
|Web Application Open Platform Interface Protocol (WOPI)|You can now rename files, create new files, and share files from within the WOPI iframe on the browser page.|NA|

## Detailed description of features

This section provides detailed descriptions of the new and updated features in SharePoint Server 2016.

### Access Services plus Access client and server

The following new Access features are available when you deploy Access Services in SharePoint Server 2016:

- Support apps for Office. For more info, see Spice up your Access app with add-ins for Office.
- Access App Upgrade. For more info, see Upgrade an Access app.
- Download in Excel feature available for users to pivot Access tables. For more info, see Introducing a new feature in Access 2013 web apps - Download in Excel.
- With the improved Related Item Control, you can do the following:
  - On the Related Item Control, select from any existing view for the dialog.
  - Add a new item on the Related Item Control when the parent record isn't saved.
  - At the bottom of the Related Item Control, turn off the Add link.
- The Cascading Combo box is now available in Access. For more info, see Introducing a new user experience feature in Access web apps: Cascading Controls.

### Central Administration is no longer provisioned on all servers by default

SharePoint Server 2016 Central Administration is now provisioned on the first server in a farm by default when using the SharePoint Products Configuration Wizard.
Central Administration is not provisioned on additional servers in a farm by default. You can provision or unprovision Central Administration on individual servers in a farm, no matter what the server role is, by using the following methods:

- The **Services on Server** page in Central Administration > **System Settings**
- Microsoft PowerShell cmdlets:
  - New-SPCentralAdministration
  - Remove-SPCentralAdministration
- The psconfig.exe -cmd adminvs operation
- The SharePoint Products Configuration Wizard

> [!NOTE]
> The state of Central Administration does not affect whether a server is considered compliant with MinRole. The MinRole health rule will not attempt to provision or unprovision Central Administration.

### Compliance features

The document deletion policy lets you delete documents in users' OneDrive sites after specific periods of time. The In-Place Hold policy allows administrators to preserve documents, email, and other files. For more info, see Overview of document deletion policies.

### Document Library accessibility

The following features are now available for working in SharePoint Server 2016 document libraries:

- Landmarks to a page make it easier to navigate, and there are alt text improvements for all major navigation links.
- Keyboard shortcuts are provided for the following document tasks:
  - Alt + N - **N**ew
  - Alt + E - **E**dit
  - Alt + U - **U**pload
  - Alt + M - **M**anage
  - Alt + S - **S**hare
  - Alt + Y - S**y**nchronization
- Focus improvements, such as keeping focus on prior elements and focus trapping.
- Announcements for upload progress.
- Announcements for file name and file types when browsing folder and file lists.
- Improved callout reading.
- Fixed use of color issues for views switcher.
- Updates to the Help documentation.

### Encrypted connections

When you set up an SSL binding in Internet Information Services (IIS) Manager to host your web application, SharePoint uses TLS 1.2 connection encryption if your client application supports it.
SharePoint also supports TLS 1.2 connection encryption when connecting to other systems, for example when crawling websites. [!NOTE] A security vulnerability was identified in the SSL 3.0 protocol that can allow an attacker to decrypt data. For enhanced security, some SharePoint features now disable SSL 3.0 connection encryption by default, as well as certain encryption algorithms (for example RC4) with known weaknesses. SharePoint disables SSL 3.0 connection encryption by default for some, but not all, features. To ensure that SSL 3.0 is disabled for all features, you should disable it in Windows by editing the Windows Registry. For more info, see the "Disable SSL 3.0 in Windows" workarounds ("For Server Software" and "For Client Software") in Microsoft Security Advisory 3009008. Fast Site Collection Creation This new feature provides templates that work at the same level as SQL Server, which reduces the round trips required between the SharePoint and SQL servers. Use the SPSiteMaster Microsoft PowerShell cmdlets to create sites and site collections quickly. File names - expanded support for special characters SharePoint has historically blocked file names that included the &, ~, {, and } characters, file names that contained a GUID, file names with leading dots, and file names longer than 128 characters. These restrictions are removed in SharePoint Server 2016, and such file names are now available to use. [!IMPORTANT] Restricted characters such as % and # are still not allowed in file names. Page file names, such as wiki pages, may not contain the following characters: " % : ? \ nor can they begin with a leading dot (period) character. Hybrid in SharePoint Server 2016 In SharePoint Server 2016, new hybrid features are available to enable hybrid solutions.
Hybrid sites Hybrid sites features allow your users to have an integrated experience while using SharePoint Server and SharePoint in Microsoft 365 sites: Users can follow SharePoint Server and SharePoint in Microsoft 365 sites, and see them consolidated in a single list. Users have a single profile in Office 365, where all of their profile info is stored. For more info, see SharePoint hybrid sites and search. Hybrid OneDrive Hybrid sites features are used in concert with Hybrid OneDrive (introduced in SharePoint Server 2013 with Service Pack 1 (SP1)): Users can sync files with Office 365 and share them with others. Users can access their files directly through Office 365 from any device. Cloud hybrid search Cloud hybrid search is a new hybrid search solution alternative. With cloud hybrid search: You index all of your crawled content, including on-premises content, to your search index in Office 365. You can set up the crawler in SharePoint Server 2016 to crawl the same content sources and use the same search connectors as in Office SharePoint Server 2007, SharePoint Server 2010, and SharePoint Server 2013. When users query your search index in Office 365, they get unified search results from both on-premises and Office 365 content. For more info about cloud hybrid search, see the public Microsoft cloud hybrid search program on Microsoft Office connection. For more info, see Plan for hybrid OneDrive. For more info about the hybrid solutions available today, visit the SharePoint Hybrid Solutions Center. Identify and search for sensitive content in both SharePoint Server 2016 and OneDrive documents With this new capability, you can: Search for sensitive content across SharePoint Server 2016, SharePoint in Microsoft 365, and OneDrive. Leverage 51 built-in sensitive information types (credit cards, passport numbers, Social Security numbers, and more).
To discover sensitive content relating to common industry regulations from the SharePoint eDiscovery Center, from the eDiscovery site collection, select DLP Queries, identify offending documents, and export a report. Turn on DLP Policies from the Compliance Policy Center site collection to notify end users and administrators when documents with sensitive info are stored in SharePoint and automatically protect the documents from improper sharing. Info about configuring and using this feature is documented in SharePoint and Microsoft 365. For more info, see: Search for sensitive content in SharePoint and OneDrive documents Use DLP in SharePoint to identify sensitive data stored on sites Image and video previews In SharePoint Server 2016, when you post images and videos to a document library, you can see a preview by hovering the mouse over the image or video, or by selecting them. Information Rights Management For more info, see Secure and sync with Information Rights Management on OneDrive and Apply Information Rights Management to a list or library. Large file support Previous versions of SharePoint did not support uploading or downloading files larger than 2,047 MB. SharePoint Server 2016 now allows you to upload or download larger files. You can configure the desired maximum file-size limit on a per-web application basis in your SharePoint farm. MinRole farm topology The role of a server is specified when you create a new farm or join a server to an existing farm. SharePoint automatically configures the services on each server based on the server role, optimizing the performance of the farm based on that topology. There are eight predefined server roles that are available, as shown in the following table. Server role | Description Front-end Service applications, services, and components that serve user requests belong on front-end web servers. These servers are optimized for low latency.
Application Service applications, services, and components that serve back-end requests, such as background jobs or search crawl requests, belong on Application servers. These servers are optimized for high throughput. Distributed Cache Service applications, services, and components that are required for a distributed cache belong on Distributed Cache servers. Search Service applications, services, and components that are required for search belong on Search servers. Custom Custom service applications, services, and components that do not integrate with MinRole belong on Custom servers. The farm administrator has full control over which service instances can run on servers assigned to the Custom role. MinRole does not control which service instances are provisioned on this role. Single-Server Farm Service applications, services, and components required for a single-machine farm belong on a Single-Server Farm. A Single-Server Farm is meant for development, testing, and very limited production use. A SharePoint farm with the Single-Server Farm role cannot have more than one SharePoint server in the farm. Important: The Standalone Install mode is no longer available in SharePoint Server 2016. The Single-Server Farm role replaces the Standalone Install mode available in previous SharePoint Server releases. Unlike Standalone Install, the SharePoint admin must separately install and prepare Microsoft SQL Server for SharePoint. The SharePoint admin must also configure the SharePoint farm services and web applications, either manually or by running the Farm Configuration Wizard. Front-end with Distributed Cache Shared role that combines the Front-end and Distributed Cache roles on the same server. Note: This shared role was introduced in the November Public Update for SharePoint Server 2016 (Feature Pack 1). Application with Search Shared role that combines the Application and Search roles on the same server. 
Note: This shared role was introduced in the November Public Update for SharePoint Server 2016 (Feature Pack 1). For more info about the MinRole feature, see Overview of MinRole Server Roles in SharePoint Server 2016 and Planning for a MinRole server deployment in SharePoint Server 2016. Mobile experience When you use a mobile device to access the home page for a SharePoint Server 2016 team site, you can tap tiles or links on the screen to navigate the site. You can also switch from the mobile view to PC view, which displays site pages as they are seen on a client computer. This view is also touch enabled. New controls for working with OneDrive You can select a control to create new Office documents, upload files, synchronize your files for offline use, and share your files. For more info, see "Simple controls" on The OneDrive Blog. Open Document Format (ODF) available for document libraries The Open Document Format (ODF) lets you create new files in a document library and save them as ODF files so that users can edit the new file with a program they choose. For more info, see Set Open Document Format (ODF) as the default file template for a library. Project Server 2016 Project Server 2016 for SharePoint Server 2016 has many new capabilities and features, including: Resource Engagements: Now project managers can request needed resources from resource managers to complete their projects. Also, resource managers can use the new heat map functionality to see where resources are spending their time. Multiple Timelines: Project and Portfolio managers can now create richer timelines that display multiple timelines in a single view. Simpler administration: Project Server now has multi-tenant storage capabilities and has combined data storage with SharePoint. This greatly reduces IT overhead by eliminating the dedicated Project Server database and improves backup and restore capabilities.
Cloud grade performance and scale: Many performance and scalability improvements that have been added to Project Online have also been added to Project Server 2016. For more info, see What's new for IT pros in Project Server 2016 Preview. [!IMPORTANT] Project Server 2016 is installed with SharePoint Server 2016 Enterprise, though it is licensed separately. For more info about Project Server licensing, see Licensing Project. Power Pivot add-in and Power View are now available to use with SharePoint Server 2016 SQL Server 2016 CTP 3.1 is now available. You can now download SQL Server 2016 CTP 3.1 to use the Power Pivot for SharePoint add-in. You can also use Power View by installing SQL Server Reporting Services (SSRS) in SharePoint-integrated mode and the SSRS front-end add-in from the SQL Server installation media. Download SQL Server 2016 CTP 3.1 from Microsoft Download Center. The following SharePoint Server 2016 business intelligence features are available when you upgrade to SQL Server 2016 CTP 3.1: Power Pivot Gallery Scheduled Data Refresh Workbooks as a Data Source Power Pivot Management Dashboard Power View reports Power View Subscriptions Report Alerting For more info, download the new Deploying SQL Server 2016 PowerPivot and Power View in SharePoint 2016 white paper. For details about configuring and deploying business intelligence in a multiple server SharePoint Server 2016 farm, download Deploying SQL Server 2016 PowerPivot and Power View in a Multi-Tier SharePoint 2016 Farm. Request Manager service improvements SharePoint Request Manager now provisions on the server roles shown in the following list, to support both throttling and routing scenarios: Application Distributed Cache Front-End Additionally, the Request Manager service will no longer prevent sites from rendering when the service is enabled while you have no routing rules defined.
Sharing The following list shows the sharing improvements that are available for SharePoint Server 2016: Create and Share folder Sharing Hint See who the folder is shared with when viewing a folder Members can share Improved invitation mail One-click email to approve or deny a request for access Recently Shared Items cache, see Enable the Recently Shared Items (RSI) cache to quickly populate the Shared with Me view. SharePoint Search Service application SharePoint Search supports indexing of up to 500 million items per Search service application. For more info, see Overview of search architecture in SharePoint Server. For info about SharePoint cloud hybrid search, see Learn about cloud hybrid search for SharePoint. Simplified SSL configuration for Central Administration site We've simplified the process for configuring Central Administration to use SSL bindings. The following command parameters are now available to use: New-SPCentralAdministration -Port -SecureSocketsLayer Set-SPCentralAdministration -Port -SecureSocketsLayer Psconfig.exe -cmd adminvs -port -ssl You must assign a server certificate to the Central Administration IIS web site by using the IIS administration tools. The Central Administration web application won't be accessible until you do this. If you specify port 443, it will automatically create an SSL binding instead of an HTTP binding even if you don't include the SecureSocketsLayer or SSL parameters. The Central Administration public AAM URL will be automatically updated to use the appropriate protocol scheme, server name, and port number. Site collection upgrades There are three options available for upgrading site collections. For more info, see Upgrade a site collection to SharePoint Server 2016. SMTP connection encryption The following list shows the SharePoint 2016 requirements that are needed to negotiate connection encryption with an SMTP server: STARTTLS must be enabled on the SMTP server.
The SMTP server must support the TLS 1.0, TLS 1.1, or TLS 1.2 protocol. [!IMPORTANT] SSL 2.0 and SSL 3.0 protocols are not supported. The SMTP server must have a server certificate installed. The server certificate must be valid. Typically, this means that the name of the server certificate must match the name of the SMTP server provided to SharePoint. The server certificate must also be issued by a certificate authority that is trusted by the SharePoint server. SharePoint must be configured to use SMTP connection encryption. To configure SharePoint to always use SMTP connection encryption, open the SharePoint Central Administration website, browse to System Settings > Configure outgoing e-mail settings, and set the Use TLS connection encryption drop-down menu to Yes. To configure SharePoint to always use SMTP connection encryption in Microsoft PowerShell, use the Set-SPWebApplication cmdlet without the DisableSMTPEncryption parameter. For example:

```powershell
$WebApp = Get-SPWebApplication -IncludeCentralAdministration | ? { $_.IsAdministrationWebApplication -eq $true }
Set-SPWebApplication -Identity $WebApp -SMTPServer smtp.internal.contoso.com -OutgoingEmailAddress sharepoint@contoso.com -ReplyToEmailAddress sharepoint@contoso.com
```

To configure SharePoint to never use SMTP connection encryption in SharePoint Central Administration, browse to System Settings > Configure outgoing email settings and set the Use TLS connection encryption drop-down menu to No. To configure SharePoint to never use SMTP connection encryption in PowerShell, use the Set-SPWebApplication cmdlet with the DisableSMTPEncryption parameter. For example:

```powershell
$WebApp = Get-SPWebApplication -IncludeCentralAdministration | ? { $_.IsAdministrationWebApplication -eq $true }
Set-SPWebApplication -Identity $WebApp -SMTPServer smtp.internal.contoso.com -DisableSMTPEncryption -OutgoingEmailAddress sharepoint@contoso.com -ReplyToEmailAddress sharepoint@contoso.com
```

[!NOTE] If SharePoint is configured to use SMTP connection encryption, it will only send email messages if it successfully negotiates connection encryption with the SMTP server. It will not fall back and send email messages unencrypted if connection encryption negotiation fails. If SharePoint is not configured to use SMTP connection encryption, it will always send email messages unencrypted, even if the SMTP server supports connection encryption. Using SMTP connection encryption does not enable SMTP authentication. SMTP requests are always sent anonymously. Site folders view For more info, see "Site folders" in The OneDrive Blog. Sites page pinning You can now pin sites that you see on the sites page. A pinned site shows at the top of the list of sites that you're following. Suite Navigation is themable You can now apply themes to your Suite Navigation. Use SMTP ports other than the default (25) To configure SharePoint to use a non-default SMTP port, open SharePoint Central Administration, browse to System Settings > Configure outgoing email settings, and set the SMTP server port to the port number of your SMTP server. To configure SharePoint to use a non-default SMTP port in PowerShell, use the Set-SPWebApplication cmdlet with the SMTPServerPort parameter. For example:

```powershell
$WebApp = Get-SPWebApplication -IncludeCentralAdministration | ? { $_.IsAdministrationWebApplication -eq $true }
Set-SPWebApplication -Identity $WebApp -SMTPServer smtp.internal.contoso.com -SMTPServerPort 587 -OutgoingEmailAddress sharepoint@contoso.com -ReplyToEmailAddress sharepoint@contoso.com
```

Related Topics What is SharePoint?
New and improved features in SharePoint Server 2016
ms.date: 10/18/2018 title: "Migrate content to OneDrive in Microsoft 365" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - IT_Sharepoint_Server_Top - SPMigration - M365-collaboration - m365initiative-migratetom365 ms.custom: - seo-marvel-apr2020 - admindeeplinkSPO description: "You can use the SharePoint Migration Tool to migrate content to OneDrive (for work or school accounts) for free." Migrate content to a OneDrive work or school account You can use either of the following tools to migrate files and folders on your computer or a network drive to your OneDrive work or school account for FREE! Migration Manager Go to the Migration center. Sign in with an account that has admin permissions for your organization. To learn more, see Get started with Migration Manager. SharePoint Migration tool Download and install the SharePoint Migration Tool for FREE. As an admin, you can also run the tool for your users. To learn more, see SharePoint and OneDrive Migration Tool. Microsoft FastTrack Microsoft offers the FastTrack service to help migrate your files and folders not only from file shares but also from Google Drive and Box. To get started, visit FastTrack.microsoft.com. Sign in, review the available resources, and submit a request for assistance. Migration service providers Your organization may have business needs that require you to use third-party services or applications to migrate enterprise content to Microsoft 365. Explore the professional services and applications available from the Microsoft Partner Center.
ms.date: 01/15/2020 title: Migrating OneNote folders with the SharePoint Migration Tool (SPMT) ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: troubleshooting ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - SPMigration - M365-collaboration - m365initiative-migratetom365 search.appverid: MET150 description: "How to migrate OneNote folders using the SharePoint Migration Tool (SPMT)." How the SharePoint Migration Tool (SPMT) migrates OneNote folders The SharePoint Migration Tool (SPMT) supports migrating your OneNote folders to Microsoft 365. But before migrating your OneNote folders, it's important to understand a little about their file structure. On your computer, a OneNote Notebook is presented as a normal folder. For each Notebook, there's a .onetoc2 file created under the root folder of the Notebook folder. You can have as many Notebooks as you want. If you create section groups in your Notebook, those groups are also presented as folders. Under each section group, you can create multiple sections, and each one of those sections is presented as a .one file in the file system. You can create multiple pages within a section, but the content of those pages will be contained in the same .one file as the section to which they belong. When you open the OneNote application, they appear like this: Folders are migrated to SharePoint as OneNote Notebook content rather than a normal folder with files. They'll appear in SharePoint like this: For this OneNote Notebook to appear in your Notebooks list in OneNote, select the Show Actions ellipses next to the notebook name in the Document library, then select Open > Open in app. Remember to first close the OneNote application for the already migrated notebooks.
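The Notebook layout described above can be sketched with a short Python model. The helper functions and the sample folder names here are illustrative, not part of SPMT: a folder is treated as a Notebook when its root contains a .onetoc2 file, and each .one file (at the root or inside a section-group subfolder) holds a section and its pages.

```python
import os
import tempfile

def is_onenote_notebook(folder: str) -> bool:
    """A folder reads as a OneNote Notebook when its root contains a
    .onetoc2 table-of-contents file, as described above."""
    return any(name.lower().endswith(".onetoc2") for name in os.listdir(folder))

def list_sections(folder: str):
    """Sections (with the pages they contain) live in .one files, either
    at the Notebook root or inside section-group subfolders."""
    sections = []
    for root, _dirs, files in os.walk(folder):
        sections += [os.path.join(root, f) for f in files if f.lower().endswith(".one")]
    return sections

# Build a miniature Notebook layout matching the description:
#   Notebook/
#     Open Notebook.onetoc2
#     Section1.one
#     Section Group A/
#       Section2.one
base = tempfile.mkdtemp()
nb = os.path.join(base, "Notebook")
os.makedirs(os.path.join(nb, "Section Group A"))
for rel in ("Open Notebook.onetoc2", "Section1.one",
            os.path.join("Section Group A", "Section2.one")):
    open(os.path.join(nb, rel), "w").close()

print(is_onenote_notebook(nb))   # → True
print(len(list_sections(nb)))    # → 2
```

This also shows why SPMT migrates such folders as Notebook content rather than as plain folders with files: the .onetoc2 marker distinguishes them from ordinary directories.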
title: "Migration Assessment Scan IRM Enabled Lists" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true ms.date: 9/12/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - IT_SharePoint_Hybrid_Top - IT_Sharepoint_Server_Top - Strat_SP_gtc - SPMigration - M365-collaboration - m365initiative-migratetom365 ms.custom: ms.assetid: fce14caf-dc41-485d-91c6-4d533c8d1097 description: "Learn how to mitigate issues with IRM enabled lists during migration." Migration Assessment Scan: IRM Enabled Lists Learn how to mitigate issues with IRM enabled lists during migration. Overview Information Rights Management (IRM) is a feature that lets you encrypt content when a user accesses it to ensure it can't be forwarded or manipulated. The files are stored in an unencrypted format in SharePoint. When a user accesses a file in an IRM protected list, the file is protected prior to transit. The file can only be opened in an IRM-supported client application such as Microsoft Office. There are two main components to the IRM migration process: Configure the target environment to support Microsoft Entra Rights Management. Disable IRM on the source and target SharePoint libraries. This is required as the migration tooling will access the files in the same manner as a user. If IRM is enabled on the source, the migration tooling will receive an encrypted file and upload that encrypted file to the target environment. This results in a file that can no longer be opened successfully. Data Migration IRM settings associated with lists and libraries aren't migrated. The following process is required to enable the migration tooling to properly handle IRM protected libraries. This process ensures that the content is transferred and accessible post migration. Disable IRM on the source and target list.
Migration tooling will copy the files from the source and place them in the target. Enable IRM on the source and target list. [!IMPORTANT] Any site that is configured as "No Access" (locked) in SharePoint will be skipped. To see a list of locked site collections, see the Locked Sites scan output. Preparing for Migration IRM will need to be configured for SharePoint. IRM will need to be disabled on the source list prior to the migration event for that site collection. Post Migration Enable IRM on the migrated content list. Perform the following steps to ensure documents in IRM protected libraries are protected. Download a document from an IRM protected list. Open the document on the client machine. If the document is protected, there will be a status displayed beneath the ribbon. Scan Result Reports The following table describes the columns in the IRMEnabledLibrary-detail.csv report. This scan report contains lists and libraries that have IRM enabled. If IRM is disabled on the farm, the scan won't execute, and the output file will indicate this. Column | Description SiteId Unique identifier of the impacted site collection. SiteURL URL to the impacted site collection. SiteOwner Owner of the site collection. SiteAdmins List of people listed as site admins. SiteSizeInMB Size of the site collection in megabytes (MB). NumOfWebs Number of webs that exist in the site collection. ContentDBName Name of the content database hosting the site collection. ContentDBServerName SQL Server hosting the content database. ContentDBSizeInMB Size of the content database hosting the site collection. LastContentModifiedDate DateTime the site collection had content modified. TotalItemCount Total number of items found in the site collection. Hits Number of requests logged for the site collection. Relies on data from the usage logging service. If the usage logging service is disabled this row will show NA. DistinctUsers Number of distinct users that have accessed the site collection.
Relies on data from the usage logging service. If the usage logging service is disabled this row will show NA. DaysOfUsageData Number of days the usage logging service retains data. This provides context for Hits and DistinctUsers. For example, if this is 14 days, the Hits and DistinctUsers data is for the last 14 days. ListTitle Title of the list or library with IRM enabled. URL URL to the default list view. ItemCount Number of items in the list. ScanID Unique identifier assigned to a specific execution of the SharePoint Migration Assessment Tool.
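As a sketch of how this report might be consumed when planning the IRM disable/re-enable steps, the Python below reads rows shaped like IRMEnabledLibrary-detail.csv and orders the IRM-enabled libraries by item count so the largest can be scheduled first. The sample data uses only a subset of the documented columns, and the sample values and ordering logic are illustrative, not part of the assessment tool:

```python
import csv
import io

# A two-row sample shaped like IRMEnabledLibrary-detail.csv, using a
# subset of the columns documented above (values are made up).
sample = io.StringIO(
    "SiteId,SiteURL,ListTitle,URL,ItemCount,ScanID\n"
    "1,https://portal/sites/hr,Policies,https://portal/sites/hr/Policies,120,abc\n"
    "2,https://portal/sites/legal,Contracts,https://portal/sites/legal/Contracts,30,abc\n"
)

rows = list(csv.DictReader(sample))

# Libraries to disable IRM on before the migration event, largest first.
worklist = sorted(rows, key=lambda r: int(r["ItemCount"]), reverse=True)
for r in worklist:
    print(r["SiteURL"], r["ListTitle"], r["ItemCount"])
```

The same pattern works against the real report file by replacing the in-memory sample with `open("IRMEnabledLibrary-detail.csv", newline="")`.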
title: "Migration Assessment Scan Web Application Policies" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true ms.date: 7/5/2017 audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - IT_SharePoint_Hybrid_Top - IT_Sharepoint_Server_Top - SPMigration - M365-collaboration - m365initiative-migratetom365 ms.custom: ms.assetid: d538651b-75e4-4221-9ea1-c2d0be1e0589 description: "Learn how to fix issues with Web Application policies during migration." Migration Assessment Scan: Web Application Policies Learn how to fix issues with Web Application policies during migration. Overview In the source environment, there are typically discrete web applications for Team, Portal, Partner, and MySite (OneDrive). SharePoint Server allows the use of web application policies to grant or deny blanket-level permissions to entire web applications. These permissions override any permissions set at the site collection, site, list/library, or item level. The target environment uses a single web application to host all site collections. We do not currently offer a permission feature that applies uniquely to specific root site names and all child items together. Data Migration None of the web application policies are migrated to the target environment. [!IMPORTANT] Any site that is configured as "No Access" (locked) in SharePoint will be skipped. To see a list of locked site collections, see the Locked Sites scan output. Preparing for Migration Web application policies are not migrated. Some alternatives at this time include: Change administrative procedures to manage all permissions at the site collection level (this can be performed via Tenant Admin) instead of using web application policies. Use licensing to grant or limit specific capabilities to specific users and groups.
Post Migration Ensure the alternative options function correctly during the User Acceptance Testing phase. Scan Result Reports WebApplicationPolicy-detail.csv This scan report lists all policies for all of your web applications. Column | Description WebApplication The source web application. PolicyDisplayName Display Name of the user or group. PolicyUserName The login ID of the user or group. PolicyRoleBinding Permission granted to the user or group in the source. ScanID Unique identifier assigned to a specific execution of the SharePoint Migration Assessment Tool.
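Because these policies must be re-created as site-collection permissions after migration, it can help to group the report rows by web application for review. The Python sketch below uses the columns documented above with made-up sample data; the grouping logic is illustrative, not part of the assessment tool:

```python
import csv
import io
from collections import defaultdict

# Sample rows shaped like WebApplicationPolicy-detail.csv
# (column names from the table above; values are made up).
sample = io.StringIO(
    "WebApplication,PolicyDisplayName,PolicyUserName,PolicyRoleBinding,ScanID\n"
    "https://portal,Search Crawler,domain\\svc_crawl,Full Read,abc\n"
    "https://portal,Farm Admins,domain\\spadmins,Full Control,abc\n"
    "https://mysite,Search Crawler,domain\\svc_crawl,Full Read,abc\n"
)

by_webapp = defaultdict(list)
for row in csv.DictReader(sample):
    by_webapp[row["WebApplication"]].append(
        (row["PolicyDisplayName"], row["PolicyRoleBinding"])
    )

# Each grant listed here needs an equivalent permission assignment
# at the site collection level in the target environment.
for webapp, grants in sorted(by_webapp.items()):
    print(webapp, grants)
```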
ms.date: 03/31/2021 title: "Troubleshooting Migration Manager Box" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - M365-collaboration - SPMigration - m365initiative-migratetom365 search.appverid: MET150 description: "Troubleshooting the Migration Manager Box feature." Troubleshooting after your Box migration Review these areas if you're experiencing issues with your Box migration. Automatic reruns When a run ends, an automatic rerun may occur if the conditions listed under each scenario are met. Scenario | Conditions The task is being scanned OR migrated for the first time | When a task is first scanned or migrated, it may trigger reruns. When a task scan is started and then canceled, scanning that task again will NOT trigger reruns, because it wasn't the first time the task was scanned. More automatic reruns are still available | A task is automatically rerun a maximum of three times. A first task scan/migrate action can result in triggering a total of four transactions: the original transaction (run), and three more attempts (reruns). Six reruns can be triggered at the most: 3 for the initial scan, and 3 for the initial migration. Last transaction status codes | An automatic rerun may occur if the last transaction ends with any of the following status codes: 201, 202, 210, 220, 211, 401, 403, 404, 405, 406, or 491. Canceling a transfer A transfer can be canceled under the following conditions: - The task is "Queued", and has a status code 600 or 601. OR - The task is "Running", and has a status code 620 or 300. Incremental feature Our incrementals are delta operations that compare files in your source to files in Microsoft 365. Using this comparison, we copy anything that is new or has changed.
This lets us keep Microsoft 365 data up to date when the final cut-over of users occurs. These incremental passes are an important part of our process. Technical clarification: We compare what you have in your source to what is in Microsoft 365. We only transfer items that don't already exist in the destination, or that have a newer timestamp in the source. 'Lost files' During a transition where sharing paradigms change, there are many users who claim, “My files are lost!” This assumption is common when users haven't received clear communication about how the sharing structure changes when they sign in to Microsoft 365. A clear communication strategy helps users understand the changes.
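The automatic-rerun rules described above can be expressed as a small predicate. The status codes and the three-rerun limit come from the table in this article; the function itself is a hypothetical sketch, not Migration Manager code:

```python
# Status codes after which an automatic rerun may occur, per the table above.
RERUN_STATUS_CODES = {201, 202, 210, 220, 211, 401, 403, 404, 405, 406, 491}
MAX_RERUNS = 3  # a task is automatically rerun a maximum of three times

def may_rerun(last_status: int, reruns_so_far: int) -> bool:
    """True when another automatic rerun may still be triggered for a task."""
    return reruns_so_far < MAX_RERUNS and last_status in RERUN_STATUS_CODES

print(may_rerun(401, 0))  # → True
print(may_rerun(401, 3))  # → False (rerun budget exhausted)
print(may_rerun(200, 0))  # → False (status not in the rerun list)
```

Since scans and migrations each carry their own budget of three reruns, a task can accumulate at most six automatic reruns in total, matching the table.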
ms.date: 01/21/2021 title: "Step 3: Copy to migrations tab for Egnyte migration" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration localization_priority: Priority ms.collection: - m365solution-migratefileshares - m365solution-migratetom365 - m365solution-scenario - M365-collaboration - SPMigration - highpri - m365initiative-migratetom365 search.appverid: MET150 ROBOTS: NOINDEX description: "Learn about the third step in using Migration Manager to migrate Egnyte." Step 3: Copy to migrations After an Egnyte account has been scanned and determined ready, add it to your migration list. Select the Users tab. The table will list all users that have been copied to migration. Select the users that are ready to be added to the Users migrations list. Select Copy to User migrations. Review the settings. Only content matching these settings will be migrated to your target destination. Select Customize settings if you want to change any of the settings. Select Edit for each area you want to update. :::image type="content" source="media/mm-file-folder-filters.png" alt-text="select edit to update any migration setting"::: Select Go to User Migrations, and proceed to the next step. Step 4: Review destinations [!NOTE] Migration Manager Egnyte isn't available for users of Office 365 operated by 21Vianet in China. This feature is also not supported for users of the Government Cloud, including GCC, Consumer, GCC High, or DoD.
OfficeDocs-SharePoint/migration/mm-egnyte-step3-copy-to-migrations.md/0
Step 3: Copy to migrations
OfficeDocs-SharePoint/migration/mm-egnyte-step3-copy-to-migrations.md
OfficeDocs-SharePoint
437
82
ms.date: 08/11/2023 title: "Overview: Migrate Google Workspace to Microsoft 365 with Migration Manager" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - m365solution-migratetom365 - m365solution-scenario - M365-collaboration - SPMigration - highpri - m365initiative-migratetom365 ms.custom: admindeeplinkSPO search.appverid: MET150 description: Overview of how to migrate from Google Workspace to Microsoft 365 with Migration Manager. Migrate Google Workspace to Microsoft 365 with Migration Manager Collaborate all in one place by migrating your Google Workspace files, metadata, and permissions to OneDrive and SharePoint in Microsoft 365. How does it work? Step 1: Connect to Google. Sign in to your Google account and install the Microsoft 365 migration app from the Google Workspace Marketplace. Step 2: Scan and assess. Add Google Drives for scanning. Once the scans are complete, download the scan reports to investigate any possible issues that might block your migration. Step 3: Copy to Migrations list. After a Google Drive has been scanned as "Ready to migrate", add it to your migration list. Step 4: Review destination paths. We automatically map source paths to any exactly matching destination paths. Ensure content is being copied to the right place by reviewing and modifying each destination path as needed. Step 5: Map identities. Map your groups and users in Google Drive to those in Microsoft 365 to migrate metadata and permissions correctly. Step 6: Migrate and monitor. After reviewing your migration setup, migrate your Google Drives and monitor the progress. 
[!Tip] Watch this video to help get started: Migrate Google files to Microsoft 365 with Migration Manager Get started To get started: Navigate to Microsoft 365 Admin Center Home > Setup > Migration and imports, select Google Drive or Google Workspace to create a migration project, and sign in with an account that has admin permissions for your organization. Make sure that you have: Access to the destination: You must be either a global admin or a OneDrive/SharePoint admin in the Microsoft 365 tenant where you want to migrate your content. Access to the source: Have Google account credentials that have read access to any Google user account you plan to migrate. Prerequisites installed: Make sure you have the necessary prerequisites installed. Google Shared Drives and permissions Google Shared Drives can now be discovered and migrated normally. Google Shared Drive permissions are migrated according to what you have set in Project settings, under the general permission setting. Folder permissions are migrated by default. File permissions are migrated on demand. We recommend the following steps when migrating permissions in your shared drive: Recreate a Microsoft 365 group with the same memberships as the Google Drive group. You can either create a new group or edit the group linked to the Team site designated as the migration destination for the Google Shared Drive. In the 'Map Identities' setting, map the original Google Drive group of the shared drive to the Microsoft 365 group. What isn't migrated Google Sites and Google Maps migration aren't supported, while Google Docs/Sheets/Slides/Forms files are migrated as equivalent file types in Microsoft 365. Learn more about the unsupported files File size of Google proprietary files Google only started calculating the size of its proprietary files, including Google Docs, Sheets, Forms, and Slides, on May 2, 2022. 
Any Google proprietary files created and modified before May 2, 2022 won't include file size in the metadata we get from the API calls. As a result, all Google proprietary files created before May 2, 2022 default to a scanned size of 1 byte and are reported as such in our ScanSummary report. Files marked as restricted Google Workspace/Drive allows owners to control the ability for users to copy, download, or print files on a per-file basis. By default, this feature is enabled for each file. To ensure a successful migration, this setting must remain enabled. Disabling it may result in the following error when migrating a file owned by another user: Permissions issue: File marked as restricted or not copyable To enable this setting: 1. Navigate to the Share panel for the file. 1. Select the Settings icon located at the top-right corner. 1. Select the checkbox for the setting "Viewers and commenters can see the option to download, print, and copy."
OfficeDocs-SharePoint/migration/mm-google-overview.md/0
Migrate Google Workspace to Microsoft 365 with Migration Manager
OfficeDocs-SharePoint/migration/mm-google-overview.md
OfficeDocs-SharePoint
1,018
83
title: "Project settings in Migration Manager" ms.date: 11/15/2023 ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - M365-collaboration - SPMigration - m365initiative-migratetom365 search.appverid: MET150 description: Learn about configuring project settings in Migration Manager. Project settings in Migration Manager Project settings in Migration Manager are easily accessed from the menu bar at the top of your screen. :::image type="content" source="media/mm-project-settings-toolbar.png" alt-text="menu bar with project settings option"::: The settings are designed to support each cloud provider. Depending on which cloud provider you're migrating from, you may see different options. :::image type="content" source="media/mm-project-settings-tab-names.png" alt-text="just the tab names of the settings categories":::

| Setting tab | Description |
|:-----|:-----|
| File & folder filters | You can limit what is migrated by customizing the settings on this page. Specify whether invalid characters are allowed in file or folder names, exclude specific file types or folder names, or filter by date created and date modified. |
| Permissions | These settings ensure that the same users with access to files, folders, and metadata continue to have access after migration. Learn more: Permission settings |
| Project details | Edit your project, find your Project ID, or disconnect from your source. |
| Advanced | If you're migrating Google Sheets, this setting allows a scan to identify incompatible formulas and embedded links, which may affect the converted Excel files. Learn more: Scan Google Sheet spreadsheets. Advanced features are being developed to include other cloud migrations. |

[!Note] These settings apply to all migrations unless you customize them individually. 
Changes won't be applied to migrations in progress.
OfficeDocs-SharePoint/migration/mm-project-settings.md/0
Project settings in Migration Manager
OfficeDocs-SharePoint/migration/mm-project-settings.md
OfficeDocs-SharePoint
481
84
ms.date: 01/07/2019 title: "Migrate from My Sites to OneDrive in Office 365" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - IT_Sharepoint_Server_Top - SPMigration - M365-collaboration - m365initiative-migratetom365 ms.custom: - seo-marvel-apr2020 search.appverid: MET150 description: "Learn how to evaluate the environment, prepare, and migrate content from My Sites to OneDrive in Microsoft 365." My Sites to OneDrive migration guide This guide will help you prepare to migrate from My Sites to OneDrive in Microsoft 365. Most migrations include these phases: planning, assessing and remediating, preparing the target environment, migrating, and onboarding users.

| Migration planning | Assess and remediate | Prepare your OneDrive environment | Migrate | User onboarding |
|:-----|:-----|:-----|:-----|:-----|
| What content goes where<br>Understanding permissions vs. sharing<br>What to expect before and after<br>Migration and network performance considerations<br>Change management and communications | Run SMAT<br>Assess key areas<br>Remediate issues | Pre-provision Microsoft 365 and users | Migration steps<br>Configure SharePoint hybrid<br>Migration service providers | Send regular emails to users<br>Provide training<br>Tell users how they're affected<br>Provide documentation for making the switch |

Migration planning Before you begin migration, assess your current source environment. What you discover will influence your overall strategy and timing, including: The mapping of content from your source My Sites to the destination OneDrive. The amount of content to migrate. Determine what content is redundant, out of date, or still relevant. Set permissions so IT can read/write from the source to the destination. We strongly recommend that you consider setting up a hybrid environment. To learn more, see SharePoint server hybrid configuration roadmaps. What migrates? 
When you migrate to OneDrive by using the SharePoint Migration Tool, you'll migrate content from your My Sites document library into OneDrive. Assess and remediate your content Before you start your migration, it's important that you analyze your current environment. Only you know your data, how it's used, and who uses it. Think about how and what My Sites features you use in production. An assessment can begin by working with your users to identify older content: determine whether content is obsolete or redundant and can be deleted. Using the SharePoint Migration Assessment Tool The SharePoint Migration Assessment Tool (SMAT) is a simple command-line tool that scans the contents of your SharePoint Server 2010, 2013, or 2016 farm to help identify any issues before you migrate your content. After the scan is complete, SMAT generates summary and detailed reports that identify areas that could affect your migration. SMAT includes the SharePoint Migration Identity Management Tool, which does identity mapping by scanning SharePoint, Active Directory, and Microsoft Entra ID. SMAT scans many areas. The following table shows common areas of concern when migrating from My Sites. Your environment isn't affected when SMAT performs its scan.

| Scan | Description |
|:-----|:-----|
| File versions | The more versions of a file you have, the longer it will take to migrate. Note: By default, versioning is enabled for all lists and libraries on the target platform. In the destination SharePoint site, there's no limit when versioning is enabled. See Migration assessment scan: File versions |
| Large lists | Lists of more than 20,000 items may cause migration issues, making it more difficult to predict how long migrating these sites will take. List data will still migrate, but the larger the list, the more unpredictable the migration process. Extremely large lists can cause extended migration. See Migration assessment scan: Large lists |
| Long OneDrive URLs | Content with long URLs that exceed a limit will be skipped. They won't migrate. See Migration assessment scan: Long OneDrive URLs |
| Checked-out files | Only checked-in content will be migrated. Make sure that users check in their files before migration to avoid data loss. See Migration assessment scan: Checked-out files |
| Large Excel files | If you try to open a file larger than 10 MB from OneDrive (online), you'll be prompted to open the file in the Excel client. See Migration assessment scan: Large Excel files |
| Large list views | In your My Site, you can configure list-view throttling so the throttle on views is lifted during certain hours of the day. In OneDrive, the limit is in place around the clock. While your lists and data will still be migrated, some of your list views may be throttled. See Migration assessment scan: Large list views |
| Browser file handling | SharePoint Server allows settings that range from "strict" to "permissive." But in SharePoint and OneDrive in Microsoft 365, the "strict" setting is enforced and can't be modified. All data will be migrated, but the behavior with HTM and HTML files will change from opening within the browser to prompting the user to download. See Migration assessment scan: Browser file handling |
| InfoPath | InfoPath lets developers build custom forms to accept user input in various locations throughout SharePoint. Note that some features of custom InfoPath forms won't be migrated. See Migration assessment scan: InfoPath |

Prepare your OneDrive environment Before you migrate your My Sites content, you must pre-provision your users in OneDrive: Prepare to provision users through directory synchronization to Microsoft 365. Provisioning users with directory synchronization requires more planning and preparation than simply managing your work or school account directly in Microsoft 365. 
These additions ensure that your on-premises Active Directory synchronizes properly to Microsoft Entra ID. Pre-provision OneDrive for users in your organization. By default, the first time that a user browses to their OneDrive, it's automatically provisioned for them. In some cases, such as when your organization plans to migrate from your on-premises My Sites, you'll want your users' OneDrive locations ready beforehand (pre-provisioned). Configure Microsoft 365 for SharePoint hybrid (optional). With SharePoint Server hybrid, productivity services in SharePoint in Microsoft 365 can be integrated with on-premises SharePoint Server to provide unified functionality and access to data. For enterprises that want to gradually move their existing on-premises SharePoint Server services to the cloud, SharePoint Server hybrid provides a staged migration path by extending high-impact SharePoint Server workloads to SharePoint. A SharePoint Server hybrid environment enables trusted communication between SharePoint in Microsoft 365 and SharePoint Server. After you establish this trust framework, you can configure integrated functionality between services and features such as Search, Follow, and user profiles. You need to set up basic integration between Microsoft 365 for enterprises and SharePoint Server before you can configure a hybrid environment. Migrate Use the SharePoint Migration Tool to easily migrate your existing My Sites to OneDrive. Install and launch the SharePoint Migration Tool. You'll select the bulk migration option using the .json or .csv file that you created. For details, see Using the SharePoint Migration Tool. Create a mapping file. Create a mapping file with source and target paths and save it as .csv. For details, see How to format your JSON or CSV for data content migration. Migration best practices The following information describes a typical migration process that follows Microsoft best-practices guidance. 
Select a small set of users for a pilot migration. The goals of the pilot are to validate the process, including performance and user communication, and to get a sample of user feedback. Run the pilot migration. Use an incremental migration method that runs in the background with no user impact, followed by a cutover event in which users' on-premises My Sites accounts are disabled. Direct users to the target OneDrive environment. This method is preferred, as it reduces user impact. Assess the data from the pilot migration to determine the rest of your migration schedule, and make any changes. For example, you might update your user communication template to address questions you received from pilot users. Do the rest of the migration. Use an incremental migration method, just like the pilot. We recommend a single cutover event for all users to switch to OneDrive and then disable their My Sites accounts. This approach helps eliminate any confusion resulting from users having to collaborate by using both My Sites and OneDrive at the same time. User adoption Develop a plan to prepare your users for the upcoming change. Consider these factors: Evangelize the move: Emphasize the benefits, collaborative capabilities, and reasons for the move. End-user training: Provide training to your users on OneDrive features. Train your helpdesk: Before the cutover, train your helpdesk in key features and common user questions. Downtime: Prepare for any possible downtime that the migration may involve. Communicate: Develop a plan for sending communications to your users. Provide clear statements about timing, expectations, and impact to individuals. Be public about the timeline: Publish the migration timeline with details about user impacts. Include any user calls to action. Reassure your users: Assure users that content already in OneDrive is safe and won't be overwritten. Opting out: Tell users whether they can opt out of the migration process. 
Adoption-related resources Microsoft 365 Adoption Guide: Outlines methodology and resources for implementing proven adoption success factors. OneDrive Adoption: This Resource Center serves as your one stop shop for all adoption and change management-related content. Make the switch! The following articles can help your users "make the switch" from My Sites to OneDrive. They show how to do common tasks in OneDrive. Upload and save files and folders to OneDrive Manage files and folders in OneDrive Collaborate in OneDrive Set up your mobile apps Stay connected with OneDrive Advanced support Your enterprise may have specific business needs that require you to use third-party services or applications to help with your migration to Microsoft 365. Explore the professional services and applications available from partners in the Microsoft Partner Center.
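As a closing illustration of the Migrate step described above, a bulk-migration mapping file for the SharePoint Migration Tool is a headerless CSV in which each row pairs a source location with a target location. The six-column layout shown here (Source, SourceDocLib, SourceSubFolder, TargetWeb, TargetDocLib, TargetSubFolder) and all URLs are assumptions for illustration only; confirm the exact format in "How to format your JSON or CSV for data content migration" before use.

```python
import csv

# Hypothetical mapping rows: every URL and the six-column layout are
# illustrative assumptions -- verify against the SPMT formatting docs.
rows = [
    ["http://sp2016/my/personal/user1", "Documents", "",
     "https://contoso-my.sharepoint.com/personal/user1_contoso_com", "Documents", ""],
    ["http://sp2016/my/personal/user2", "Documents", "",
     "https://contoso-my.sharepoint.com/personal/user2_contoso_com", "Documents", ""],
]

with open("spmt-mapping.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)  # SPMT bulk CSV files have no header row
```

Generating the file from a script like this scales better than hand-editing when you're migrating hundreds of My Sites, and it makes the mapping reviewable in source control.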
OfficeDocs-SharePoint/migration/mysites-to-onedrive-migration-guide.md/0
My Sites to OneDrive migration guide
OfficeDocs-SharePoint/migration/mysites-to-onedrive-migration-guide.md
OfficeDocs-SharePoint
2,184
85
ms.date: 11/09/2022 title: "SharePoint Migration Tool assessment risk errors" ms.reviewer: ms.author: heidip author: MicrosoftHeidi manager: jtremper recommendations: true audience: ITPro f1.keywords: - NOCSH ms.topic: article ms.service: microsoft-365-migration ms.localizationpriority: high ms.collection: - IT_Sharepoint_Server_Top - Strat_SP_gtc - SPMigration - M365-collaboration - m365initiative-migratetom365 ms.custom: - seo-marvel-mar2020 description: "Learn about the risk assessment errors in the SharePoint Migration Tool (SPMT)." SharePoint Migration Tool (SPMT) scan assessment error codes The scan assessment feature is available only in SPMT 4.0, which is currently in public preview. To download, see SPMT 4.0 public preview. To learn more about scan assessment, see Scan and assess a SharePoint Server site with SPMT.

| Error code | Description | User action | Explanation |
|:-----|:-----|:-----|:-----|
| FEATURE_UNSUPPORTED | Feature {0} isn't supported | | SPMT doesn't support custom features. For example, uploading a solution to SharePoint is a custom feature. |
| UNKNOWN_CONTENT_TYPE | | | SPMT doesn't support custom content types. |
| WORKFLOW_UNSUPPORTED | Workflow {0} in {1} isn't supported | | SPMT doesn't support custom workflows, and supports only some out-of-the-box (OOTB) workflows. Once SharePoint Designer workflow migration is enabled, only a few OOTB workflows aren't supported. |
| LIST_TEMPLATE_UNSUPPORTED | List template isn't supported | | SPMT doesn't support hidden taxonomy lists and style libraries. However, this limitation doesn't impact term store or site migration. |
| THICKET_FOLDER_UNSUPPORTED | Thicket folder {0} isn't supported | Rename the folder. | Folders that end with _file or _files are treated as thicket folders, which can't be created in SharePoint Online. When you save a file as an HTM file that contains content such as graphics, SharePoint creates a hidden linked folder (thicket) to store the additional content that can't exist in the HTM file itself. To avoid confusion between thicket folders and user-created folders, SharePoint doesn't allow these folders to be created. |
| PAGE_UNSUPPORTED | Web part {0} isn't supported | | SPMT doesn't support custom pages. The exceptions are wiki and web part pages. |
| CUSTOM_SOLUTION_UNSUPPORTED | Solution {0} isn't supported | | SPMT doesn't support custom solutions. |
| SITE_LOAD_FAILURE | Invalid site URL '{0}' | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when loading site info. This error isn't a credential issue. |
| WEB_LOAD_FAILURE | Couldn't get web information in site '{0}' | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when loading site info. This error isn't a credential issue. |
| SITE_USER_ERROR | Can't access site users | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when fetching site users. |
| SITE_GROUP_ERROR | Can't access site groups | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when fetching site groups. |
| TERM_STORE_ERROR | Failed to scan the source term store | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when fetching term store info (term groups, term sets, or terms). |
| LIST_LOAD_FAILURE | Couldn't get list information | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when loading list info. This error isn't a credential issue. |
| LIST_ITEM_LOAD_FAILURE | Populate list item meta info failed | Check your SharePoint Server ULS log or SPMT scan log for more details, then try again. | This unknown error occurs when fetching list items. |
| CANNOT_ACCCESS | Can't access | Check your SharePoint Server ULS log or SPMT scan log for more details. If the auth is expired, reauthenticate and rescan. Also check whether the user has permission on this site, web, or list. | |
| LOOKUP_LIST_IN_UNKNOWN_WEB | Lookup column {0} in {1} refers to an unknown list out of the current scan scope. | Find the lookup list and include it within the migration scope by .json file. | |
| LOOKUP_LIST_IN_PARENT_WEB | Lookup column {0} in {1} refers to a list in its parent web. | You need to migrate the parent web and the current site. | |
| LIST_TARGET_AUDIENCE_ENABLED | Target audience is enabled. This list may be skipped during migration. | Turn on the migration setting 'Skip list with audience targeting enabled' to skip this list. | There's no audience targeting setting in SharePoint Online. If you don't skip the list, it's still migrated but the audience targeting is missing. |
| ITEM_COUNT_EXCEED_LIMIT | File count '{0}' exceeds limit '{1}' | | SharePoint can store up to 30 million items per list. If the item count exceeds 30 million, migration will fail. To learn more, see: Manage large lists and libraries |
| ITEM_COUNT_EXCEED_INDEX_LIMIT | File count '{0}' is too large to create index | | SharePoint Online has a 20,000 index creation limit. If the item count exceeds 20,000, adding a column index is prohibited in that library. SPMT gives a warning in that case but won't block migration. |
| LIST_VIEW_EXCEED_LIMIT | The list view shows more than {0} items. | | You may run into a list view threshold error in the classic experience. SharePoint Online has a 5,000-item list view limit. This threshold is a barrier to efficient searching but won't impact migration. |
| UNIQUE_PERMISSION_EXCEED_LIMIT | Unique permissions per list should remain below {0} in total | Reduce the number of uniquely permitted items in a list or library. | The supported limit of unique permissions for items in a list or library is 50,000. However, the recommended general limit is 5,000. Making changes to more than 5,000 uniquely permitted items at a time takes longer. Learn more: Error when breaking SharePoint inheritance |
| LIST_VIEW_LOOKUP_EXCEED_LIMIT | The number of people, lookup, and managed metadata columns exceeds the list view lookup threshold. | Modify the resource throttling setting in Central Administration. | Displaying or querying 12 or more columns of the following types can cause scan failure: people, lookup, and managed metadata. In Central Administration, select Manage web applications, select the web application, and then select Resource Throttling under General Settings. Increase the value of List View Lookup Threshold. |
| PAGES_BLOCKED_COZ_NO_SCRIPT | Web files might be blocked if custom script is disallowed on the SharePoint admin center settings page. | Turn on the migration setting 'Temporarily allow migration of scripts' (takes effect immediately), or go to the admin center classic settings page and choose 'Allow users to run custom script' (can take up to 24 hours). Learn more: Allow or prevent custom scripts | |
OfficeDocs-SharePoint/migration/spmt-scan-risk-codes.md/0
SharePoint Migration Tool (SPMT) scan assessment error codes
OfficeDocs-SharePoint/migration/spmt-scan-risk-codes.md
OfficeDocs-SharePoint
1,726
86
The first phase of Retrieval-Augmented Generation (RAG) development and experimentation is the preparation phase. During this phase, you first define the business domain for your solution. Once you have the domain defined, you begin the parallel process of gathering documents and sample questions that are pertinent to the domain. The steps are done in parallel because they're interrelated: the questions must be answerable by content in the documents, and the documents must answer relevant questions. While gathering the test documents and queries, perform an analysis of your documents to get a better understanding of their structure and content. This article is part of a series. Read the introduction. Determine solution domain The first step in this process is to clearly define the business requirements for the solution or the use case. These requirements help determine what kind of questions the solution intends to address and what source data or documents help address those questions. In later stages, the solution domain helps inform your embedding model strategy. Gather representative test documents In this step, you're gathering documents that are the best representation of the documents that you use in your production solution. The documents must address the defined use case and be able to answer the questions gathered in the question gathering parallel phase. Considerations Consider these areas when evaluating potential representative test documents: Pertinence - The documents must meet the business requirements of the conversational application. For example, if you're building a chat bot tasked with helping customers perform banking operations, the documents should match that requirement, such as documents showing how to open or close a bank account. The documents must be able to address the test questions that are being gathered in the parallel step. If the documents don't have the information relevant to the questions, the solution can't produce a valid response. 
Representative - The documents should be representative of the different types of documents that your solution will use. For example, a car insurance document is different from a health insurance or life insurance document. If the use case requires the solution to support all three types but you only have car insurance documents, your solution will perform poorly for both health and life insurance. You should have at least two documents for each variation. Physical document quality - The documents need to be in a usable shape. Scanned images, for example, might not allow you to extract usable information. Document content quality - The documents must have high content quality. There shouldn't be misspellings or grammatical errors. Large language models don't perform well if you provide them with poor quality content. The success factor in this step is being qualitatively confident that you have a good representation of test documents for your particular domain. Test document guidance Prefer real documents over synthetic. Real documents must go through a cleaning process to remove personally identifiable information (PII). Consider augmenting your documents with synthetic data to ensure you're handling all kinds of scenarios. If you must use synthetic data, do your best to make it as close to real data as possible. Make sure that the documents can address the questions that are being gathered. You should have at least two documents for each document variant. You can use large language models or other tools to help evaluate the document quality. Gather test queries In this step, you're gathering test queries that you'll use to evaluate your chunks, search solution, and your prompt engineering. You'll do this in lockstep with gathering the representative documents, as you're not only gathering the queries, you're also gathering how the representative documents address the queries. 
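The test document guidance above notes that real documents must be cleaned of personally identifiable information (PII) before they enter a test corpus. As a minimal sketch of that cleaning step, the snippet below masks email addresses and US-style phone numbers; both patterns are simplified assumptions, and a production pipeline should use a dedicated PII-detection service rather than hand-rolled regexes.

```python
import re

# Simplified, illustrative patterns -- they won't catch every format.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def scrub(text: str) -> str:
    """Replace detected PII with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

print(scrub("Contact jane.doe@contoso.com or 425-555-0100 for claims."))
```

Keeping placeholder tokens (rather than deleting the PII outright) preserves sentence structure, so chunking and retrieval experiments on the scrubbed corpus behave like they would on the originals.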
Having both the sample queries and the parts of the sample documents that address those queries allows you to evaluate every stage of the RAG solution as you're experimenting with different strategies and approaches. Gather test query output The output of this phase includes content from both the Gather representative test queries step and the Gather representative test documents step. The output is a collection containing the following data: Query - The question, representing a legitimate user's potential prompt. Context - A collection of all the actual text in the documents that addresses the query. For each bit of context, you should include the page and the actual text. Answer - A valid response to the query. The response might be content directly from the documents, or it might be rephrased from one or more pieces of context. Creating synthetic queries It's often challenging for the subject matter experts (SMEs) for a particular domain to put together a comprehensive list of questions for the use case. One solution to this challenge is to generate synthetic questions from the representative test documents that were gathered. The following is a real-world approach for generating synthetic questions from representative documents: Chunk the documents - Break down the documents into chunks. This chunking step doesn't use the chunking strategy for your overall solution; it's a one-off step used for generating synthetic queries. The chunking can be done manually if the number of documents is reasonable. Generate queries per chunk - For each chunk, generate queries either manually or using a large language model. When using a large language model, we generally start by generating two queries per chunk. The large language model can also be used to create the answer. The following example shows a prompt that generates questions and answers for a chunk. 
```text Please read the following CONTEXT and generate two question and answer json objects in an array based on the CONTEXT provided. The questions should require deep reading comprehension, logical inference, deduction, and connecting ideas across the text. Avoid simplistic retrieval or pattern matching questions. Instead, focus on questions that test the ability to reason about the text in complex ways, draw subtle conclusions, and combine multiple pieces of information to arrive at an answer. Ensure that the questions are relevant, specific, and cover the key points of the CONTEXT. Provide concise answers to each question, directly quoting the text from provided context. Provide the array output in strict JSON format as shown in output format. Ensure that the generated JSON is 100 percent structurally correct, with proper nesting, comma placement, and quotation marks. There should not be any comma after last element in the array. Output format: [ { "question": "Question 1", "answer": "Answer 1" }, { "question": "Question 2", "answer": "Answer 2" } ] CONTEXT: ``` Verify output - Verify that the questions are pertinent to the use case and that the answers address the question. This verification should be performed by a SME. Unaddressed queries It's important to gather queries that the documents don't address, along with queries that are addressed. When you test your solution, particularly when you test the large language model, you need to determine how the solution should respond to queries it doesn't have sufficient context to answer. Approaches to responding to queries you can't address include: Responding that you don't know Responding that you don't know and providing a link where the user might find more information Gather test queries guidance Determine whether there is a system that contains real customer questions that you can use. 
For example, if you're building a chat bot to answer customer questions, you might be able to use customer questions from your help desk, FAQs, or ticketing system.

The customer or SME for the use case should act as a quality gate to determine whether the gathered documents, the associated test queries, and the answers to the queries from the documents are comprehensive, representative, and correct. Review the body of questions and answers periodically to ensure that they continue to accurately reflect the source documents.

Document analysis

The goal of document analysis is to determine the following three things:

- What in the document you want to ignore or exclude
- What in the document you want to capture in chunks
- How you want to chunk the document

The following are some common questions you can ask when analyzing a document type that help you make those three determinations:

- Does the document contain a table of contents?
- Does the document contain images? Are they high-resolution images? What kind of data do you have on images? Are there captions for the images? Is there text embedded in the images?
- Does the document have charts with numbers?
- Does the document contain tables? Are the tables complex (nested tables) or noncomplex? Are there captions for the tables?
- Is there multi-column data or are there multi-column paragraphs? You don't want to parse multi-column content as though it were a single column.
- How many paragraphs are there? How long are the paragraphs? Are the paragraphs roughly equal in length?
- What languages, language variants, or dialects are in the documents?
- Does the document contain Unicode characters?
- How are numbers formatted? Are they using commas or decimals?
- Are there headers and footers? Do you need them?
- Are there copyrights or disclaimers? Do you need them?
- What in the document is uniform and what isn't uniform?
- Is there a header structure from which semantic meaning can be extracted?
- Are there footnotes or endnotes?
- Are there watermarks?
- Are there annotations or comments (for example, in PDFs or Word documents)?
- Are there other types of embedded media, like videos or audio?
- Are there any mathematical equations or scientific notations in the document?
- Are there bullets or meaningful indentations?

The answers to these questions help you identify the document structure, determine your chunking approach, and identify what content to chunk and what not to chunk.

Next steps

[!div class="nextstepaction"] Chunking phase

Related resources

- Automate document processing by using AI Document Intelligence
- Get started with the Python enterprise chat sample using RAG
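As a concrete companion to the synthetic-query steps earlier in this article, the following sketch validates the strict-JSON output that the example prompt requests and assembles the Query/Context/Answer records gathered in this phase. The helper names and record shape are illustrative assumptions, not part of any SDK.

```python
import json


def parse_qa_pairs(llm_output: str) -> list:
    """Validate the strict-JSON contract from the prompt: an array of
    exactly two objects, each with string 'question' and 'answer' fields."""
    pairs = json.loads(llm_output)
    if not (isinstance(pairs, list) and len(pairs) == 2):
        raise ValueError("expected a JSON array of exactly two objects")
    for p in pairs:
        if not isinstance(p, dict) \
                or not isinstance(p.get("question"), str) \
                or not isinstance(p.get("answer"), str):
            raise ValueError("each object needs string 'question' and 'answer' fields")
    return pairs


def build_test_records(chunk: dict, llm_output: str) -> list:
    """Combine each generated pair with its source chunk to form
    Query/Context/Answer records for evaluating the RAG solution."""
    return [
        {
            "query": p["question"],
            # Context keeps the page and the actual text, per the article.
            "context": [{"page": chunk["page"], "text": chunk["text"]}],
            "answer": p["answer"],
        }
        for p in parse_qa_pairs(llm_output)
    ]
```

An SME would still review the resulting records as the quality gate described above.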
This article describes how to extract insights from customer conversations at a call center by using Azure AI services and Azure OpenAI Service. Use these real-time and post-call analytics to improve call center efficiency and customer satisfaction.

Architecture

:::image type="content" source="_images/call-center-analytics.svg" alt-text="Diagram that shows the call center AI architecture." border="false" lightbox="_images/call-center-analytics.svg":::

Download a PowerPoint file of this architecture.

Dataflow

1. A phone call between an agent and a customer is recorded and stored in Azure Blob Storage. Audio files are uploaded to an Azure Storage account via a supported method, such as the UI-based tool, Azure Storage Explorer, or a Storage SDK or API.
2. Azure AI Speech is used to transcribe audio files in batch mode asynchronously, with speaker diarization enabled. The transcription results are persisted in Blob Storage.
3. Azure AI Language is used to detect and redact personal data in the transcript. For batch mode transcription and personal data detection and redaction, use the AI services Ingestion Client tool. The Ingestion Client tool uses a no-code approach for call center transcription.
4. Azure OpenAI is used to process the transcript and extract entities, summarize the conversation, and analyze sentiments. The processed output is stored in Blob Storage and then analyzed and visualized by using other services. You can also store the output in a datastore for keeping track of metadata and for reporting. Use Azure OpenAI to process the stored transcription information.
5. Power BI or a custom web application that's hosted by App Service is used to visualize the output. Both options provide near real-time insights. You can store this output in a customer relationship management (CRM) system, so agents have contextual information about why the customer called and can quickly solve potential problems. This process is fully automated, which saves the agents time and effort.
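The dataflow above can be sketched as a linear pipeline. The stage functions in this sketch are stand-ins for the actual Azure service calls (batch transcription, personal-data redaction, Azure OpenAI processing); they are illustrative assumptions, not real SDK APIs.

```python
from typing import Callable

# Each stage stands in for an Azure service call in the dataflow:
#   transcribe -> Azure AI Speech batch transcription (with diarization)
#   redact     -> Azure AI Language personal-data detection and redaction
#   analyze    -> Azure OpenAI entity extraction, summarization, sentiment
def run_call_analytics(audio_blob: str,
                       transcribe: Callable[[str], str],
                       redact: Callable[[str], str],
                       analyze: Callable[[str], dict]) -> dict:
    transcript = transcribe(audio_blob)  # persisted to Blob Storage in the real flow
    redacted = redact(transcript)        # personal data removed before LLM processing
    insights = analyze(redacted)         # summary, entities, sentiment
    return {"transcript": redacted, "insights": insights}
```

Keeping the stages as injected callables makes the ordering of the dataflow explicit and lets each service call be replaced or tested independently.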
Components

- Blob Storage is the object storage solution for raw files in this scenario. Blob Storage supports libraries for languages like .NET, Node.js, and Python. Applications can access files on Blob Storage via HTTP or HTTPS. Blob Storage has hot, cool, and archive access tiers for storing large amounts of data, which optimizes cost.
- Azure OpenAI provides access to the Azure OpenAI language models, including GPT-3, Codex, and the embeddings model series, for content generation, summarization, semantic search, and natural language-to-code translation. You can access the service through REST APIs, the Python SDK, or the web-based interface in the Azure OpenAI Studio.
- Azure AI Speech is an AI-based API that provides speech capabilities like speech-to-text, text-to-speech, speech translation, and speaker recognition. This architecture uses the Azure AI Speech batch transcription functionality.
- Azure AI Language consolidates the Azure natural-language processing services. For information about prebuilt and customizable options, see Azure AI Language available features. Language Studio provides a UI for exploring and analyzing AI services for language features. Language Studio provides options for building, tagging, training, and deploying custom models.
- Power BI is a software-as-a-service (SaaS) platform that provides visual and interactive insights for business analytics. It provides transformation capabilities and connects to other data sources.

Alternatives

Depending on your scenario, you can add the following workflows:

- Perform conversation summarization by using the prebuilt model in Azure AI Language.
- Depending on the size and scale of your workload, you can use Azure Functions as a code-first integration tool to perform text-processing steps, like text summarization on extracted data.
- Deploy and implement a custom speech-to-text solution.

Scenario details

This solution uses Azure AI Speech to convert audio into written text.
Azure AI Language redacts sensitive information in the conversation transcription. Azure OpenAI extracts insights from customer conversation to improve call center efficiency and customer satisfaction.

Use this solution to process transcribed text, recognize and remove sensitive information, and perform sentiment analysis. Scale the services and the pipeline to accommodate any volume of recorded data.

Potential use cases

This solution provides value to organizations in industries like telecommunications and financial services. It applies to any organization that records conversations. Customer-facing or internal call centers or support desks benefit from using this solution.

Considerations

These considerations implement the pillars of the Azure Well-Architected Framework, which is a set of guiding tenets that can be used to improve the quality of a workload. For more information, see Microsoft Azure Well-Architected Framework.

Reliability

Reliability ensures your application can meet the commitments you make to your customers. For more information, see Overview of the reliability pillar.

- Find the availability service-level agreement (SLA) for each component in SLAs for online services.
- To design high-availability applications with Storage accounts, see the configuration options.
- To ensure resiliency of the compute services and datastores in this scenario, use failure mode analysis for services like Azure Functions and Storage. For more information, see the resiliency checklist for Azure services.
- Back up and recover your Form Recognizer models.

Security

Security provides assurances against deliberate attacks and the abuse of your valuable data and systems. For more information, see Overview of the security pillar.

- Implement data protection, identity and access management, and network security recommendations for Blob Storage, AI services, and Azure OpenAI.
- Configure AI services virtual networks.
Cost optimization

Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies. For more information, see Overview of the cost optimization pillar.

The total cost of this solution depends on the pricing tier of your services. Factors that can affect the price of each component are:

- The number of documents that you process.
- The number of concurrent requests that your application receives.
- The size of the data that you store after processing.
- Your deployment region.

For more information, see the following resources:

- Azure OpenAI pricing
- Blob Storage pricing
- Azure AI Language pricing
- Azure Machine Learning pricing

Use the Azure pricing calculator to estimate your solution cost.

Performance efficiency

Performance efficiency is the ability of your workload to meet the demands placed on it by users in an efficient manner. For more information, see Overview of the performance efficiency pillar.

Processing high volumes of data can expose performance bottlenecks. To ensure proper performance efficiency, understand and plan for the scaling options to use with the AI services autoscale feature. The batch speech API is designed for high volumes, but other AI services APIs might have request limits, depending on the subscription tier. Consider containerizing AI services APIs to avoid slowing down large-volume processing. Containers provide deployment flexibility in the cloud and on-premises. Mitigate side effects of new version rollouts by using containers. For more information, see Container support in AI services.

Contributors

This article is maintained by Microsoft. It was originally written by the following contributors.

Principal authors:

- Dixit Arora | Senior Customer Engineer, ISV DN CoE
- Jyotsna Ravi | Principal Customer Engineer, ISV DN CoE

To see non-public LinkedIn profiles, sign in to LinkedIn.

Next steps

- What is Azure AI Speech?
- What is Azure OpenAI?
- What is Azure Machine Learning?
- Introduction to Blob Storage
- What is Azure AI Language?
- Introduction to Azure Data Lake Storage Gen2
- What is Power BI?
- Ingestion Client with AI services
- Post-call transcription and analytics

Related resources

- Use a speech-to-text transcription pipeline to analyze recorded conversations
- Deploy a custom speech-to-text solution
- Create custom language and acoustic models
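As the performance efficiency guidance notes, some AI services APIs enforce request limits depending on the subscription tier, so callers commonly retry throttled requests with exponential backoff. This generic sketch is not an Azure SDK feature; it assumes the callee raises a hypothetical Throttled exception (standing in for an HTTP 429 response).

```python
import time


class Throttled(Exception):
    """Illustrative stand-in for an HTTP 429 response from a rate-limited API."""


def call_with_backoff(fn, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry a throttled call with exponential backoff: 1s, 2s, 4s, ..."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Throttled:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            sleep(base_delay * (2 ** attempt))
```

Injecting the `sleep` function keeps the helper testable; production callers would typically also add jitter and honor any Retry-After header the service returns.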
This article describes four deployment patterns that you can choose from when you deploy Microsoft Fabric. Learn about considerations, recommendations, and potential nonreversible decisions for each deployment pattern. The following design areas are outlined for each Fabric deployment pattern:

- Governance
- Security
- Administration
- DevOps
- Usability
- Performance and scale
- Billing and cost management

Levels in a Fabric deployment

A Fabric deployment has four levels: tenant, capacity, workspace, and item. At the top level is the Fabric tenant. Each tenant can have one or more capacities, each capacity can contain one or more workspaces, and each workspace can contain zero or more Fabric items.

An organization's structure or objectives in the areas of security, scale, governance, and application lifecycle might influence its choice of deployment pattern. Different deployment patterns offer varying flexibility and emphasis in the levels of a deployment. For example, an organization can use domains to group workspaces in Fabric. Similarly, if an organization must have a centralized option that it can use to collaborate and to find content, a OneLake data hub in Fabric offers a centralized access point and is integrated with other familiar products, like Microsoft Teams and Excel. In Fabric, a large organization that has business units in separate geographical locations can use capacities to control where its data resides. It can manage a business unit that operates from a different geographical location as a single unit by using Fabric domains because domains can span workspaces that are in different regions.

For more information about Fabric levels and their role in choosing a deployment pattern, see Microsoft Fabric concepts and licenses.

How Fabric deployment patterns align

All Fabric deployment patterns:

- Use Fabric workspaces as boundaries for scale, governance, and security.
- Use Fabric domains for delegation, to manage multiple workspaces that might belong to the same business unit, or when data that belongs to a business domain spans more than one workspace. You can set some tenant-level settings for managing and governing data at the domain level and use domain-specific configuration for those settings.
- Use Fabric capacities to scale compute resources while provisioning dedicated capacities per workspace when specific performance levels must be met.
- Extend to use equivalent features from a Microsoft cloud (Microsoft Azure, Microsoft 365, and others) when a feature isn't available in Fabric.
- Use a OneLake data hub to promote discovery and the use of data assets.
- Use OneSecurity to set up data security policies for data assets.

Scenarios based on business requirements

This article uses the following scenarios to describe how each deployment pattern can address various business requirements:

- Scenario 1: For organizations that want to have faster (or slower) time to market by organizing teams that can cross-collaborate, with lower restrictions on data usage. In this scenario, an organization can benefit by using a monolithic deployment pattern. The organization operates in and manages a single workspace. For more information, see Pattern 1: Monolithic deployment.
- Scenario 2: For organizations that want to provide isolated environments for teams to work in, with a central team that is responsible for providing and managing infrastructure. This scenario also suits organizations that want to implement data mesh. In this scenario, an organization can implement multiple workspaces that either use a shared capacity or have separate capacities. For more information, see Pattern 2: Multiple workspaces backed by a single Fabric capacity and Pattern 3: Multiple workspaces backed by separate capacities.
- Scenario 3: For organizations that want an entirely decentralized model that gives business units or teams the freedom to control and manage their own data platforms. In this scenario, an organization can choose a deployment model in which it uses separate workspaces, each with dedicated capacity, or possibly with multiple Fabric tenants. For more information, see Pattern 3: Multiple workspaces backed by separate capacities and Pattern 4: Multiple Fabric tenants.
- Scenario 4: An organization might choose to use a hybrid approach in which it combines multiple patterns to achieve its requirements. For example, an organization might set up a single workspace for specific business units (a monolithic deployment pattern) while using separate, dedicated workspaces and separate capacities for other business units.

Pattern 1: Monolithic deployment

In this deployment pattern, you provision a single workspace to cater to all your use cases. All business units work within the same, single workspace.

:::image type="content" source="../_images/fabric-deployment-pattern-1-monolithic-deployment.svg" alt-text="Diagram that shows a single Fabric tenant that has a single capacity and a single workspace." border="false":::

When you provision a single Fabric capacity and attach a single workspace to it, the following points are true:

- All Fabric items share the same provisioned capacity. The amount of time a query or job takes to finish varies because other workloads use the same capacity.
- The workspace maximum capacity units (CUs) are limited to the largest possible F SKU or P SKU.
- For data engineering experiences, you can provision separate Spark pools to move the compute capacity that Fabric Spark requires outside of provisioned CUs.
- Features that are scoped to a workspace apply across all business units that share that workspace.
- All workspace items and data are in one region. You can't use this pattern for multi-geo scenarios.
- Features that rely on multiple workspaces, like deployment pipelines and lifecycle management, aren't available.
- Limitations that are associated with a single workspace apply.
- Capacity limitations that are associated with a specific SKU apply.

You might choose to implement this deployment pattern for one or more of the following reasons:

- Your organization doesn't have complex engineering requirements, it has a small user base, or its semantic models are small.
- Your organization operates in a single region.
- You're not primarily concerned with organizational separation between business units.
- Your organization doesn't require workspace-scoped features, such as sharing code repositories with Git.
- You want to implement a lakehouse medallion architecture. When your organization is limited to a single workspace, you can achieve separation between bronze, silver, and gold layers by creating separate lakehouses within the workspace.
- Your organization's business units share roles, and it's acceptable to have the same workspace-level permissions for users in the workspace. For example, when multiple users who belong to different business units are administrators of a single workspace, they have the same rights on all items in the workspace.
- Your organization can tolerate variable job completion times. If an organization doesn't have any requirements for performance guarantees (for example, a job must finish in a specific time period), it's acceptable to share a single provisioned capacity across business units. When a capacity is shared, users can run their queries at any time. The number of CUs that are available to run a job varies depending on what other queries are running on the capacity. This can lead to variable job completion times.
- Your organization can achieve all its business requirements (from a CU perspective) by using a single Fabric capacity.
The following table presents considerations that might influence your decision to adopt this deployment pattern:

| Aspect | Considerations |
| --- | --- |
| Governance | Lower governance mandates and restrictions on the platform are required. It suits smaller organizations that prefer faster time to market. Challenges might develop if governance requirements evolve to become more complex. |
| Security - Data plane | Data can be shared across teams, so there's no need to have restrictions on data between teams. Teams have ownership rights on the semantic models. They can read, edit, and modify data in OneLake. |
| Security - Control plane | All users can collaborate in the same workspace. There are no restrictions on items. All users can read and edit all items. |
| Administration | The organization has lower administration costs, no stringent need to track and monitor access and usage per team, and less stringent Fabric workload load monitoring across teams. |
| DevOps | DevOps benefits from a single release for the entire platform and less complicated release pipelines. |
| Usability - Administrators | It's easier for administrators to manage because they have fewer items to manage. There's no need for other provisioning or to handle requests from teams for new capacities or workspaces. Capacity administrators can be tenant administrators, so there's no need to create or manage other groups or teams. |
| Usability - Other roles | It's acceptable to share the workspace with other users. Collaboration among users is encouraged. |
| Performance | Isolation of workloads isn't mandatory. No strict performance service-level agreements (SLAs) need to be met. Throttling isn't likely. |
| Billing and cost management | One, single team can handle costs. There's no need to charge back to different teams. |

Pattern 2: Multiple workspaces backed by a single Fabric capacity

In this deployment pattern, you use separate workspaces.
Because a single capacity is shared across workspaces, workloads that run concurrently at any time might affect the performance of jobs and interactive queries.

:::image type="content" source="../_images/fabric-deployment-pattern-2-multiple-workspaces-single-capacity.svg" alt-text="Diagram that shows a single Fabric tenant that contains a single capacity and two workspaces." border="false":::

When you provision a single Fabric capacity and attach multiple workspaces to it, the following points are true:

- All Fabric items share the same provisioned capacity. The amount of time a query or job takes to finish varies because other workloads use the same capacity.
- The maximum CUs that a workspace can use is limited to the largest possible F SKU or P SKU.
- For data engineering experiences, you can provision separate Spark pools to move the compute capacity that Fabric Spark requires outside of provisioned CUs.
- Features that are scoped to a workspace apply across all business units that share that workspace.
- All workspace items and data are in one region. You can't use this pattern for multi-geo scenarios.
- You can use DevOps features that require separate workspaces, like deployment pipelines and lifecycle management.
- Limitations that are associated with a single workspace apply.
- Capacity limitations that are associated with a specific SKU apply.

You might choose to implement this deployment pattern for one or more of the following reasons:

- You want a hub-and-spoke architecture in which your organization centralizes some aspects of operating the analytics environment and decentralizes others.
- You want decentralization from an operational and management aspect, but to varying degrees. For example, you might choose to have the bronze and silver layers of a medallion architecture deployed to one workspace and the gold layer deployed to a different workspace.
Your rationale might be that one team is responsible for the bronze and silver layers and a different team is responsible for operating and managing the gold layer.

- You aren't primarily concerned about performance management and isolating workloads from a performance perspective.
- From the perspective of a lakehouse medallion architecture, your organization can create separate workspaces to implement bronze, silver, and gold layers.
- Your organization doesn't need to deploy workloads across different geographical regions (all data must reside in one region).
- Your organization might require separation of workspaces for one or more of the following reasons:
  - Members of the team that is responsible for workloads are in different workspaces.
  - You want to create separate workspaces for each type of workload. For example, you might create a workspace for data ingestion (data pipelines, dataflow Gen2, or data engineering) and create a separate workspace for consumption through a data warehouse. This design works well when separate teams are responsible for each of the workloads.
  - You want to implement a data mesh architecture in which one or more workspaces are grouped together in a Fabric domain.
  - Your organization might choose to deploy separate workspaces based on data classification.

The following table presents considerations that might influence your decision to choose this deployment pattern:

| Aspect | Considerations |
| --- | --- |
| Governance | Medium governance mandates and restrictions on the platform are required. The organization needs more granular control to govern departments, teams, and roles. |
| Security - Data plane | Data restrictions are required, and you need to provide data protection based on access controls for departments, teams, and members. |
| Security - Control plane | To avoid accidental corruption or actions by malicious users, you might need to provide controlled access on Fabric items by role. |
| Administration | You don't need to manage capacities because it's a single-capacity model. You can use workspaces to isolate departments, teams, and users. |
| DevOps | You can do independent releases per department, team, or workload. It's easier to meet development, testing, acceptance, and production (DTAP) requirements for teams when multiple workspaces are provisioned to address each release environment. |
| Usability - Administrators | You don't need to provision multiple capacities. Tenant administrators typically administer capacity, so you don't need to manage other groups or teams. |
| Usability - Other roles | Workspaces are available for each medallion layer. Fabric items are isolated per workspace, which helps to prevent accidental corruption. |
| Performance | Strict performance SLAs don't need to be met. Throttling is acceptable during peak periods. |
| Billing and cost management | You don't have a specific requirement to charge back per team. A central team bears all costs. Infrastructure teams are owners of Fabric capacities in the organization. |

Pattern 3: Multiple workspaces backed by separate capacities

In this deployment pattern, you achieve separation between business units for governance and performance.

:::image type="content" source="../_images/fabric-deployment-pattern-3-multiple-workspaces-multiple-capacites.svg" alt-text="Diagram that shows a single Fabric tenant that contains two capacities. The first capacity has two workspaces. The second capacity has one workspace." border="false":::

When you provision multiple Fabric capacities with their own workspaces, the following points are true:

- The largest possible F SKU or P SKU attached to a workspace determines the maximum CUs that a workspace can use.
- Organizational and management decentralization is achieved by provisioning separate workspaces.
- Organizations can scale beyond one region by provisioning capacities and workspaces in different geographical regions.
- You can use the full capabilities of Fabric because business units can have one or more workspaces that are in separate capacities and grouped together through Fabric domains.
- Limitations that are associated with a single workspace apply, but you can scale beyond these limits by creating new workspaces.
- Capacity limitations that are associated with a specific SKU apply, but you can scale CUs by provisioning separate capacities.
- All Fabric items in all workspaces in the tenant, and their certification statuses, can be discovered by using a OneLake data hub.
- Domains can group workspaces together so that a single business unit can operate and manage multiple workspaces.
- OneLake shortcuts reduce data movement and improve data usability across workspaces.

You might choose to implement this deployment pattern for one or more of the following reasons:

- Your organization wants to deploy architectural frameworks like data mesh or data fabric.
- You want to prioritize flexibility in how you structure capacities and workspaces.
- You operate in different geographical regions. In this case, provisioning a separate capacity and workspace is the driving force to move toward this multi-capacity and multi-workspace deployment pattern.
- You operate at large scale and have requirements to scale beyond the limits of a single-capacity SKU or a single workspace.
- You have workloads that must always finish within a specific time or meet a specific performance SLA. You can provision a separate workspace that's backed by a Fabric capacity to meet performance guarantees for those workloads.

The following table presents considerations that might influence your decision to choose this deployment pattern:

| Aspect | Considerations |
| --- | --- |
| Governance | You have a high degree of governance and management, and you need independence for each workspace. You can manage usage per department or business unit. You can conform to data residency requirements. You can isolate data based on regulatory requirements. |
| Security - Data plane | Data access can be controlled per department, team, or users. You can isolate data based on Fabric item type. |
| Security - Control plane | You can provide controlled access on Fabric items by role to avoid accidental corruption or actions by malicious users. |
| Administration | Granular administrator capabilities are restricted to departments, teams, or users. You have access to detailed monitoring requirements on usage or patterns of workloads. |
| DevOps | You can isolate DTAP environments by using different capacities. Independent releases are based on a department, team, or workload. |
| Usability - Administrators | You get granular visibility into usage by department or team. You have delegated capacity rights to capacity administrators per department or team, which helps with scaling and granular configuration. |
| Usability - Other roles | Workspaces are available per medallion layer and capacity. Fabric items are isolated per workspace, which helps prevent accidental corruption. You have more options to prevent throttling that's caused by surges on shared capacity. |
| Performance | Performance requirements are high, and workloads need to meet higher SLAs. You have flexibility in scaling up individual workloads per department or team. |
| Billing and cost management | Cross-charging requirements can be easily met by assigning dedicated capacities to an organizational entity (department, team, or project). Cost management can be delegated to respective teams to manage. |

Pattern 4: Multiple Fabric tenants

When separate Fabric tenants are deployed, all instances of Fabric are separate entities with respect to governance, management, administration, scale, and storage.

The following points are true when you use multiple Fabric tenants:

- Tenant resources are strictly segregated.
- Management planes between tenants are separate.
- Tenants are separate entities and can have their own processes for governance and management, but you administer them separately.
- You can use data pipelines or data engineering capabilities to share or access data between Fabric tenants.

You might choose to implement this deployment pattern for the following reasons:

- The organization might end up with multiple Fabric tenants because of a business acquisition.
- The organization might choose to set up a Fabric tenant specifically for a business unit or smaller subsidiary.

Contributors

This article is maintained by Microsoft. It was originally written by the following contributors:

- Holly Kelly | Principal Program Manager
- Gabi Muenster | Senior Program Manager
- Sarath Sasidharan | Senior Program Manager
- Amanjeet Singh | Principal Program Manager
title: Synchronous I/O antipattern
titleSuffix: Performance antipatterns for cloud apps
description: Blocking the calling thread while I/O completes can reduce performance and affect vertical scalability.
ms.author: robbag
author: RobBagby
categories: azure
ms.date: 06/05/2017
ms.topic: design-pattern
ms.service: architecture-center
ms.subservice: anti-pattern
azureCategories:
  - analytics
  - storage
  - web
products:
  - azure-blob-storage
ms.custom:
  - article

Synchronous I/O antipattern

Blocking the calling thread while I/O completes can reduce performance and affect vertical scalability.

Problem description

A synchronous I/O operation blocks the calling thread while the I/O completes. The calling thread enters a wait state and is unable to perform useful work during this interval, wasting processing resources.

Common examples of I/O include:

- Retrieving or persisting data to a database or any type of persistent storage.
- Sending a request to a web service.
- Posting a message or retrieving a message from a queue.
- Writing to or reading from a local file.

This antipattern typically occurs because:

- It appears to be the most intuitive way to perform an operation.
- The application requires a response from a request.
- The application uses a library that only provides synchronous methods for I/O.
- An external library performs synchronous I/O operations internally. A single synchronous I/O call can block an entire call chain.

The following code uploads a file to Azure blob storage. There are two places where the code blocks waiting for synchronous I/O, the CreateIfNotExists method and the UploadFromStream method.

```csharp
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("uploadedfiles");

container.CreateIfNotExists();
var blockBlob = container.GetBlockBlobReference("myblob");

// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = File.OpenRead(HostingEnvironment.MapPath("~/FileToUpload.txt")))
{
    blockBlob.UploadFromStream(fileStream);
}
```

Here's an example of waiting for a response from an external service. The GetUserProfile method calls a remote service that returns a UserProfile.

```csharp
public interface IUserProfileService
{
    UserProfile GetUserProfile();
}

public class SyncController : ApiController
{
    private readonly IUserProfileService _userProfileService;

    public SyncController()
    {
        _userProfileService = new FakeUserProfileService();
    }

    // This is a synchronous method that calls the synchronous GetUserProfile method.
    public UserProfile GetUserProfile()
    {
        return _userProfileService.GetUserProfile();
    }
}
```

You can find the complete code for both of these examples here.

How to fix the problem

Replace synchronous I/O operations with asynchronous operations. This frees the current thread to continue performing meaningful work rather than blocking, and helps improve the utilization of compute resources. Performing I/O asynchronously is particularly efficient for handling an unexpected surge in requests from client applications.

Many libraries provide both synchronous and asynchronous versions of methods. Whenever possible, use the asynchronous versions. Here is the asynchronous version of the previous example that uploads a file to Azure blob storage.

```csharp
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("uploadedfiles");

await container.CreateIfNotExistsAsync();
var blockBlob = container.GetBlockBlobReference("myblob");

// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = File.OpenRead(HostingEnvironment.MapPath("~/FileToUpload.txt")))
{
    await blockBlob.UploadFromStreamAsync(fileStream);
}
```

The await operator returns control to the calling environment while the asynchronous operation is performed.
The code after this statement acts as a continuation that runs when the asynchronous operation has completed.

A well designed service should also provide asynchronous operations. Here is an asynchronous version of the web service that returns user profiles. The GetUserProfileAsync method depends on having an asynchronous version of the User Profile service.

```csharp
public interface IUserProfileService
{
    Task<UserProfile> GetUserProfileAsync();
}

public class AsyncController : ApiController
{
    private readonly IUserProfileService _userProfileService;

    public AsyncController()
    {
        _userProfileService = new FakeUserProfileService();
    }

    // This is a synchronous method that calls the Task-based GetUserProfileAsync method.
    public Task<UserProfile> GetUserProfileAsync()
    {
        return _userProfileService.GetUserProfileAsync();
    }
}
```

For libraries that don't provide asynchronous versions of operations, it may be possible to create asynchronous wrappers around selected synchronous methods. Follow this approach with caution. While it may improve responsiveness on the thread that invokes the asynchronous wrapper, it actually consumes more resources. An extra thread may be created, and there is overhead associated with synchronizing the work done by this thread. Some tradeoffs are discussed in this blog post: Should I expose asynchronous wrappers for synchronous methods?

Here is an example of an asynchronous wrapper around a synchronous method.

```csharp
// Asynchronous wrapper around synchronous library method
private async Task<int> LibraryIOOperationAsync()
{
    return await Task.Run(() => LibraryIOOperation());
}
```

Now the calling code can await on the wrapper:

```csharp
// Invoke the asynchronous wrapper using a task
await LibraryIOOperationAsync();
```

Considerations

I/O operations that are expected to be very short lived and are unlikely to cause contention might be more performant as synchronous operations. An example might be reading small files on a solid-state drive (SSD).
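The wrapper technique described above is not specific to .NET. As a language-neutral, runnable illustration, here is a minimal Python sketch of the same pattern, where asyncio.to_thread plays the role of Task.Run (library_io_operation is a hypothetical blocking library call, not part of any real SDK):

```python
import asyncio
import time

def library_io_operation() -> int:
    """Hypothetical blocking library call, analogous to LibraryIOOperation."""
    time.sleep(0.1)  # simulate synchronous I/O that blocks the calling thread
    return 42

async def library_io_operation_async() -> int:
    # Offload the blocking call to a worker thread, analogous to Task.Run.
    # The event-loop thread stays free, but a pool thread is still consumed.
    return await asyncio.to_thread(library_io_operation)

result = asyncio.run(library_io_operation_async())
print(result)  # 42
```

As in the .NET case, the wrapper only moves the blocking work to another thread rather than eliminating it, so the same resource caveats apply.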
The overhead of dispatching a task to another thread, and synchronizing with that thread when the task completes, might outweigh the benefits of asynchronous I/O. However, these cases are relatively rare, and most I/O operations should be done asynchronously.

Improving I/O performance may cause other parts of the system to become bottlenecks. For example, unblocking threads might result in a higher volume of concurrent requests to shared resources, leading in turn to resource starvation or throttling. If that becomes a problem, you might need to scale out the number of web servers or partition data stores to reduce contention.

How to detect the problem

For users, the application may seem unresponsive periodically. The application might fail with timeout exceptions. These failures could also return HTTP 500 (Internal Server Error) errors. On the server, incoming client requests might be blocked until a thread becomes available, resulting in excessive request queue lengths, manifested as HTTP 503 (Service Unavailable) errors.

You can perform the following steps to help identify the problem:

- Monitor the production system and determine whether blocked worker threads are constraining throughput.
- If requests are being blocked due to lack of threads, review the application to determine which operations may be performing I/O synchronously.
- Perform controlled load testing of each operation that is performing synchronous I/O, to find out whether those operations are affecting system performance.

Example diagnosis

The following sections apply these steps to the sample application described earlier.

Monitor web server performance

For Azure web applications and web roles, it's worth monitoring the performance of the IIS web server. In particular, pay attention to the request queue length to establish whether requests are being blocked waiting for available threads during periods of high activity. You can gather this information by enabling Azure diagnostics.
For more information, see:

- Monitor Apps in Azure App Service
- Create and use performance counters in an Azure application

Instrument the application to see how requests are handled once they have been accepted. Tracing the flow of a request can help to identify whether it is performing slow-running calls and blocking the current thread. Thread profiling can also highlight requests that are being blocked.

Load test the application

The following graph shows the performance of the synchronous GetUserProfile method shown earlier, under varying loads of up to 4000 concurrent users. The application is an ASP.NET application running in an Azure Cloud Service web role. The synchronous operation is hard-coded to sleep for 2 seconds, to simulate synchronous I/O, so the minimum response time is slightly over 2 seconds. When the load reaches approximately 2500 concurrent users, the average response time reaches a plateau, although the volume of requests per second continues to increase. Note that the scale for these two measures is logarithmic. The number of requests per second doubles between this point and the end of the test.

In isolation, it's not necessarily clear from this test whether the synchronous I/O is a problem. Under heavier load, the application may reach a tipping point where the web server can no longer process requests in a timely manner, causing client applications to receive time-out exceptions.

Incoming requests are queued by the IIS web server and handed to a thread running in the ASP.NET thread pool. Because each operation performs I/O synchronously, the thread is blocked until the operation completes. As the workload increases, eventually all of the ASP.NET threads in the thread pool are allocated and blocked. At that point, any further incoming requests must wait in the queue for an available thread. As the queue length grows, requests start to time out.
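The thread-pool saturation just described can be simulated without a web server. The following Python sketch (the request count, pool size, and 0.1-second simulated I/O time are arbitrary illustration values) shows blocked workers serializing the waits, while asynchronous waits overlap:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

REQUESTS = 20
IO_SECONDS = 0.1   # simulated per-request I/O time
POOL_SIZE = 4      # deliberately small worker pool, like a saturated thread pool

def blocking_handler(_):
    time.sleep(IO_SECONDS)  # the worker thread is blocked for the whole wait

async def async_handler():
    await asyncio.sleep(IO_SECONDS)  # the thread is free while the wait is pending

# Synchronous: 20 requests queue behind 4 blocked workers (~20/4 * 0.1 s total).
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
    list(pool.map(blocking_handler, range(REQUESTS)))
sync_elapsed = time.perf_counter() - start

# Asynchronous: all 20 waits overlap on a single thread (~0.1 s total).
async def main():
    await asyncio.gather(*(async_handler() for _ in range(REQUESTS)))

start = time.perf_counter()
asyncio.run(main())
async_elapsed = time.perf_counter() - start

print(f"sync: {sync_elapsed:.2f}s  async: {async_elapsed:.2f}s")
```

On a typical machine the synchronous run takes several times longer, mirroring the queuing behavior observed in the load test.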
Implement the solution and verify the result The next graph shows the results from load testing the asynchronous version of the code. Throughput is far higher. Over the same duration as the previous test, the system successfully handles a nearly tenfold increase in throughput, as measured in requests per second. Moreover, the average response time is relatively constant and remains approximately 25 times smaller than the previous test.
Synchronous I/O antipattern
architecture-center/docs/antipatterns/synchronous-io/index.md
This article describes how Azure Kubernetes Service (AKS) monitoring compares to Amazon Elastic Kubernetes Service (Amazon EKS). The article guides you on different options to monitor and manage the logs of an AKS cluster and its workloads.

[!INCLUDE eks-aks]

Amazon EKS monitoring and logging

Like any Kubernetes service, EKS has two major components, the control plane and worker nodes. There are specific capabilities for each layer.

Amazon EKS control plane and cluster monitoring

Amazon EKS integrates with Amazon CloudWatch Logs to provide logging and monitoring for the Amazon EKS control plane. This integration isn't enabled by default, but when configured, it gathers logs on:

- API server and API calls.
- Audit logs and user interactions.
- Authenticator logs.
- Scheduler and controller logs.

Amazon EKS exposes control plane metrics at the /metrics endpoint, in Prometheus text format. CloudWatch Container Insights can collect and store Prometheus metrics. You can deploy and self-manage Prometheus on top of your EKS cluster, or use Amazon Managed Service for Prometheus.

Amazon EKS also integrates with Amazon Web Services (AWS) CloudTrail to track actions and API calls. For more information, see Logging Amazon EKS API calls with AWS CloudTrail.

Amazon EKS workload monitoring

CloudWatch Container Insights can collect and aggregate metrics and logs from containerized applications deployed in EKS. You can implement Container Insights on Amazon EKS with a containerized version of the CloudWatch agent, or by using the AWS Distro for OpenTelemetry as a DaemonSet. You can send logs with Fluent Bit.

AKS monitoring and logging

Like other Azure resources, AKS generates platform metrics and resource logs that you can use to monitor its basic health and performance. Download a Visio file of this architecture.

Azure Monitor

AKS natively integrates with Azure Monitor. Azure Monitor stores metrics and logs in a central location called a Log Analytics workspace.
This data is processed and analyzed to provide insights and alerts. For more information, see Monitor Azure Kubernetes Service (AKS) with Azure Monitor. Container Insights is the feature of Azure Monitor that collects, indexes, and stores the data your AKS cluster generates. You can configure Container Insights to monitor managed Kubernetes clusters hosted on AKS and other cluster configurations. Container Insights can monitor AKS health and performance with visualization tailored to Kubernetes environments. Similar to EKS, enabling Container Insights for your AKS cluster deploys a containerized version of the Log Analytics agent, which is responsible for sending data to your Log Analytics workspace. Microsoft Sentinel Microsoft Sentinel delivers intelligent security analytics and threat intelligence across the enterprise. With Microsoft Sentinel, you get a single solution for attack detection, threat visibility, proactive hunting, and threat response. Microsoft Sentinel must be connected with your AKS. This connector lets you stream your Azure Kubernetes Service (AKS) diagnostics logs into Microsoft Sentinel, allowing you to continuously monitor activity in all your instances. Once you have connected your data sources to Microsoft Sentinel, you can visualize and monitor the data using the Microsoft Sentinel and Azure Monitor Workbooks, which provides versatility in creating custom dashboards. AKS cluster and workload monitoring An AKS deployment can divide into cluster level components, managed AKS components, Kubernetes objects and workloads, applications, and external resources. The following table shows a common strategy for monitoring an AKS cluster and workload applications. Each level has distinct monitoring requirements. 
| Level | Description | Monitoring requirements |
| --- | --- | --- |
| Cluster level components | Virtual machine scale sets abstracted as AKS nodes and node pools | Node status and resource utilization including CPU, memory, disk, and network |
| Managed AKS components | AKS control plane components including API servers, cloud controller, and kubelet | Control plane logs and metrics from the kube-system namespace |
| Kubernetes objects and workloads | Kubernetes objects such as deployments, containers, and replica sets | Resource utilization and failures |
| Applications | Application workloads running on the AKS cluster | Monitoring specific to architecture, but including application logs and service transactions |
| External | External resources that aren't part of AKS but are required for cluster scalability and management | Specific to each component |

- Cluster level components: You can use existing Container Insights views and reports to monitor cluster level components to understand their health, readiness, performance, CPU and memory resource utilization, and trends.
- Managed AKS components: You can use Metrics Explorer to view the Inflight Requests counter. This view includes request latency and work queue processing time.
- Kubernetes objects and workloads: You can use existing Container Insights views and reports to monitor deployment, controllers, pods, and containers. Use the Nodes and Controllers views to view the health and performance of the pods that are running on nodes and controllers, and their resource consumption in terms of CPU and memory. From the Container Insights Containers view, you can view the health and performance of containers, or select an individual container and monitor its events and logs in real time. For details about using this view and analyzing container health and performance, see Monitor your Kubernetes cluster performance with Container Insights.
- Applications: You can use Application Insights to monitor applications that are running on AKS and other environments.
Application Insights is an application performance management tool that provides support for many programming languages. Depending on your needs, you can instrument your application code to capture requests, traces, logs, exceptions, custom metrics, and end-to-end transactions, and send this data to Application Insights. If you have a Java application, you can provide monitoring without instrumenting your code. For more information, see Zero instrumentation application monitoring for Kubernetes. External components: You can monitor external components like service mesh, ingress, and egress with Prometheus and Grafana or other tools. You can use Azure Monitor features to monitor any platform as a service (PaaS) that your workload applications use, such as databases and other Azure resources. Third-party monitoring solutions You can set up third-party monitoring solutions like Grafana or Prometheus in your AKS node pools. For Grafana, Grafana Labs provides a dashboard with views of critical API server metrics. You can use this dashboard on your existing Grafana server or set up a new Grafana server in Azure. For more information, see Monitor your Azure services in Grafana. Prometheus is a popular open-source metrics monitoring solution from the Cloud Native Compute Foundation. You can integrate Prometheus with Azure Monitor so you don't need to set up and manage a Prometheus server with a store. Container Insights provides a seamless onboarding experience to collect Prometheus metrics. You can expose the Prometheus metrics endpoint through your exporters or pod applications, and the containerized agent for Container Insights can scrape the metrics. Container Insights complements and completes end-to-end AKS monitoring, including log collection, which Prometheus as a stand-alone tool doesn't provide. For more information, see Configure scraping of Prometheus metrics with Container insights. 
AKS monitoring costs The Azure Monitor pricing model is primarily based on the amount of data that's ingested per day into your Log Analytics workspace. The cost varies by the plan and retention periods you select. Before enabling Container Insights, estimate costs and understand how to control data ingestion and its costs. For detailed guidance, see Estimating costs to monitor your AKS cluster. Contributors This article is maintained by Microsoft. It was originally written by the following contributors. Principal authors: Ketan Chawda Senior Customer Engineer Paolo Salvatori Principal Service Engineer Laura Nicolas Senior Software Engineer Other contributors: Chad Kittel Principal Software Engineer Ed Price Senior Content Program Manager Theano Petersen Technical Writer To see non-public LinkedIn profiles, sign in to LinkedIn. Next steps AKS for Amazon EKS professionals Kubernetes identity and access management Secure network access to Kubernetes Storage options for a Kubernetes cluster Cost management for Kubernetes Kubernetes node and node pool management Cluster governance Related resources Use Azure Monitor Private Link Scope Monitor Azure Kubernetes Service (AKS) with Azure Monitor Monitoring AKS data reference Container Insights overview Enable Container Insights AKS resource logs Configure scraping of Prometheus metrics with Container Insights How to query logs from Container Insights Azure Monitor data source for Grafana Monitor and back up Azure resources Instrument solutions to support monitoring and logging Design a solution to log and monitor Azure resources Monitor the usage, performance, and availability of resources with Azure Monitor
Amazon EKS monitoring and logging
architecture-center/docs/aws-professional/eks-to-aks/monitoring-content.md
---
title: Security and identity with Azure and AWS
description: Get guidance for integrating security and identity services across Azure and AWS. Explore strong authentication and explicit trust validation, PIM, and more.
author: dougkl007
ms.author: dougkl
ms.date: 01/02/2022
ms.topic: conceptual
ms.service: architecture-center
ms.subservice: cloud-fundamentals
categories:
  - security
  - identity
products:
  - entra-id
---

Multicloud security and identity with Azure and Amazon Web Services (AWS)

Many organizations are finding themselves with a de facto multicloud strategy, even if that wasn't their deliberate strategic intention. In a multicloud environment, it's critical to ensure consistent security and identity experiences, both to avoid increased friction for developers and business initiatives and to avoid increased organizational risk from cyberattacks that take advantage of security gaps. Driving security and identity consistency across clouds should include:

- Multicloud identity integration
- Strong authentication and explicit trust validation
- Cloud Platform Security (multicloud)
- Microsoft Defender for Cloud
- Privileged Identity Management (Azure)
- Consistent end-to-end identity management

Multicloud identity integration

Customers using both Azure and AWS cloud platforms benefit from consolidating identity services between these two clouds using Microsoft Entra ID and Single Sign-on (SSO) services. This model allows for a consolidated identity plane through which services in both clouds can be consistently accessed and governed. This approach allows for the rich role-based access controls in Microsoft Entra ID to be enabled across the identity and access management (IAM) services in AWS, using rules to associate the user.userprincipalname and user.assignrole attributes from Microsoft Entra ID into IAM permissions.
This approach reduces the number of unique identities users and administrators are required to maintain across both clouds, including a consolidation of the identity-per-account design that AWS employs. The AWS IAM solution allows for and specifically identifies Microsoft Entra ID as a federation and authentication source for their customers. A complete walk-through of this integration can be found in the Tutorial: Microsoft Entra single sign-on (SSO) integration with Amazon Web Services (AWS).

Strong authentication and explicit trust validation

Because many customers continue to support a hybrid identity model for Active Directory services, it's increasingly important for security engineering teams to implement strong authentication solutions and block legacy authentication methods associated primarily with on-premises and legacy Microsoft technologies. A combination of multifactor authentication and conditional access policies enables enhanced security for common authentication scenarios for end users in your organization. While multifactor authentication itself provides an increased level of security to confirm authentications, additional conditional access controls can be applied to block legacy authentication to both Azure and AWS cloud environments. Strong authentication using only modern authentication clients is only possible with the combination of multifactor authentication and conditional access policies.

Cloud Platform Security (multicloud)

Once a common identity has been established in your multicloud environment, the Cloud Platform Security (CPS) service of Microsoft Defender for Cloud Apps can be used to discover, monitor, assess, and protect those services. Using the Cloud Discovery dashboard, security operations personnel can review the apps and resources being used across AWS and Azure cloud platforms.
Once services are reviewed and sanctioned for use, the services can then be managed as enterprise applications in Microsoft Entra ID to enable Security Assertion Markup Language (SAML), password-based, and linked Single Sign-On mode for the convenience of users. CPS also provides for the ability to assess the cloud platforms connected for misconfigurations and compliance using vendor specific recommended security and configuration controls. This design enables organizations to maintain a single consolidated view of all cloud platform services and their compliance status. CPS also provides access and session control policies to prevent and protect your environment from risky endpoints or users when data exfiltration or malicious files are introduced into those platforms. Microsoft Defender for Cloud Microsoft Defender for Cloud provides unified security management and threat protection across your hybrid and multicloud workloads, including workloads in Azure, Amazon Web Services (AWS), and Google Cloud Platform (GCP). Defender for Cloud helps you find and fix security vulnerabilities, apply access and application controls to block malicious activity, detect threats using analytics and intelligence, and respond quickly when under attack. To protect your AWS-based resources on Microsoft Defender for Cloud, you can connect an account with either the Classic cloud connectors experience or the Environment settings page (in preview), which is recommended. Privileged Identity Management (Azure) To limit and control access for your highest privileged accounts in Microsoft Entra ID, Privileged Identity Management (PIM) can be enabled to provide just-in-time access to Azure services. Once deployed, PIM can be used to control and limit access using the assignment model for roles, eliminate persistent access for these privileged accounts, and provide additional discover and monitoring of users with these account types. 
When combined with Microsoft Sentinel, workbooks and playbooks can be established to monitor and raise alerts to your security operations center personnel when there is lateral movement of accounts that have been compromised. Consistent end-to-end identity management Ensure that all processes include an end-to-end view of all clouds as well as on-premises systems and that security and identity personnel are trained on these processes. Using a single identity across Microsoft Entra ID, AWS Accounts and on-premises services enable this end-to-end strategy and allows for greater security and protection of accounts for privileged and non-privileged accounts. Customers who are currently looking to reduce the burden of maintaining multiple identities in their multicloud strategy adopt Microsoft Entra ID to provide consistent and strong control, auditing, and detection of anomalies and abuse of identities in their environment. Continued growth of new capabilities across the Microsoft Entra ecosystem helps you stay ahead of threats to your environment as a result of using identities as a common control plane in your multicloud environments. Next steps Microsoft Entra B2B: enables access to your corporate applications from partner-managed identities. Azure Active Directory B2C: service offering support for single sign-on and user management for consumer-facing applications. Microsoft Entra Domain Services: hosted domain controller service, allowing Active Directory compatible domain join and user management functionality. Getting started with Microsoft Azure security Azure Identity Management and access control security best practices
Multicloud security and identity with Azure and Amazon Web Services (AWS)
architecture-center/docs/aws-professional/security-identity.md
---
title: Azure service retry guidance
titleSuffix: Best practices for cloud applications
description: Learn about the retry mechanism features for many Azure services. Retry mechanisms differ because services have different characteristics and requirements.
ms.author: robbag
author: RobBagby
ms.date: 09/16/2020
ms.topic: conceptual
ms.service: architecture-center
ms.subservice: best-practice
categories:
  - azure
products:
  - entra-id
ms.custom:
  - best-practice
---

Retry guidance for Azure services

Most Azure services and client SDKs include a retry mechanism. However, these differ because each service has different characteristics and requirements, and so each retry mechanism is tuned to a specific service. This guide summarizes the retry mechanism features for most Azure services, and includes information to help you use, adapt, or extend the retry mechanism for that service.

For general guidance on handling transient faults, and retrying connections and operations against services and resources, see Retry guidance.

The following table summarizes the retry features for the Azure services described in this guidance.
| Service | Retry capabilities | Policy configuration | Scope | Telemetry features |
| --- | --- | --- | --- | --- |
| Microsoft Entra ID | Native in MSAL library | Embedded into MSAL library | Internal | None |
| Azure Cosmos DB | Native in service | Non-configurable | Global | TraceSource |
| Data Lake Store | Native in client | Non-configurable | Individual operations | None |
| Event Hubs | Native in client | Programmatic | Client | None |
| IoT Hub | Native in client SDK | Programmatic | Client | None |
| Azure Cache for Redis | Native in client | Programmatic | Client | TextWriter |
| Search | Native in client | Programmatic | Client | Event Tracing for Windows (ETW) or Custom |
| Service Bus | Native in client | Programmatic | Namespace Manager, Messaging Factory, and Client | ETW |
| Service Fabric | Native in client | Programmatic | Client | None |
| SQL Database with ADO.NET | Polly | Declarative and programmatic | Single statements or blocks of code | Custom |
| SQL Database with Entity Framework | Native in client | Programmatic | Global per AppDomain | None |
| SQL Database with Entity Framework Core | Native in client | Programmatic | Global per AppDomain | None |
| Storage | Native in client | Programmatic | Client and individual operations | TraceSource |

[!NOTE] For most of the Azure built-in retry mechanisms, there is currently no way to apply a different retry policy for different types of error or exception. You should configure a policy that provides the optimum average performance and availability. One way to fine-tune the policy is to analyze log files to determine the type of transient faults that are occurring.

Microsoft Entra ID

Microsoft Entra ID is a comprehensive identity and access management cloud solution that combines core directory services, advanced identity governance, security, and application access management. Microsoft Entra ID also offers developers an identity management platform to deliver access control to their applications, based on centralized policy and rules.
[!NOTE] For retry guidance on Managed Service Identity endpoints, see How to use an Azure VM Managed Service Identity (MSI) for token acquisition.

Retry mechanism

There's a built-in retry mechanism for Microsoft Entra ID in the Microsoft Authentication Library (MSAL). To avoid unexpected lockouts, we recommend that third-party libraries and application code do not retry failed connections, but allow MSAL to handle retries.

Retry usage guidance

Consider the following guidelines when using Microsoft Entra ID:

- When possible, use the MSAL library and the built-in support for retries.
- If you're using the REST API for Microsoft Entra ID, retry the operation if the result code is 429 (Too Many Requests) or an error in the 5xx range. Don't retry for any other errors.
- For 429 errors, only retry after the time indicated in the Retry-After header.
- For 5xx errors, use exponential back-off, with the first retry at least 5 seconds after the response.
- Don't retry on errors other than 429 and 5xx.

Next steps

- Microsoft Authentication Library (MSAL)

Azure Cosmos DB

Azure Cosmos DB is a fully managed multi-model database that supports schemaless JSON data. It offers configurable and reliable performance, native JavaScript transactional processing, and is built for the cloud with elastic scale.

Retry mechanism

The Azure Cosmos DB SDKs automatically retry on certain error conditions, and user applications are encouraged to have their own retry policies. See the guide to designing resilient applications with Azure Cosmos DB SDKs for a complete list of error conditions and when to retry.

Telemetry

Depending on the language of your application, diagnostics and telemetry are exposed as logs or promoted properties on the operation responses. For more information, see the "Capture the diagnostics" section in Azure Cosmos DB C# SDK and Azure Cosmos DB Java SDK.

Data Lake Store

Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure.
Data Lake Storage Gen2 allows you to easily manage massive amounts of data. The Azure Storage Files Data Lake client library includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages.

Retry mechanism

The DataLakeServiceClient allows you to manipulate Azure Data Lake service resources and file systems. The storage account provides the top-level namespace for the Data Lake service. When you create the client, you can provide the client configuration options for connecting to the Azure Data Lake service (DataLakeClientOptions). The DataLakeClientOptions includes a Retry property (inherited from Azure.Core.ClientOptions) that can be configured (RetryOptions class).

Telemetry

Monitoring the use and performance of Azure Storage is an important part of operationalizing your service. Examples include frequent operations, operations with high latency, or operations that cause service-side throttling. All of the telemetry for your storage account is available through Azure Storage logs in Azure Monitor. This feature integrates your storage account with Log Analytics and Event Hubs, while also enabling you to archive logs to another storage account. To see the full list of metrics and resource logs and their associated schema, see Azure Storage monitoring data reference.

Event Hubs

Azure Event Hubs is a hyperscale telemetry ingestion service that collects, transforms, and stores millions of events.

Retry mechanism

Retry behavior in the Azure Event Hubs Client Library is controlled by the RetryPolicy property on the EventHubClient class. The default policy retries with exponential backoff when Azure Event Hubs returns a transient EventHubsException or an OperationCanceledException. The default retry policy for Event Hubs is to retry up to 9 times with an exponential back-off time of up to 30 seconds.
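Exponential back-off of this general shape can be illustrated generically. The following Python sketch is not the Event Hubs client's actual algorithm; its defaults merely echo the "up to 9 retries, capped at 30 seconds" behavior described above, with optional random jitter of the kind some Azure SDKs add to de-correlate retrying clients:

```python
import random

def backoff_delays(retries=9, base=1.0, cap=30.0, jitter=0.0, rng=None):
    """Exponential back-off schedule, capped at `cap` seconds, plus optional jitter."""
    rng = rng or random.Random()
    delays = []
    for attempt in range(retries):
        delay = min(cap, base * (2 ** attempt))        # 1, 2, 4, 8, ... capped at `cap`
        delays.append(delay + rng.uniform(0, jitter))  # jitter spreads retries apart
    return delays

print(backoff_delays())  # [1.0, 2.0, 4.0, 8.0, 16.0, 30.0, 30.0, 30.0, 30.0]
```

A real policy would also honor any Retry-After header the service returns, as the Microsoft Entra ID guidance above recommends.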
### Example

```csharp
EventHubClient client = EventHubClient.CreateFromConnectionString("[event_hub_connection_string]");
client.RetryPolicy = RetryPolicy.Default;
```

### Next steps

- Azure Event Hubs client library for .NET

## IoT Hub

Azure IoT Hub is a service for connecting, monitoring, and managing devices to develop Internet of Things (IoT) applications.

### Retry mechanism

The Azure IoT device SDK can detect errors in the network, protocol, or application. Based on the error type, the SDK checks whether a retry needs to be performed. If the error is recoverable, the SDK begins to retry using the configured retry policy. The default retry policy is exponential back-off with random jitter, but it can be configured.

### Policy configuration

Policy configuration differs by language. For more information, see IoT Hub retry policy configuration.

### Next steps

- IoT Hub retry policy
- Troubleshoot IoT Hub device disconnection

## Azure Cache for Redis

Azure Cache for Redis is a fast data access and low latency cache service based on the popular open-source Redis cache. It's secure, managed by Microsoft, and is accessible from any application in Azure.

The guidance in this section is based on using the StackExchange.Redis client to access the cache. A list of other suitable clients can be found on the Redis website, and these may have different retry mechanisms.

The StackExchange.Redis client uses multiplexing through a single connection. The recommended usage is to create an instance of the client at application startup and use this instance for all operations against the cache. For this reason, the connection to the cache is made only once, and so all of the guidance in this section is related to the retry policy for this initial connection, and not for each operation that accesses the cache.

### Retry mechanism

The StackExchange.Redis client uses a connection manager class that is configured through a set of options, including:

- ConnectRetry.
The number of times a failed connection to the cache will be retried.
- ReconnectRetryPolicy. The retry strategy to use.
- ConnectTimeout. The maximum waiting time in milliseconds.

### Policy configuration

Retry policies are configured programmatically by setting the options for the client before connecting to the cache. This can be done by creating an instance of the ConfigurationOptions class, populating its properties, and passing it to the Connect method.

The built-in classes support linear (constant) delay and exponential backoff with randomized retry intervals. You can also create a custom retry policy by implementing the IReconnectRetryPolicy interface.

The following example configures a retry strategy using exponential backoff.

```csharp
var deltaBackOffInMilliseconds = TimeSpan.FromSeconds(5).TotalMilliseconds;
var maxDeltaBackOffInMilliseconds = TimeSpan.FromSeconds(20).TotalMilliseconds;
var options = new ConfigurationOptions
{
    EndPoints = {"localhost"},
    ConnectRetry = 3,
    ReconnectRetryPolicy = new ExponentialRetry(deltaBackOffInMilliseconds, maxDeltaBackOffInMilliseconds),
    ConnectTimeout = 2000
};
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(options, writer);
```

Alternatively, you can specify the options as a string, and pass this to the Connect method. The ReconnectRetryPolicy property can't be set this way, only through code.

```csharp
var options = "localhost,connectRetry=3,connectTimeout=2000";
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(options, writer);
```

You can also specify options directly when you connect to the cache.

```csharp
var conn = ConnectionMultiplexer.Connect("redis0:6380,redis1:6380,connectRetry=3");
```

For more information, see Stack Exchange Redis Configuration in the StackExchange.Redis documentation.

The following table shows the default settings for the built-in retry policy.
| Context | Setting | Default value (v 1.2.2) | Meaning |
| --- | --- | --- | --- |
| ConfigurationOptions | ConnectRetry | 3 | The number of times to repeat connect attempts during the initial connection operation. |
| ConfigurationOptions | ConnectTimeout | Maximum 5000 ms plus SyncTimeout | Timeout (ms) for connect operations. Not a delay between retry attempts. |
| ConfigurationOptions | SyncTimeout | 1000 | Time (ms) to allow for synchronous operations. |
| ConfigurationOptions | ReconnectRetryPolicy | LinearRetry 5000 ms | Retry every 5000 ms. |

> [!NOTE]
> For synchronous operations, SyncTimeout can add to the end-to-end latency, but setting the value too low can cause excessive timeouts. See How to troubleshoot Azure Cache for Redis. In general, avoid using synchronous operations, and use asynchronous operations instead. For more information, see Pipelines and Multiplexers.

### Retry usage guidance

Consider the following guidelines when using Azure Cache for Redis:

- The StackExchange Redis client manages its own retries, but only when establishing a connection to the cache when the application first starts. You can configure the connection timeout, the number of retry attempts, and the time between retries to establish this connection, but the retry policy doesn't apply to operations against the cache.
- Instead of using a large number of retry attempts, consider falling back by accessing the original data source instead.

### Telemetry

You can collect information about connections (but not other operations) using a TextWriter.

```csharp
var writer = new StringWriter();
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(options, writer);
```

An example of the output this generates is shown below.

```text
localhost:6379,connectTimeout=2000,connectRetry=3
1 unique nodes specified
Requesting tie-break from localhost:6379 > __Booksleeve_TieBreak...
Allowing endpoints 00:00:02 to respond...
localhost:6379 faulted: SocketFailure on PING
localhost:6379 failed to nominate (Faulted)
> UnableToResolvePhysicalConnection on GET
No masters detected
localhost:6379: Standalone v2.0.0, master; keep-alive: 00:01:00; int: Connecting; sub: Connecting; not in use: DidNotRespond
localhost:6379: int ops=0, qu=0, qs=0, qc=1, wr=0, sync=1, socks=2; sub ops=0, qu=0, qs=0, qc=0, wr=0, socks=2
Circular op-count snapshot; int: 0 (0.00 ops/s; spans 10s); sub: 0 (0.00 ops/s; spans 10s)
Sync timeouts: 0; fire and forget: 0; last heartbeat: -1s ago
resetting failing connections to retry...
retrying; attempts left: 2...
...
```

### Examples

The following code example configures a constant (linear) delay between retries when initializing the StackExchange.Redis client. This example shows how to set the configuration using a ConfigurationOptions instance.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using StackExchange.Redis;

namespace RetryCodeSamples
{
    class CacheRedisCodeSamples
    {
        public async static Task Samples()
        {
            var writer = new StringWriter();
            {
                try
                {
                    var retryTimeInMilliseconds = TimeSpan.FromSeconds(4).TotalMilliseconds; // delay between retries

                    // Using object-based configuration.
                    var options = new ConfigurationOptions
                    {
                        EndPoints = { "localhost" },
                        ConnectRetry = 3,
                        ReconnectRetryPolicy = new LinearRetry(retryTimeInMilliseconds)
                    };
                    ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(options, writer);

                    // Store a reference to the multiplexer for use in the application.
                }
                catch
                {
                    Console.WriteLine(writer.ToString());
                    throw;
                }
            }
        }
    }
}
```

The next example sets the configuration by specifying the options as a string. The connection timeout is the maximum period of time to wait for a connection to the cache, not the delay between retry attempts. The ReconnectRetryPolicy property can only be set by code.
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using StackExchange.Redis;

namespace RetryCodeSamples
{
    class CacheRedisCodeSamples
    {
        public async static Task Samples()
        {
            var writer = new StringWriter();
            {
                try
                {
                    // Using string-based configuration.
                    var options = "localhost,connectRetry=3,connectTimeout=2000";
                    ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(options, writer);

                    // Store a reference to the multiplexer for use in the application.
                }
                catch
                {
                    Console.WriteLine(writer.ToString());
                    throw;
                }
            }
        }
    }
}
```

For more examples, see Configuration on the project website.

### Next steps

- Redis website

## Azure Search

Azure Search can be used to add powerful and sophisticated search capabilities to a website or application, quickly and easily tune search results, and construct rich and fine-tuned ranking models.

### Retry mechanism

The Azure SDK for .NET includes an Azure.Search.Documents client library from the Azure SDK team that is functionally equivalent to the previous client library, Microsoft.Azure.Search.

Retry behavior in Microsoft.Azure.Search is controlled by the SetRetryPolicy method on the SearchServiceClient and SearchIndexClient classes. The default policy retries with exponential backoff when Azure Search returns a 5xx or 408 (Request Timeout) response.

Retry behavior in Azure.Search.Documents is controlled by SearchClientOptions (part of the SearchClient constructor) in the Retry property, which belongs to the class Azure.Core.RetryOptions (where all configurations are available).

### Telemetry

Trace with ETW or by registering a custom trace provider. For more information, see the AutoRest documentation.

## Service Bus

Service Bus is a cloud messaging platform that provides loosely coupled message exchange with improved scale and resiliency for components of an application, whether hosted in the cloud or on-premises.
### Retry mechanism

The namespace and some of the configuration details depend on which Service Bus client SDK package is used:

| Package | Description | Namespace |
| --- | --- | --- |
| Azure.Messaging.ServiceBus | Azure Service Bus client library for .NET | Azure.Messaging.ServiceBus |
| WindowsAzure.ServiceBus | This package is the older Service Bus client library. It requires .NET Framework 4.5.2. | Microsoft.Azure.ServiceBus |

### Retry usage guidance

The ServiceBusRetryOptions property specifies the retry options for the ServiceBusClient object:

| Setting | Default value | Meaning |
| --- | --- | --- |
| CustomRetryPolicy | | A custom retry policy to be used in place of the individual option values. |
| Delay | 0.8 seconds | The delay between retry attempts for a fixed approach or the delay on which to base calculations for a backoff-based approach. |
| MaxDelay | 60 seconds | The maximum permissible delay between retry attempts. |
| MaxRetries | 3 | The maximum number of retry attempts before considering the associated operation to have failed. |
| Mode | Exponential | The approach to use for calculating retry delays. |
| TryTimeout | 60 seconds | The maximum duration to wait for completion of a single attempt, whether the initial attempt or a retry. |

Set the Mode property to configure the ServiceBusRetryMode with any of these values:

| Property | Value | Description |
| --- | --- | --- |
| Exponential | 1 | Retry attempts will delay based on a backoff strategy, where each attempt will increase the duration that it waits before retrying. |
| Fixed | 0 | Retry attempts happen at fixed intervals; each delay is a consistent duration. |

Example:

```csharp
using System;
using Azure.Messaging.ServiceBus;

string connectionString = "";
string queueName = "";

// Because ServiceBusClient implements IAsyncDisposable, we'll create it
// with "await using" so that it is automatically disposed for us.
var options = new ServiceBusClientOptions();
options.RetryOptions = new ServiceBusRetryOptions
{
    Delay = TimeSpan.FromSeconds(10),
    MaxDelay = TimeSpan.FromSeconds(30),
    Mode = ServiceBusRetryMode.Exponential,
    MaxRetries = 3,
};
await using var client = new ServiceBusClient(connectionString, options);
```

### Telemetry

Service Bus collects the same kinds of monitoring data as other Azure resources. You can Monitor Azure Service Bus using Azure Monitor. You also have various options for sending telemetry with the Service Bus .NET client libraries.

- Tracking with Azure Application Insights
- Tracking with OpenTelemetry

### Example

The following code example shows how to use the Azure.Messaging.ServiceBus package to:

- Set the retry policy for a ServiceBusClient using a new ServiceBusClientOptions.
- Create a new message with a new instance of a ServiceBusMessage.
- Send a message to the Service Bus using the ServiceBusSender.SendMessageAsync(message) method.
- Receive messages using the ServiceBusReceiver, which are represented as ServiceBusReceivedMessage objects.

```csharp
using System;
using Azure.Messaging.ServiceBus;

string connectionString = "";
string queueName = "queue1";

// Because ServiceBusClient implements IAsyncDisposable, we'll create it
// with "await using" so that it is automatically disposed for us.
var options = new ServiceBusClientOptions();
options.RetryOptions = new ServiceBusRetryOptions
{
    Delay = TimeSpan.FromSeconds(10),
    MaxDelay = TimeSpan.FromSeconds(30),
    Mode = ServiceBusRetryMode.Exponential,
    MaxRetries = 3,
};
await using var client = new ServiceBusClient(connectionString, options);

// The sender is responsible for publishing messages to the queue.
ServiceBusSender sender = client.CreateSender(queueName);
ServiceBusMessage message = new ServiceBusMessage("Hello world!");
await sender.SendMessageAsync(message);

// The receiver is responsible for reading messages from the queue.
ServiceBusReceiver receiver = client.CreateReceiver(queueName);
ServiceBusReceivedMessage receivedMessage = await receiver.ReceiveMessageAsync();
string body = receivedMessage.Body.ToString();
Console.WriteLine(body);
```

### Next steps

- Asynchronous Messaging Patterns and High Availability

## Service Fabric

Distributing reliable services in a Service Fabric cluster guards against most of the potential transient faults discussed in this article. Some transient faults are still possible, however. For example, the naming service might be in the middle of a routing change when it gets a request, causing it to throw an exception. If the same request comes 100 milliseconds later, it will probably succeed.

Internally, Service Fabric manages this kind of transient fault. You can configure some settings by using the OperationRetrySettings class while setting up your services. The following code shows an example. In most cases, this shouldn't be necessary, and the default settings will be fine.

```csharp
FabricTransportRemotingSettings transportSettings = new FabricTransportRemotingSettings
{
    OperationTimeout = TimeSpan.FromSeconds(30)
};

var retrySettings = new OperationRetrySettings(TimeSpan.FromSeconds(15), TimeSpan.FromSeconds(1), 5);

var clientFactory = new FabricTransportServiceRemotingClientFactory(transportSettings);

var serviceProxyFactory = new ServiceProxyFactory((c) => clientFactory, retrySettings);

var client = serviceProxyFactory.CreateServiceProxy<ISomeService>(
    new Uri("fabric:/SomeApp/SomeStatefulReliableService"),
    new ServicePartitionKey(0));
```

### Next steps

- Remoting exception handling

## SQL Database using ADO.NET

SQL Database is a hosted SQL database available in a range of sizes and as both a standard (shared) and premium (non-shared) service.

### Retry mechanism

SQL Database has no built-in support for retries when accessed using ADO.NET. However, the return codes from requests can be used to determine why a request failed.
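For example, application code can inspect SqlException.Number to decide whether a failure is transient and worth retrying. The following is a minimal sketch, assuming the Microsoft.Data.SqlClient package; the error numbers shown are an illustrative subset, not the full list (see SQL error codes for SQL Database client applications for the complete set):

```csharp
using System.Linq;
using Microsoft.Data.SqlClient;

static class TransientErrorDetector
{
    // Illustrative subset of transient error numbers for Azure SQL Database;
    // consult the SQL error codes reference for the authoritative list.
    private static readonly int[] retryableCodes = { 4060, 10928, 10929, 40197, 40501, 40613 };

    public static bool IsTransient(SqlException ex) =>
        ex.Errors.Cast<SqlError>().Any(e => retryableCodes.Contains(e.Number));
}
```

A predicate like this can be passed to a retry library (such as Polly, described below in the original references) so that only genuinely transient failures trigger a retry.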
For more information about SQL Database throttling, see Azure SQL Database resource limits. For a list of relevant error codes, see SQL error codes for SQL Database client applications.

You can use the Polly library to implement retries for SQL Database. See Transient fault handling with Polly.

### Retry usage guidance

Consider the following guidelines when accessing SQL Database using ADO.NET:

- Choose the appropriate service option (shared or premium). A shared instance may suffer longer than usual connection delays and throttling due to the usage by other tenants of the shared server. If more predictable performance and reliable low latency operations are required, consider choosing the premium option.
- Ensure that you perform retries at the appropriate level or scope to avoid non-idempotent operations causing inconsistency in the data. Ideally, all operations should be idempotent so that they can be repeated without causing inconsistency. Where this isn't the case, the retry should be performed at a level or scope that allows all related changes to be undone if one operation fails; for example, from within a transactional scope. For more information, see Cloud Service Fundamentals Data Access Layer – Transient Fault Handling.
- A fixed interval strategy isn't recommended for use with Azure SQL Database except for interactive scenarios where there are only a few retries at short intervals. Instead, consider using an exponential back-off strategy for most scenarios.
- Choose a suitable value for the connection and command timeouts when defining connections. Too short a timeout may result in premature failures of connections when the database is busy. Too long a timeout may prevent the retry logic working correctly by waiting too long before detecting a failed connection. The value of the timeout is a component of the end-to-end latency; it's effectively added to the retry delay specified in the retry policy for every retry attempt.
- Close the connection after some retries, even when using an exponential back-off retry logic, and retry the operation on a new connection. Retrying the same operation multiple times on the same connection can be a factor that contributes to connection problems. For an example of this technique, see Cloud Service Fundamentals Data Access Layer – Transient Fault Handling.
- When connection pooling is in use (the default) there's a chance that the same connection will be chosen from the pool, even after closing and reopening a connection. If so, a technique to resolve this is to call the ClearPool method of the SqlConnection class to mark the connection as not reusable. However, you should do this only after several connection attempts have failed, and only when encountering the specific class of transient failures such as SQL timeouts (error code -2) related to faulty connections.
- If the data access code uses transactions initiated as TransactionScope instances, the retry logic should reopen the connection and initiate a new transaction scope. For this reason, the retryable code block should encompass the entire scope of the transaction.

Consider starting with the following settings for retrying operations. These settings are general purpose, and you should monitor the operations and fine-tune the values to suit your own scenario.

| Context | Sample target E2E max latency | Retry strategy | Settings | Values | How it works |
| --- | --- | --- | --- | --- | --- |
| Interactive, UI, or foreground | 2 sec | FixedInterval | Retry count<br/>Retry interval<br/>First fast retry | 3<br/>500 ms<br/>true | Attempt 1 - delay 0 sec<br/>Attempt 2 - delay 500 ms<br/>Attempt 3 - delay 500 ms |
| Background or batch | 30 sec | ExponentialBackoff | Retry count<br/>Min back-off<br/>Max back-off<br/>Delta back-off<br/>First fast retry | 5<br/>0 sec<br/>60 sec<br/>2 sec<br/>false | Attempt 1 - delay 0 sec<br/>Attempt 2 - delay ~2 sec<br/>Attempt 3 - delay ~6 sec<br/>Attempt 4 - delay ~14 sec<br/>Attempt 5 - delay ~30 sec |

> [!NOTE]
> The end-to-end latency targets assume the default timeout for connections to the service.
If you specify longer connection timeouts, the end-to-end latency will be extended by this additional time for every retry attempt.

### Examples

This section shows how you can use Polly to access Azure SQL Database using a set of retry policies configured in the Policy class.

The following code shows an extension method on the SqlCommand class that calls ExecuteAsync with exponential backoff.

```csharp
public async static Task<SqlDataReader> ExecuteReaderWithRetryAsync(this SqlCommand command)
{
    GuardConnectionIsNotNull(command);

    var policy = Policy.Handle<SqlException>().WaitAndRetryAsync(
        retryCount: 3, // Retry 3 times
        sleepDurationProvider: attempt => TimeSpan.FromMilliseconds(200 * Math.Pow(2, attempt - 1)), // Exponential backoff based on an initial 200 ms delay.
        onRetry: (exception, attempt) =>
        {
            // Capture some information for logging/telemetry.
            logger.LogWarn($"ExecuteReaderWithRetryAsync: Retry {attempt} due to {exception}.");
        });

    // Retry the following call according to the policy.
    return await policy.ExecuteAsync(async token =>
    {
        // This code is executed within the Policy

        if (conn.State != System.Data.ConnectionState.Open) await conn.OpenAsync(token);
        return await command.ExecuteReaderAsync(System.Data.CommandBehavior.Default, token);
    }, cancellationToken);
}
```

This asynchronous extension method can be used as follows.

```csharp
var sqlCommand = sqlConnection.CreateCommand();
sqlCommand.CommandText = "[some query]";

using (var reader = await sqlCommand.ExecuteReaderWithRetryAsync())
{
    // Do something with the values
}
```

### Next steps

- Cloud Service Fundamentals Data Access Layer – Transient Fault Handling

## SQL Database using Entity Framework 6

SQL Database is a hosted SQL database available in a range of sizes and as both a standard (shared) and premium (non-shared) service. Entity Framework is an object-relational mapper that enables .NET developers to work with relational data using domain-specific objects. It eliminates the need for most of the data-access code that developers usually need to write.
### Retry mechanism

Retry support is provided when accessing SQL Database using Entity Framework 6.0 and higher through a mechanism called Connection resiliency / retry logic. The main features of the retry mechanism are:

- The primary abstraction is the IDbExecutionStrategy interface. This interface:
  - Defines synchronous and asynchronous Execute methods.
  - Defines classes that can be used directly or can be configured on a database context as a default strategy, mapped to provider name, or mapped to a provider name and server name. When configured on a context, retries occur at the level of individual database operations, of which there might be several for a given context operation.
  - Defines when to retry a failed connection, and how.
- It includes several built-in implementations of the IDbExecutionStrategy interface:
  - Default: no retrying.
  - Default for SQL Database (automatic): no retrying, but inspects exceptions and wraps them with a suggestion to use the SQL Database strategy.
  - Default for SQL Database: exponential (inherited from base class) plus SQL Database detection logic.
- It implements an exponential back-off strategy that includes randomization.
- The built-in retry classes are stateful and aren't thread-safe. However, they can be reused after the current operation is completed.
- If the specified retry count is exceeded, the results are wrapped in a new exception. It doesn't bubble up the current exception.

### Policy configuration

Retry support is provided when accessing SQL Database using Entity Framework 6.0 and higher. Retry policies are configured programmatically. The configuration can't be changed on a per-operation basis.

When configuring a strategy on the context as the default, you specify a function that creates a new strategy on demand. The following code shows how you can create a retry configuration class that extends the DbConfiguration base class.
```csharp
public class BloggingContextConfiguration : DbConfiguration
{
    public BloggingContextConfiguration()
    {
        // Set up the execution strategy for SQL Database (exponential) with 5 retries and 4 sec delay
        this.SetExecutionStrategy(
            "System.Data.SqlClient",
            () => new SqlAzureExecutionStrategy(5, TimeSpan.FromSeconds(4)));
    }
}
```

You can then specify this as the default retry strategy for all operations using the SetConfiguration method of the DbConfiguration instance when the application starts. By default, EF will automatically discover and use the configuration class.

```csharp
DbConfiguration.SetConfiguration(new BloggingContextConfiguration());
```

You can specify the retry configuration class for a context by annotating the context class with a DbConfigurationType attribute. However, if you have only one configuration class, EF will use it without the need to annotate the context.

```csharp
[DbConfigurationType(typeof(BloggingContextConfiguration))]
public class BloggingContext : DbContext
```

If you need to use different retry strategies for specific operations, or disable retries for specific operations, you can create a configuration class that allows you to suspend or swap strategies by setting a flag in the CallContext. The configuration class can use this flag to switch strategies, or disable the strategy you provide and use a default strategy. For more information, see Suspend Execution Strategy (EF6 onwards).

Another technique for using specific retry strategies for individual operations is to create an instance of the required strategy class and supply the desired settings through parameters. You then invoke its ExecuteAsync method.
```csharp
var executionStrategy = new SqlAzureExecutionStrategy(5, TimeSpan.FromSeconds(4));
var blogs = await executionStrategy.ExecuteAsync(
    async () =>
    {
        using (var db = new BloggingContext("Blogs"))
        {
            // Acquire some values asynchronously and return them
        }
    },
    new CancellationToken());
```

The simplest way to use a DbConfiguration class is to locate it in the same assembly as the DbContext class. However, this isn't appropriate when the same context is required in different scenarios, such as different interactive and background retry strategies. If the different contexts execute in separate AppDomains, you can use the built-in support for specifying configuration classes in the configuration file or set it explicitly using code. If the different contexts must execute in the same AppDomain, a custom solution will be required. For more information, see Code-Based Configuration (EF6 onwards).

The following table shows the default settings for the built-in retry policy when using EF6.

| Setting | Default value | Meaning |
| --- | --- | --- |
| Policy | Exponential | Exponential back-off. |
| MaxRetryCount | 5 | The maximum number of retries. |
| MaxDelay | 30 seconds | The maximum delay between retries. This value doesn't affect how the series of delays are computed. It only defines an upper bound. |
| DefaultCoefficient | 1 second | The coefficient for the exponential back-off computation. This value can't be changed. |
| DefaultRandomFactor | 1.1 | The multiplier used to add a random delay for each entry. This value can't be changed. |
| DefaultExponentialBase | 2 | The multiplier used to calculate the next delay. This value can't be changed. |

### Retry usage guidance

Consider the following guidelines when accessing SQL Database using EF6:

- Choose the appropriate service option (shared or premium). A shared instance may suffer longer than usual connection delays and throttling due to the usage by other tenants of the shared server.
If predictable performance and reliable low latency operations are required, consider choosing the premium option.
- A fixed interval strategy isn't recommended for use with Azure SQL Database. Instead, use an exponential back-off strategy because the service may be overloaded, and longer delays allow more time for it to recover.
- Choose a suitable value for the connection and command timeouts when defining connections. Base the timeout on both your business logic design and through testing. You may need to modify this value over time as the volumes of data or the business processes change. Too short a timeout may result in premature failures of connections when the database is busy. Too long a timeout may prevent the retry logic working correctly by waiting too long before detecting a failed connection. The value of the timeout is a component of the end-to-end latency, although you can't easily determine how many commands will execute when saving the context. You can change the default timeout by setting the CommandTimeout property of the DbContext instance.
- Entity Framework supports retry configurations defined in configuration files. However, for maximum flexibility on Azure you should consider creating the configuration programmatically within the application. The specific parameters for the retry policies, such as the number of retries and the retry intervals, can be stored in the service configuration file and used at runtime to create the appropriate policies. This allows the settings to be changed without requiring the application to be restarted.

Consider starting with the following settings for retrying operations. You can't specify the delay between retry attempts (it's fixed and generated as an exponential sequence). You can specify only the maximum values, as shown here, unless you create a custom retry strategy. These settings are general purpose, and you should monitor the operations and fine-tune the values to suit your own scenario.
| Context | Sample target E2E max latency | Retry policy | Settings | Values | How it works |
| --- | --- | --- | --- | --- | --- |
| Interactive, UI, or foreground | 2 seconds | Exponential | MaxRetryCount<br/>MaxDelay | 3<br/>750 ms | Attempt 1 - delay 0 sec<br/>Attempt 2 - delay 750 ms<br/>Attempt 3 - delay 750 ms |
| Background or batch | 30 seconds | Exponential | MaxRetryCount<br/>MaxDelay | 5<br/>12 seconds | Attempt 1 - delay 0 sec<br/>Attempt 2 - delay ~1 sec<br/>Attempt 3 - delay ~3 sec<br/>Attempt 4 - delay ~7 sec<br/>Attempt 5 - delay 12 sec |

> [!NOTE]
> The end-to-end latency targets assume the default timeout for connections to the service. If you specify longer connection timeouts, the end-to-end latency will be extended by this additional time for every retry attempt.

### Examples

The following code example defines a simple data access solution that uses Entity Framework. It sets a specific retry strategy by defining an instance of a class named BlogConfiguration that extends DbConfiguration.

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.SqlServer;
using System.Threading.Tasks;

namespace RetryCodeSamples
{
    public class BlogConfiguration : DbConfiguration
    {
        public BlogConfiguration()
        {
            // Set up the execution strategy for SQL Database (exponential) with 5 retries and 12 sec delay.
            // These values could be loaded from configuration rather than being hard-coded.
            this.SetExecutionStrategy(
                "System.Data.SqlClient",
                () => new SqlAzureExecutionStrategy(5, TimeSpan.FromSeconds(12)));
        }
    }

    // Specify the configuration type if more than one has been defined.
    [DbConfigurationType(typeof(BlogConfiguration))]
    public class BloggingContext : DbContext
    {
        // Definition of content goes here.
    }

    class EF6CodeSamples
    {
        public async static Task Samples()
        {
            // Execution strategy configured by DbConfiguration subclass, discovered automatically
            // or explicitly indicated through configuration or with an attribute. Default is no retries.
            using (var db = new BloggingContext("Blogs"))
            {
                // Add, edit, delete blog items here, then:
                await db.SaveChangesAsync();
            }
        }
    }
}
```

More examples of using the Entity Framework retry mechanism can be found in Connection resiliency retry logic.

## SQL Database using Entity Framework Core

Entity Framework Core is an object-relational mapper that enables .NET Core developers to work with data using domain-specific objects. It eliminates the need for most of the data-access code that developers usually need to write. This version of Entity Framework was written from the ground up, and doesn't automatically inherit all the features from EF6.x.

### Retry mechanism

Retry support is provided when accessing SQL Database using Entity Framework Core through a mechanism called connection resiliency. Connection resiliency was introduced in EF Core 1.1.0.

The primary abstraction is the IExecutionStrategy interface. The execution strategy for SQL Server, including SQL Azure, is aware of the exception types that can be retried and has sensible defaults for maximum retries, delay between retries, and so on.

### Examples

The following code enables automatic retries when configuring the DbContext object, which represents a session with the database.

```csharp
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder
        .UseSqlServer(
            @"Server=(localdb)\mssqllocaldb;Database=EFMiscellaneous.ConnectionResiliency;Trusted_Connection=True;",
            options => options.EnableRetryOnFailure());
}
```

The following code shows how to execute a transaction with automatic retries, by using an execution strategy. The transaction is defined in a delegate. If a transient failure occurs, the execution strategy will invoke the delegate again.
```csharp
using (var db = new BloggingContext())
{
    var strategy = db.Database.CreateExecutionStrategy();
    strategy.Execute(() =>
    {
        using (var transaction = db.Database.BeginTransaction())
        {
            db.Blogs.Add(new Blog { Url = "https://blogs.msdn.com/dotnet" });
            db.SaveChanges();

            db.Blogs.Add(new Blog { Url = "https://blogs.msdn.com/visualstudio" });
            db.SaveChanges();

            transaction.Commit();
        }
    });
}
```

Azure Storage

Azure Storage services include blob storage, file storage, and storage queues.

Blobs, Queues, and Files

The ClientOptions class is the base type for all client option types and exposes various common client options such as Diagnostics, Retry, and Transport. To provide the client configuration options for connecting to Azure Queue, Blob, and File Storage, you must use the corresponding derived type. In the next example, you use the QueueClientOptions class (derived from ClientOptions) to configure a client to connect to the Azure Queue service. The Retry property is the set of options that can be specified to influence how retry attempts are made, and how a failure is eligible to be retried.
```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Identity;
using Azure.Storage;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

namespace RetryCodeSamples
{
    class AzureStorageCodeSamples
    {
        public async static Task Samples()
        {
            // Provide the client configuration options for connecting to Azure Queue Storage.
            QueueClientOptions queueClientOptions = new QueueClientOptions()
            {
                Retry = {
                    Delay = TimeSpan.FromSeconds(2),     // The delay between retry attempts for a fixed approach,
                                                         // or the delay on which to base calculations for a
                                                         // backoff-based approach.
                    MaxRetries = 5,                      // The maximum number of retry attempts before giving up.
                    Mode = RetryMode.Exponential,        // The approach to use for calculating retry delays.
                    MaxDelay = TimeSpan.FromSeconds(10)  // The maximum permissible delay between retry attempts.
                },

                // If the GeoRedundantSecondaryUri property is set, the secondary Uri will be used for
                // GET or HEAD requests during retries.
                // If the status of the response from the secondary Uri is a 404, then subsequent retries
                // for the request will not use the secondary Uri again, as this indicates that the resource
                // may not have propagated there yet. Otherwise, subsequent retries will alternate back and
                // forth between primary and secondary Uri.
                GeoRedundantSecondaryUri = new Uri("https://...")
            };

            Uri queueServiceUri = new Uri("https://storageaccount.queue.core.windows.net");
            string accountName = "Storage account name";
            string accountKey = "storage account key";

            // Create a client object for the Queue service, including QueueClientOptions.
            QueueServiceClient serviceClient = new QueueServiceClient(queueServiceUri, new DefaultAzureCredential(), queueClientOptions);

            CancellationTokenSource source = new CancellationTokenSource();
            CancellationToken cancellationToken = source.Token;

            // Return an async collection of queues in the storage account.
            var queues = serviceClient.GetQueuesAsync(QueueTraits.None, null, cancellationToken);
        }
    }
}
```

Table support

> [!NOTE]
> The WindowsAzure.Storage and Microsoft.Azure.Cosmos.Table NuGet packages have been deprecated. For Azure Table support, see the Azure.Data.Tables NuGet package.

Retry mechanism

The client library is based on the Azure Core library, which provides cross-cutting services to other client libraries. There are many reasons why failure can occur when a client application attempts to send a network request to a service. Some examples are timeouts, network infrastructure failures, the service rejecting the request due to throttling or being busy, the service instance terminating due to service scale-down, the service instance going down to be replaced with another version, the service crashing due to an unhandled exception, and so on. By offering a built-in retry mechanism (with a default configuration the consumer can override), the SDKs and the consumer's application become resilient to these kinds of failures. Note that some services charge real money for each request, so consumers should be able to disable retries entirely if they prefer to save money over resiliency.

Policy configuration

Retry policies are configured programmatically. The configuration is based on the RetryOptions class. The Retry property on TableClientOptions is inherited from ClientOptions:

```csharp
var tableClientOptions = new TableClientOptions();
tableClientOptions.Retry.Mode = RetryMode.Exponential;
tableClientOptions.Retry.MaxRetries = 5;
var serviceClient = new TableServiceClient(connectionString, tableClientOptions);
```

The following tables show the possibilities for the built-in retry policies.
| RetryOption setting | Meaning |
| --- | --- |
| Delay | The delay between retry attempts for a fixed approach, or the delay on which to base calculations for a backoff-based approach. If the service provides a Retry-After response header, the next retry will be delayed by the duration specified by the header value. |
| MaxDelay | The maximum permissible delay between retry attempts when the service does not provide a Retry-After response header. If the service provides a Retry-After response header, the next retry will be delayed by the duration specified by the header value. |
| Mode | The approach to use for calculating retry delays. |
| NetworkTimeout | The timeout applied to an individual network operation. |

| RetryMode setting | Meaning |
| --- | --- |
| Exponential | Retry attempts will delay based on a backoff strategy, where each attempt will increase the duration that it waits before retrying. |
| Fixed | Retry attempts happen at fixed intervals; each delay is a consistent duration. |

Telemetry

The simplest way to see the logs is to enable console logging. To create an Azure SDK log listener that outputs messages to the console, use the AzureEventSourceListener.CreateConsoleLogger method.

```csharp
// Set up a listener to monitor logged events.
using AzureEventSourceListener listener = AzureEventSourceListener.CreateConsoleLogger();
```

Examples

Executing the following code example with the storage emulator shut down will allow us to see information about retries in the console.
```csharp
using System;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Core.Diagnostics;
using Azure.Data.Tables;
using Azure.Data.Tables.Models;

namespace RetryCodeSamples
{
    class AzureStorageCodeSamples
    {
        private const string connectionString = "UseDevelopmentStorage=true";
        private const string tableName = "RetryTestTable";

        public async static Task SamplesAsync()
        {
            // Set up a listener to monitor logged events.
            using AzureEventSourceListener listener = AzureEventSourceListener.CreateConsoleLogger();

            var tableClientOptions = new TableClientOptions();
            tableClientOptions.Retry.Mode = RetryMode.Exponential;
            tableClientOptions.Retry.MaxRetries = 5;
            var serviceClient = new TableServiceClient(connectionString, tableClientOptions);

            TableItem table = await serviceClient.CreateTableIfNotExistsAsync(tableName);
            Console.WriteLine($"The created table's name is {table.Name}.");
        }
    }
}
```

General REST and retry guidelines

Consider the following when accessing Azure or third-party services:

- Use a systematic approach to managing retries, perhaps as reusable code, so that you can apply a consistent methodology across all clients and all solutions.
- Consider using a retry framework such as Polly to manage retries if the target service or client has no built-in retry mechanism. This will help you implement a consistent retry behavior, and it may provide a suitable default retry strategy for the target service. However, you may need to create custom retry code for services that have nonstandard behavior, that do not rely on exceptions to indicate transient failures, or if you want to use a Retry-After response to manage retry behavior.
- The transient detection logic will depend on the actual client API you use to invoke the REST calls. Some clients, such as the newer HttpClient class, won't throw exceptions for completed requests with a non-success HTTP status code. The HTTP status code returned from the service can help to indicate whether the failure is transient.
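The status-code heuristic described here can be sketched in a few lines. The following is a hedged, language-agnostic illustration, not part of any Azure SDK; the function names and the exact code set are assumptions drawn from the codes commonly treated as retryable.

```python
from typing import Optional

# Hypothetical helper: status codes for which a retry is typically appropriate.
TRANSIENT_STATUS_CODES = {408, 429, 500, 502, 503, 504}

def is_transient(status_code: int) -> bool:
    """True if the status code usually indicates a retryable, transient failure."""
    return status_code in TRANSIENT_STATUS_CODES

def next_retry_delay(status_code: int, headers: dict, default_delay: float) -> Optional[float]:
    """Delay (in seconds) before the next retry, or None when a retry is inappropriate.

    A Retry-After header from the service, when present, overrides the
    client's own computed delay.
    """
    if not is_transient(status_code):
        return None
    retry_after = headers.get("Retry-After")
    if retry_after is not None:
        return float(retry_after)
    return default_delay
```

For example, a 429 response carrying `Retry-After: 7` would be retried after 7 seconds, while a 404 would not be retried at all.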
You may need to examine the exceptions generated by a client or the retry framework to access the status code or to determine the equivalent exception type. The following HTTP codes typically indicate that a retry is appropriate:

- 408 Request Timeout
- 429 Too Many Requests
- 500 Internal Server Error
- 502 Bad Gateway
- 503 Service Unavailable
- 504 Gateway Timeout

If you base your retry logic on exceptions, the following typically indicate a transient failure where no connection could be established:

- WebExceptionStatus.ConnectionClosed
- WebExceptionStatus.ConnectFailure
- WebExceptionStatus.Timeout
- WebExceptionStatus.RequestCanceled

In the case of a service unavailable status, the service might indicate the appropriate delay before retrying in the Retry-After response header or a different custom header. Services might also send additional information as custom headers, or embedded in the content of the response.

Don't retry for status codes representing client errors (errors in the 4xx range), except for 408 Request Timeout and 429 Too Many Requests.

Thoroughly test your retry strategies and mechanisms under a range of conditions, such as different network states and varying system loads.

Retry strategies

The following are the typical types of retry strategy intervals:

Exponential. A retry policy that performs a specified number of retries, using a randomized exponential back-off approach to determine the interval between retries. For example:

```csharp
var random = new Random();

var delta = (int)((Math.Pow(2.0, currentRetryCount) - 1.0) *
    random.Next((int)(this.deltaBackoff.TotalMilliseconds * 0.8),
        (int)(this.deltaBackoff.TotalMilliseconds * 1.2)));

var interval = (int)Math.Min(checked(this.minBackoff.TotalMilliseconds + delta),
    this.maxBackoff.TotalMilliseconds);

retryInterval = TimeSpan.FromMilliseconds(interval);
```

Incremental. A retry strategy with a specified number of retry attempts and an incremental time interval between retries.
For example:

```csharp
retryInterval = TimeSpan.FromMilliseconds(this.initialInterval.TotalMilliseconds +
    (this.increment.TotalMilliseconds * currentRetryCount));
```

LinearRetry. A retry policy that performs a specified number of retries, using a specified fixed time interval between retries. For example:

```csharp
retryInterval = this.deltaBackoff;
```

Transient fault handling with Polly

Polly is a library to programmatically handle retries and circuit-breaker strategies. The Polly project is a member of the .NET Foundation. For services where the client doesn't natively support retries, Polly is a valid alternative that avoids the need to write custom retry code, which can be hard to implement correctly. Polly also provides a way to trace errors when they occur, so that you can log retries.

Next steps

- Connection resiliency
- Data Points - EF Core 1.1
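As a cross-language aside, the three C# interval formulas above can be transcribed into one short sketch for comparison. This is a hedged illustration: the parameter names mirror the C# fields, and the concrete values in the usage note below are invented.

```python
import random

def exponential_interval(current_retry_count, min_backoff_ms, max_backoff_ms,
                         delta_backoff_ms, rng=random):
    """Randomized exponential back-off: a jittered (2^n - 1) multiple of the
    base delta, added to the minimum back-off and capped at the maximum."""
    delta = (2.0 ** current_retry_count - 1.0) * rng.uniform(
        0.8 * delta_backoff_ms, 1.2 * delta_backoff_ms)
    return min(min_backoff_ms + delta, max_backoff_ms)

def incremental_interval(current_retry_count, initial_interval_ms, increment_ms):
    """Incremental: a fixed increment added for each successive attempt."""
    return initial_interval_ms + increment_ms * current_retry_count

def linear_interval(delta_backoff_ms):
    """Linear (fixed): the same interval between every pair of retries."""
    return delta_backoff_ms
```

With an illustrative 500 ms initial interval and 250 ms increment, the incremental strategy waits 1250 ms before the fourth attempt (retry count 3), while the linear strategy always waits its fixed delta.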
Customer activities required

Pre-incident

For Azure services:
- Be familiar with Azure Service Health in the Azure portal. This page acts as the "one-stop shop" during an incident.
- Consider using Service Health alerts, which can be configured to automatically produce notifications when Azure incidents occur.

For Power BI:
- Be familiar with Service Health in the Microsoft 365 admin center. This page acts as the "one-stop shop" during an incident.
- Consider using the Microsoft 365 Admin mobile app to get automatic service incident alert notifications.

During the incident

For Azure services:
- Azure Service Health within the Azure portal will provide the latest updates.
- If there are issues accessing Service Health, refer to the Azure Status page.
- If there are issues accessing the Status page, go to @AzureSupport on X (formerly Twitter).
- If the impact/issues don't match the incident (or persist after mitigation), contact support to raise a service support ticket.

For Power BI:
- The Service Health page within the Microsoft 365 admin center will provide the latest updates.
- If there are issues accessing Service Health, refer to the Microsoft 365 status page.
- If the impact/issues don't match the incident (or if issues persist after mitigation), raise a service support ticket.

Post Microsoft recovery

See the sections below for this detail.

Post incident

For Azure services, Microsoft will publish a post-incident review (PIR) to the Azure portal - Service Health for review. For Power BI, Microsoft will publish a PIR to Microsoft 365 Admin - Service Health for review.

Wait for Microsoft process

The "Wait for Microsoft" process is simply waiting for Microsoft to recover all components and services in the impacted primary region. Once recovered, validate the binding of the data platform to enterprise shared or other services, the date of the dataset, and then execute the processes of bringing the system up to the current date.
Once this process has been completed, technical and business subject matter expert (SME) validation can be completed, enabling stakeholder approval for the service recovery.

Redeploy on disaster

For a "Redeploy on Disaster" strategy, the following high-level process flow can be described.

Recover Contoso enterprise shared services and source systems:
- This step is a prerequisite to the recovery of the data platform.
- This step would be completed by the various Contoso operational support groups responsible for the enterprise shared services and operational source systems.

Recover Azure services:
- Azure services refers to the applications and services that make up the Azure Cloud offering, which are available within the secondary region for deployment.
- This step is a prerequisite to the recovery of the data platform.
- This step would be completed by Microsoft and other platform as a service (PaaS)/software as a service (SaaS) partners.

Recover the data platform foundation:
- This step is the entry point for the platform recovery activities.
- For the redeployment strategy, each required component/service would be procured and deployed into the secondary region.
- See the Azure Service and Component section in this series for a detailed breakdown of the components and deployment strategies.
- This process should also include activities like binding to the enterprise shared services, ensuring connectivity to access/authentication, and validating that log offloading is working, while also ensuring connectivity to both upstream and downstream processes.
- Data/processing should be confirmed.
For example, validation of the timestamp of the recovered platform:
- If there are questions about data integrity, the decision could be made to roll back further in time before executing the new processing to bring the platform up to date.
- Having a priority order for processes (based upon business impact) will help in orchestrating the recovery.
- This step should be closed out by technical validation unless business users directly interact with the services. If there is direct access, there will need to be a business validation step.
- Once validation has been completed, a handover to the individual solution teams to start their own disaster recovery (DR) process happens.
- This handover should include confirmation of the current timestamp of the data/processes.
- If core enterprise data processes are going to be executed, the individual solutions should be made aware of this (inbound/outbound flows, for example).

Recover the individual solutions hosted by the platform:
- Each individual solution should have its own DR runbook. The runbooks should at least contain the nominated business stakeholders who will test and confirm that service recovery has been completed.
- Depending on resource contention or priority, key solutions/workloads may be prioritized over others (core enterprise processes over ad hoc labs, for example).
- Once the validation steps have been completed, a handover to the downstream solutions to start their DR recovery process happens.

Handover to downstream, dependent systems:
- Once the dependent services have been recovered, the E2E DR recovery process is complete.

> [!NOTE]
> While it's theoretically possible to completely automate an E2E DR process, it's unlikely given the risk of the event vs. the cost of the SDLC activities required to cover the E2E process.

Fallback to the primary region

Fallback is the process of moving the data platform service and its data back to the primary region, once it's available for BAU.
Depending on the nature of the source systems and various data processes, fallback of the data platform could be done independently of other parts of the data ecosystem. Customers are advised to review their own data platform's dependencies (both upstream and downstream) to make the appropriate decision. The following section assumes an independent recovery of the data platform:

- Once all required components/services have become available in the primary region, customers would complete a smoke test to validate the Microsoft recovery.
- Component/service configuration would be validated. Deltas would be addressed via redeployment from source control.
- The system date in the primary region would be established across stateful components. The delta between the established date and the date/timestamp in the secondary region should be addressed by re-executing or replaying the data ingestion processes from that point forward.
- With approval from both business and technical stakeholders, a fallback window would be selected, ideally during a lull in system activity and processing.
- During the fallback, the primary region would be brought into sync with the secondary region before the system was switched over.
- After a period of parallel running, the secondary region would be taken offline from the system.
- The components in the secondary region would either be dropped or stripped back, depending on the DR strategy selected.

Warm spare process

For a "Warm Spare" strategy, the high-level process flow is closely aligned to that of "Redeploy on Disaster", the key difference being that components have already been procured in the secondary region. This strategy eliminates the risk of resource contention from other organizations looking to complete their own DR in that region.
Hot spare process

The "Hot Spare" strategy means that the platform services, including PaaS and infrastructure as a service (IaaS) systems, will persist despite the disaster event, as the secondary systems run in tandem with the primary systems. As with the "Warm Spare" strategy, this strategy eliminates the risk of resource contention from other organizations looking to complete their own DR in that region.

Hot Spare customers would monitor the Microsoft recovery of components/services in the primary region. Once completed, customers would validate the primary region systems and complete the fallback to the primary region. This process would be similar to the DR failover process, that is, checking the available codebase and data, and redeploying as required.

> [!NOTE]
> A special note here should be made to ensure that any system metadata is consistent between the two regions.

Once fallback to the primary region has been completed, the system load balancers can be updated to bring the primary region back into the system topology. If available, a canary release approach can be used to incrementally switch the primary region on for the system.

DR plan structure

An effective DR plan presents a step-by-step guide for service recovery that can be executed by an Azure technical resource. As such, the following lists a proposed MVP structure for a DR plan.

Process requirements:
- Any customer DR process-specific detail, such as the correct authorization required to start DR and to make key decisions about the recovery as necessary (including the "definition of done"), the service support DR ticketing reference, and war room details.
- Resource confirmation, including the DR lead and executor backup. All resources should be documented with primary and secondary contacts, escalation paths, and leave calendars.
- In critical DR situations, roster systems may need to be considered.
- Laptop, power packs and/or backup power, network connectivity, and mobile phone details for the DR executor, DR backup, and any escalation points.
- The process to be followed if any of the process requirements aren't met.

Contact listing:
- DR leadership and support groups.
- Business SMEs who will complete the test/review cycle for the technical recovery.
- Impacted business owners, including the service recovery approvers.
- Impacted technical owners, including the technical recovery approvers.
- SME support across all impacted areas, including key solutions hosted by the platform.

Impact:
- Downstream systems - operational support.
- Upstream source systems - operational support.
- Enterprise shared services contacts. For example, access/authentication support, security monitoring, and gateway support.
- Any external or third-party vendors, including support contacts for cloud providers.

Architecture design:
- Describe the end-to-end (E2E) scenario detail, and attach all associated support documentation.

Dependencies:
- List out all the components' relationships and dependencies.

DR prerequisites:
- Confirmation that upstream source systems are available as required.
- Elevated access across the stack has been granted to the DR executor resources.
- Azure services are available as required.
- The process to be followed if any of the prerequisites haven't been met.

Technical recovery - step-by-step instructions:
- Run order.
- Step description.
- Step prerequisite.
- Detailed process steps for each discrete action, including URLs.
- Validation instructions, including the evidence required.
- Expected time to complete each step, including contingency.
- The process to be followed if the step fails.
- The escalation points in the case of failure or SME support.

Technical recovery - post requisites:
- Confirm the current date/timestamp of the system across key components.
- Confirm the DR system URLs & IPs.
- Prepare for the business stakeholder review process, including confirmation of
systems access and the business SMEs completing the validation and approval.

Business stakeholder review and approval:
- Business resource contact details.
- The business validation steps as per the technical recovery above.
- The evidence trail required from the business approver signing off the recovery.

Recovery post requisites:
- Handover to operational support to execute the data processes to bring the system up to date.
- Handover to the downstream processes and solutions, confirming the date and connection details of the DR system.
- Confirm the recovery process is complete with the DR lead, confirming the evidence trail and completed runbook.
- Notify security administration that elevated access privileges can be removed from the DR team.

Callouts:
- It's recommended to include system screenshots of each process step. These screenshots will help address the dependency on system SMEs to complete the tasks.
- To mitigate the risk from quickly evolving cloud services, the DR plan should be regularly revisited, tested, and executed by resources with current knowledge of Azure and its services.
- The technical recovery steps should reflect the priority of the component and solution to the organization. For example, core enterprise data flows are recovered before ad hoc data analysis labs.
- The technical recovery steps should follow the order of the workflows (typically left to right), once foundation components/services like Key Vault have been recovered. This strategy will ensure upstream dependencies are available and components can be appropriately tested.
- Once the step-by-step plan has been completed, a total time for activities with contingency should be obtained. If this total is over the agreed recovery time objective (RTO), there are several options available:
  - Automate selected recovery processes (where possible).
  - Look for opportunities to run selected recovery steps in parallel (where possible). However, note that this strategy may require additional DR executor resources.
  - Uplift key components to higher service tiers such as PaaS, where Microsoft takes greater responsibility for service recovery activities.
  - Extend the RTO with stakeholders.

DR testing

The nature of the Azure Cloud service offering results in constraints for any DR testing scenarios. Therefore, the guidance is to stand up a DR subscription with the data platform components as they would be available in the secondary region.

From this baseline, the DR plan runbook can be selectively executed, paying specific attention to the services and components that can be deployed and validated. This process will require a curated test dataset, enabling confirmation of the technical and business validation checks as per the plan.

A DR plan should be tested regularly, not only to ensure that it's up to date, but also to build "muscle memory" for the teams performing failover and recovery activities. Data and configuration backups should also be regularly tested to ensure they are "fit for purpose" to support any recovery activities.

The key area to focus on during a DR test is to ensure the prescriptive steps are still correct and the estimated timings are still relevant. If the instructions reflect the portal screens rather than code, the instructions should be validated at least every 12 months due to the cadence of change in the cloud.

While the aspiration is to have a fully automated DR process, full automation may be unlikely due to the rarity of the event. Therefore, it's recommended to establish the recovery baseline with the desired state configuration (DSC) infrastructure as code (IaC) used to deliver the platform, and then uplift as new projects build upon the baseline. Over time, as components and services are extended, an NFR should be enforced, requiring the production deployment pipeline to be refactored to provide coverage for DR.
If your runbook timings exceed your RTO, there are several options:

- Extend the RTO with stakeholders.
- Lower the time required for the recovery activities via automation, running tasks in parallel, or migrating to higher cloud service tiers.

Azure Chaos Studio

Azure Chaos Studio is a managed service for improving resilience by injecting faults into your Azure applications. Chaos Studio enables you to orchestrate fault injection on your Azure resources in a safe and controlled way, using experiments. See the product documentation for a description of the fault types currently supported.

The current iteration of Chaos Studio covers only a subset of Azure components and services. Until more fault libraries are added, Chaos Studio is a recommended approach for isolated resiliency testing rather than full-system DR testing. More information on Chaos Studio can be found in the product documentation.

Azure Site Recovery

For IaaS components, Azure Site Recovery will protect most workloads running on a supported VM or physical server. There is strong guidance for:

- Executing an Azure VM disaster recovery drill
- Executing a DR failover to a secondary region
- Executing a DR fallback to the primary region
- Enabling automation of a DR plan

Related resources

- Architecting for resiliency and availability
- Business continuity and disaster recovery
- Backup and disaster recovery for Azure applications
- Recover from the loss of an Azure region
- Resiliency in Azure
- Business continuity management in Azure
- Service-level agreements (SLAs)

Summary

- Azure Status
- Azure DevOps Status
- Five Best Practices to Anticipate Failure

Next steps

Now that you've learned how to deploy the scenario, you can read a summary of the DR for Azure data platform series.

Related resources

- DR for Azure Data Platform - Overview
- DR for Azure Data Platform - Architecture
- DR for Azure Data Platform - Scenario details
- DR for Azure Data Platform - Recommendations
Online analytical processing (OLAP) is a technology that organizes large business databases and supports complex analysis. It can be used to perform complex analytical queries without negatively affecting transactional systems.

The databases that a business uses to store all its transactions and records are called online transaction processing (OLTP) databases. These databases usually have records that are entered one at a time. Often they contain a great deal of information that is valuable to the organization. The databases that are used for OLTP, however, were not designed for analysis. Therefore, retrieving answers from these databases is costly in terms of time and effort. OLAP systems were designed to help extract this business intelligence information from the data in a highly performant way, because OLAP databases are optimized for heavy read, low write workloads.

Semantic modeling

A semantic data model is a conceptual model that describes the meaning of the data elements it contains. Organizations often have their own terms for things, sometimes with synonyms, or even different meanings for the same term. For example, an inventory database might track a piece of equipment with an asset ID and a serial number, but a sales database might refer to the serial number as the asset ID. There is no simple way to relate these values without a model that describes the relationship.

Semantic modeling provides a level of abstraction over the database schema, so that users don't need to know the underlying data structures. This makes it easier for end users to query data without performing aggregates and joins over the underlying schema. Also, columns are usually renamed to more user-friendly names, so that the context and meaning of the data are more obvious.

Semantic modeling is predominantly used for read-heavy scenarios, such as analytics and business intelligence (OLAP), as opposed to more write-heavy transactional data processing (OLTP).
This is mostly due to the nature of a typical semantic layer:

- Aggregation behaviors are set so that reporting tools display them properly.
- Business logic and calculations are defined.
- Time-oriented calculations are included.
- Data is often integrated from multiple sources.

Traditionally, the semantic layer is placed over a data warehouse for these reasons.

There are two primary types of semantic models:

- Tabular. Uses relational modeling constructs (model, tables, columns). Internally, metadata is inherited from OLAP modeling constructs (cubes, dimensions, measures). Code and script use OLAP metadata.
- Multidimensional. Uses traditional OLAP modeling constructs (cubes, dimensions, measures).

Relevant Azure service: Azure Analysis Services

Example use case

An organization has data stored in a large database. It wants to make this data available to business users and customers to create their own reports and do some analysis. One option is just to give those users direct access to the database. However, there are several drawbacks to doing this, including managing security and controlling access. Also, the design of the database, including the names of tables and columns, may be hard for a user to understand. Users would need to know which tables to query, how those tables should be joined, and other business logic that must be applied to get the correct results. Users would also need to know a query language like SQL even to get started. Typically this leads to multiple users reporting the same metrics but with different results.

Another option is to encapsulate all of the information that users need into a semantic model. The semantic model can be more easily queried by users with a reporting tool of their choice. The data provided by the semantic model is pulled from a data warehouse, ensuring that all users see a single version of the truth.
The semantic model also provides friendly table and column names, relationships between tables, descriptions, calculations, and row-level security.

## Typical traits of semantic modeling

Semantic modeling and analytical processing tends to have the following traits:

| Requirement | Description |
| --- | --- |
| Schema | Schema on write, strongly enforced |
| Uses transactions | No |
| Locking strategy | None |
| Updateable | No (typically requires recomputing cube) |
| Appendable | No (typically requires recomputing cube) |
| Workload | Heavy reads, read-only |
| Indexing | Multidimensional indexing |
| Datum size | Small to medium sized |
| Model | Multidimensional |
| Data shape | Cube or star/snowflake schema |
| Query flexibility | Highly flexible |
| Scale | Large (10s-100s GBs) |

## When to use this solution

Consider OLAP in the following scenarios:

- You need to execute complex analytical and ad hoc queries rapidly, without negatively affecting your OLTP systems.
- You want to provide business users with a simple way to generate reports from your data.
- You want to provide a number of aggregations that will allow users to get fast, consistent results.

OLAP is especially useful for applying aggregate calculations over large amounts of data. OLAP systems are optimized for read-heavy scenarios, such as analytics and business intelligence. OLAP allows users to segment multi-dimensional data into slices that can be viewed in two dimensions (such as a pivot table) or filter the data by specific values. This process is sometimes called "slicing and dicing" the data, and can be done regardless of whether the data is partitioned across several data sources. This helps users find trends, spot patterns, and explore the data without having to know the details of traditional data analysis.

Semantic models can help business users abstract relationship complexities and make it easier to analyze data quickly.
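"Slicing and dicing" can be sketched in a few lines of plain Python. The regions, products, and sales figures below are invented; a real OLAP engine performs this over pre-aggregated cubes at far larger scale, but the operations are the same: fix one dimension to a single value (a slice), then view the result across two remaining dimensions, like a pivot table.

```python
from collections import defaultdict

# A small fact set with three dimensions (region, product, year) and one measure.
facts = [
    {"region": "East", "product": "Widget", "year": 2023, "sales": 100},
    {"region": "East", "product": "Widget", "year": 2024, "sales": 120},
    {"region": "West", "product": "Widget", "year": 2024, "sales": 80},
    {"region": "West", "product": "Gadget", "year": 2024, "sales": 60},
]

# "Slice": fix one dimension to a single value (year = 2024).
slice_2024 = [f for f in facts if f["year"] == 2024]

# View the slice in two dimensions, like a pivot table: region x product.
pivot = defaultdict(int)
for f in slice_2024:
    pivot[(f["region"], f["product"])] += f["sales"]

print(dict(pivot))
# {('East', 'Widget'): 120, ('West', 'Widget'): 80, ('West', 'Gadget'): 60}
```

Filtering on a specific member value instead of a whole dimension is the "dice"; either way the user never needs to know how the facts are stored or partitioned.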
## Challenges

For all the benefits OLAP systems provide, they do produce a few challenges:

- Whereas data in OLTP systems is constantly updated through transactions flowing in from various sources, OLAP data stores are typically refreshed at much slower intervals, depending on business needs. This means OLAP systems are better suited for strategic business decisions than for immediate responses to changes. Also, some level of data cleansing and orchestration needs to be planned to keep the OLAP data stores up to date.
- Unlike traditional, normalized, relational tables found in OLTP systems, OLAP data models tend to be multidimensional. This makes it difficult or impossible to directly map to entity-relationship or object-oriented models, where each attribute is mapped to one column. Instead, OLAP systems typically use a star or snowflake schema in place of traditional normalization.

## OLAP in Azure

In Azure, data held in OLTP systems such as Azure SQL Database is copied into an OLAP system, such as Azure Analysis Services. Data exploration and visualization tools like Power BI, Excel, and third-party options connect to Analysis Services servers and provide users with highly interactive and visually rich insights into the modeled data. The flow of data from OLTP to OLAP is typically orchestrated using SQL Server Integration Services, which can be executed using Azure Data Factory.

In Azure, all of the following data stores will meet the core requirements for OLAP:

- SQL Server with Columnstore indexes
- Azure Analysis Services
- SQL Server Analysis Services (SSAS)

SQL Server Analysis Services (SSAS) offers OLAP and data mining functionality for business intelligence applications. You can either install SSAS on local servers or host it within a virtual machine in Azure. Azure Analysis Services is a fully managed service that provides the same major features as SSAS.
Azure Analysis Services supports connecting to various data sources in the cloud and on-premises in your organization.

Clustered Columnstore indexes are available in SQL Server 2014 and above, as well as Azure SQL Database, and are ideal for OLAP workloads. However, beginning with SQL Server 2016 (including Azure SQL Database), you can take advantage of hybrid transactional/analytical processing (HTAP) through the use of updateable nonclustered columnstore indexes. HTAP enables you to perform OLTP and OLAP processing on the same platform, which removes the need to store multiple copies of your data and eliminates the need for distinct OLTP and OLAP systems. For more information, see Get started with Columnstore for real-time operational analytics.

## Key selection criteria

To narrow the choices, start by answering these questions:

- Do you want a managed service rather than managing your own servers?
- Do you require secure authentication using Microsoft Entra ID?
- Do you want to conduct real-time analytics? If so, narrow your options to those that support real-time analytics. Real-time analytics in this context applies to a single data source, such as an enterprise resource planning (ERP) application, that will run both an operational and an analytics workload. If you need to integrate data from multiple sources, or require extreme analytics performance by using pre-aggregated data such as cubes, you might still require a separate data warehouse.
- Do you need to use pre-aggregated data, for example to provide semantic models that make analytics more business user friendly? If yes, choose an option that supports multidimensional cubes or tabular semantic models. Providing aggregates can help users consistently calculate data aggregates. Pre-aggregated data can also provide a large performance boost when dealing with several columns across many rows. Data can be pre-aggregated in multidimensional cubes or tabular semantic models.
- Do you need to integrate data from several sources, beyond your OLTP data store? If so, consider options that easily integrate multiple data sources.

## Capability matrix

The following tables summarize the key differences in capabilities.

### General capabilities

| Capability | Azure Analysis Services | SQL Server Analysis Services | SQL Server with Columnstore Indexes | Azure SQL Database with Columnstore Indexes |
| --- | --- | --- | --- | --- |
| Is managed service | Yes | No | No | Yes |
| Supports multidimensional cubes | No | Yes | No | No |
| Supports tabular semantic models | Yes | Yes | No | No |
| Easily integrate multiple data sources | Yes | Yes | No [1] | No [1] |
| Supports real-time analytics | No | No | Yes | Yes |
| Requires process to copy data from source(s) | Yes | Yes | No | No |
| Microsoft Entra integration | Yes | No | No [2] | Yes |

[1] Although SQL Server and Azure SQL Database cannot be used to query from and integrate multiple external data sources, you can still build a pipeline that does this for you using SSIS or Azure Data Factory. SQL Server hosted in an Azure VM has additional options, such as linked servers and PolyBase. For more information, see Pipeline orchestration, control flow, and data movement.

[2] Connecting to SQL Server running on an Azure Virtual Machine is not supported using a Microsoft Entra account. Use a domain Active Directory account instead.

### Scalability capabilities

| Capability | Azure Analysis Services | SQL Server Analysis Services | SQL Server with Columnstore Indexes | Azure SQL Database with Columnstore Indexes |
| --- | --- | --- | --- | --- |
| Redundant regional servers for high availability | Yes | No | Yes | Yes |
| Supports query scale out | Yes | No | Yes | Yes |
| Dynamic scalability (scale up) | Yes | No | Yes | Yes |

## Contributors

This article is maintained by Microsoft. It was originally written by the following contributors.
Principal author:

- Zoiner Tejada | CEO and Architect

## Next steps

- Columnstore indexes: Overview
- Create an Analysis Services server
- What is Azure Data Factory?
- What is Power BI?

## Related resources

- Big data architecture style
- Online analytical processing (OLAP)
Enterprise systems can have multiple sources of master data—the common data that's shared across systems. This fact can become apparent when you catalog data sources. Examples of master data include customer, product, location, asset, and vendor data.

When you use Profisee to merge, validate, and correct your master data, you can make that data effective. Specifically, you can use it to build a common trusted platform for analytics and operational improvement. By using the governance definitions, insights, and expertise that are detailed in Microsoft Purview, you can build your platform effectively.

This reference architecture presents a governance and data management solution that features Microsoft Purview and the Profisee master data management (MDM) platform. These services work together to provide a foundation of high-quality, trusted data that maximizes the business value of data in Azure. For a short video about this solution, see The power of fully integrated master data management in Azure.

## Architecture

The following diagram shows the steps that you take when you develop and operate your master data solution. Think of these steps as highly iterative. As your solution evolves, you might repeat these steps and phases, sometimes automatically and sometimes manually. Whether you use automatic or manual steps depends on the changes that your master data solution, metadata, and data undergo.

:::image type="content" source="_images/microsoft-purview-microservice-design-architecture.png" alt-text="Architecture diagram of a data governance and management solution that uses Microsoft Purview and Profisee MDM in a microservice design architecture." lightbox="_images/microsoft-purview-microservice-design-architecture.png" border="false":::

Download a Visio file of this architecture.
### Dataflow

Metadata and data flow include these steps, which are shown in the preceding figure:

- **Pre-built Microsoft Purview connectors are used to build a data catalog from source business applications.** The connectors scan data sources and populate the Microsoft Purview Data Catalog.
- **The master data model is published to Microsoft Purview.** Master data entities that are created in Profisee MDM are seamlessly published to Microsoft Purview. This step further populates the Microsoft Purview Data Catalog and ensures that there's a record of this critical source of data in Microsoft Purview.
- **Governance standards and policies for data stewardship are used to enrich master data entity definitions.** The data is enriched in Microsoft Purview with data dictionary and glossary information, ownership data, and sensitive data classifications. Any definitions and metadata that are available in Microsoft Purview are visible in real time in Profisee as guidance for the MDM data stewards.
- **Master data from source systems is loaded into Profisee MDM.** A data integration toolset like Azure Data Factory extracts data from the source systems by using any of more than 100 pre-built connectors or a REST gateway. Multiple streams of master data are loaded into Profisee MDM.
- **The master data is standardized, matched, merged, enriched, and validated according to governance rules.** Other systems, like Microsoft Purview, might define data quality and governance rules. But Profisee MDM is the system that enforces these rules. Source records are matched and merged within and across source systems to create the most complete and correct record possible. Data quality rules check each record for compliance with business and technical requirements.
- **Any record that fails validation or that returns a low probability score is subject to remediation.** To remediate failed validations, a workflow process assigns records that require review to data stewards who are experts in their business data domain.
- **After a record has been verified or corrected, it's ready to use as a golden record master.**
- **Transactional data is loaded into a downstream analytics solution.** A data integration toolset like Data Factory extracts transactional data from source systems by using any of more than 100 pre-built connectors or a REST gateway. The toolset loads the data directly into an analytics data platform like Azure Synapse Analytics. Analysis on this raw information without the proper master golden data is subject to inaccuracy, because data overlaps, mismatches, and conflicts aren't yet resolved.
- **Power BI connectors provide direct access to the curated master data.** Power BI users can use the master data directly in reports. A dedicated Power BI connector recognizes and enforces role-based security. It also hides various system fields to simplify use.
- **High-quality, curated master data is published to a downstream analytics solution.** If master data records have been merged into a single golden record, parent-child links to the original records are preserved. The analytics platform has a set of data that's certified in the sense that it's complete, consistent, and accurate. That data includes properly curated master data and associated transactional data. That combination forms a solid foundation of trusted data that's available for further analysis.
- **The high-quality master data is visualized and analyzed, and machine learning models are applied.** The system delivers sound insights for driving the business.

### Components

- Microsoft Purview is a data governance solution that provides broad visibility into on-premises and cloud data estates. Microsoft Purview offers a combination of data discovery and classification, lineage, metadata search and discovery, and usage insights. All these features help you manage and understand data across your enterprise data landscape.
- Profisee MDM is a fast and intuitive MDM platform that integrates seamlessly with Microsoft technologies and the Azure data management ecosystem.
- Data Factory is a hybrid data integration service. You can use Data Factory to create, schedule, and orchestrate extract, transform, load (ETL) and extract, load, transform (ELT) workflows. Data Factory also offers more than 100 pre-built connectors and a REST gateway that you can use to extract data from source systems.
- Azure Synapse Analytics is a fast, flexible, and trusted cloud data warehouse that uses a massively parallel processing architecture. You can use Azure Synapse Analytics to scale, compute, and store data elastically and independently.
- Power BI is a suite of business analytics tools that delivers insights throughout your organization. You can use Power BI to connect to hundreds of data sources, simplify data preparation, and drive ad hoc analysis. You can also produce beautiful reports and then publish them for your organization to consume on the web and on mobile devices.

### Alternatives

If you don't have a dedicated MDM application, you can find some of the technical capabilities that you need to build an MDM solution in Azure:

- **Data quality.** When you load data into an analytics platform, you can build data quality into integration processes. For example, you can use hard-coded scripts to apply data quality transformations in a Data Factory pipeline.
- **Data standardization and enrichment.** Azure Maps can provide data verification and standardization for address data. You can use the standardized data in Azure Functions and Data Factory. To standardize other data, you might need to develop hard-coded scripts.
- **Duplicate data management.** You can use Data Factory to deduplicate rows if sufficient identifiers are available for an exact match. You likely need custom hard-coded scripts to implement the logic that's needed to merge matched rows while applying appropriate data survivorship techniques.
- **Data stewardship.** You can use Power Apps to quickly develop basic data stewardship solutions to manage data in Azure. You can also develop appropriate user interfaces for reviews, workflows, alerts, and validations.

In Microsoft-centric environments, Azure Synapse Analytics is generally preferred as an analytics service. But you can use any analytics database. Snowflake and Databricks are common choices.

## Scenario details

As the amount of data that you load into Azure increases, the need to properly govern and manage that data across all your data sources and data consumers grows. Data that seems adequate in the source system is often found to be deficient when it's shared. It might have missing or incomplete information, or duplications and conflicts. Its overall quality might be poor. What's needed is data that's complete, consistent, and accurate.

Without high-quality data in your Azure data estate, the business value of Azure is undermined, perhaps critically. The solution is to build a foundation for data governance and management that can produce and deliver a source of truth for high-quality, trusted data. Microsoft Purview and Profisee MDM work together to form this enterprise platform.

:::image type="content" source="_images/microsoft-purview-profisee-mdm-benefits.png" alt-text="Diagram that shows how Microsoft Purview and Profisee MDM transform ungoverned data into high-quality, trusted data." lightbox="_images/microsoft-purview-profisee-mdm-benefits.png" border="false":::

Microsoft Purview catalogs all your data sources and identifies any sensitive information and lineage. It gives the data architect a place to consider the appropriate data standards to impose on all data. Microsoft Purview focuses on governance to find, classify, and define policies and standards. The tasks of enforcing policies and standards, cataloging data sources, and remediating deficient data fall to technologies like MDM systems.
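This division of labor (governance defines the rules, a downstream system enforces them) can be sketched in a few lines. The rule names, field names, and sample records below are all invented for illustration; neither the Microsoft Purview nor the Profisee API is shown.

```python
# Hypothetical quality rules of the kind a governance catalog might define
# and an enforcement process (such as an MDM platform) would apply.
rules = [
    ("email_present", lambda r: bool(r.get("email"))),
    ("valid_country", lambda r: r.get("country") in {"US", "CA", "GB"}),
]

records = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": "",              "country": "US"},
    {"id": 3, "email": "c@example.com", "country": "XX"},
]

# Records that fail any rule are routed to a steward's work queue.
needs_review = []
for rec in records:
    failures = [name for name, check in rules if not check(rec)]
    if failures:
        needs_review.append((rec["id"], failures))

print(needs_review)  # [(2, ['email_present']), (3, ['valid_country'])]
```

The point is the separation: the rules are declared data, so they can be cataloged, reviewed, and approved independently of the process that enforces them.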
Profisee MDM is designed to accept master data from any source. Profisee MDM then matches, merges, standardizes, verifies, corrects, and synchronizes the data across systems. This process ensures that data can be properly integrated and that it meets the needs of downstream systems, such as business intelligence (BI) and machine learning applications. The integrative Profisee platform enforces governance standards across multiple data silos.

### Better together

Microsoft Purview and Profisee MDM work better together. When integrated, they streamline data management tasks and ensure that all systems work to enforce the same standards. Profisee MDM publishes its master data model to Microsoft Purview, where it can participate in governance. Microsoft Purview then shares the output of governance, such as a data catalog and glossary information. Profisee can review the output and enforce standards. By working jointly, Microsoft Purview and Profisee create a natural, better-together synergy that goes deeper than each independent offering.

For example, after you catalog enterprise data sources, you might determine that master data is present in multiple systems. Master data is the data that defines a domain entity. Examples of master data include customer, product, asset, location, vendor, patient, household, menu item, and ingredient data. Resolving differing definitions and matching and merging this data across systems is critical to the ability to use this data in a meaningful way.

To be effective, you should merge, validate, and correct master data in Profisee MDM by using governance definitions, insights, and expertise that are detailed in Microsoft Purview. In this way, Microsoft Purview and Profisee MDM form a foundation for governance and data management, and they maximize the business value of data in Azure. The alternative is to use whatever information you can get.
But when you take this approach, you risk generating misleading results that can damage your business. When you instead use high-quality master data, you eliminate common data quality issues. Then your system delivers sound insights that you can use to drive your business, no matter which tools you use for analysis, machine learning, and visualization. Well-curated master data is a key aspect of building a solid, reliable data foundation.

When you use Profisee MDM with Microsoft Purview, you experience the following benefits:

- **A common technical foundation.** Profisee originated in Microsoft technologies. Profisee and Microsoft use common tools, databases, and infrastructure, which makes the Profisee solution familiar to anyone who works with Microsoft technologies. In fact, for many years, Profisee MDM was built on Microsoft Master Data Services. Now Master Data Services is nearing the end of its lifecycle, and Profisee is the premier upgrade and replacement solution.
- **Developer collaboration and joint development.** Profisee and Microsoft Purview developers collaborate extensively to ensure a good, complementary fit between their respective solutions. This collaboration delivers a seamless integration that meets customer needs.
- **Joint sales and deployments.** Profisee has more MDM deployments on Azure, and jointly with Microsoft Purview, than any other MDM vendor. You can purchase Profisee through Azure Marketplace. In fiscal year 2023, Profisee is the only MDM vendor with a top-tier Microsoft partner certification that has an infrastructure as a service (IaaS), containers as a service (CaaS), or software as a service (SaaS) offering on Azure Marketplace.
- **Rapid and reliable deployment.** A critical feature of all enterprise software is rapid and reliable deployment. According to the Gartner Peer Insights platform, Profisee has more implementations that take fewer than 90 days to complete than any other MDM vendor.
- **Multiple domains.**
Profisee offers an approach to MDM that inherently uses multiple domains. There are no limitations to the number of master data domains that you can create. This design aligns well with customers who plan to modernize their data estates. A customer might start with a limited number of domains, but they ultimately benefit from maximizing their domain coverage across their whole data estate. This domain coverage is matched to their data governance coverage.

- **Engineering that's designed for Azure.** Profisee is engineered to be cloud native with options for SaaS and managed IaaS or CaaS deployments on Azure.

### Potential use cases

For a detailed list of MDM use cases of this solution, see MDM use cases later in this article. Key MDM use cases include the following retail and manufacturing examples:

- Consolidating customer data for analytics.
- Having a 360-degree view of product data in a consistent and accessible form, such as each product's name, description, and characteristics.
- Establishing reference data to consistently augment descriptions of master data. For example, reference data includes lists of countries/regions, currencies, colors, sizes, and units of measure.

These MDM solutions also help financial organizations that rely heavily on data for critical activities, such as timely reporting.

## MDM integration with Microsoft Purview

The following diagram illustrates in detail the integration of Profisee MDM in Microsoft Purview. To support this integration, the Profisee governance subsystem provides bidirectional integration with Microsoft Purview, which consists of two distinct flows:

- **Solution metadata publishing** occurs when your data modelers make changes to your master data model, matching strategies, and their related subartifacts. These changes are seamlessly published to Microsoft Purview as they occur. Publishing these changes syncs the metadata that's related to your master data model and solution.
As a result, the Microsoft Purview Data Catalog is further populated, and Microsoft Purview has a record of this critical data source.

- **Governance details are returned** and provided to data stewards and business users. These details are available as the users view data, enrich data, and remediate data quality issues by using the Profisee FastApp portal.

:::image type="content" source="_images/profisee-microsoft-purview-interation-detail.png" alt-text="Diagram that shows how Profisee MDM integrates with Microsoft Purview to ingest, model, and govern data." border="false":::

### Microsoft Purview integration capabilities

The Microsoft Purview catalog and glossary can help you maximize integration.

### Master data model design

One of the challenges of preparing an MDM solution is determining what constitutes master data and which data sources to use when you populate your master data model. You can use Microsoft Purview to help with this effort. You can take advantage of the ability to scan your critical data sources, and you can engage your data stewards and subject matter experts (SMEs). This way, you can enrich your Microsoft Purview Data Catalog with information that your stewards can then access, to better align your master data model with your line-of-business systems. You can reconcile conflicting terminology. This process yields a master data model that optimally reflects the terminology and definitions that you want to standardize for your business. It also avoids outdated and misleading verbiage.

The following excerpt from the broader diagram illustrates this integration use case. First, you use Microsoft Purview system scanning functions to ingest metadata from your line-of-business systems. Next, your data stewards and SMEs prepare a solid catalog and contacts. Then the data modelers who work with Profisee MDM modeling services can prepare and evolve your master data model. This work aligns with the standards that you define in Microsoft Purview.
:::image type="content" source="_images/integration-use-case.png" alt-text="Diagram that shows a use case of Profisee MDM integrating with Microsoft Purview to ingest, model, and govern data." border="false":::

As your data stewards evolve the model, the modeling services within the Profisee MDM platform publish changes that Profisee MDM governance services receive. In turn, Profisee MDM prepares and forwards those changes to Microsoft Purview for inclusion in its updated data catalog. These additions to the catalog ensure that your master data definitions are included in the broader data estate and that they can be governed and controlled in the same manner as your line-of-business system metadata. By ensuring that this information is cataloged together, you're in a better position to manage the relationships between your master data and your line-of-business system data.

### Data stewardship

Large enterprises that have correspondingly complex and expansive data estates can present challenges to data stewards, who are responsible for managing and remediating issues as they arise. Key data domains can be complex, with many obscure attributes that only tenured employees who have significant institutional knowledge understand.

Through the Profisee MDM integration with Microsoft Purview, you can capture this institutional knowledge within Microsoft Purview and make it available for use within Profisee MDM. As a result, you alleviate a great need for corporate data knowledge when you manage critical and time-sensitive information.

The following figure illustrates the flow of information from Microsoft Purview to the data stewards who work in the Profisee FastApp portal. The governance data service integrates with Microsoft Purview and Microsoft Entra ID. This service provides lookup functionality. FastApp portal users can use this functionality to retrieve enriched governance data about the entities and the attributes that they work with.
:::image type="content" source="_images/microsoft-purview-data-flow-profisee-portal.png" alt-text="Diagram that shows how data stewards use the Profisee portal to work with data that Microsoft Purview and Profisee MDM manage." border="false":::

Governance services also resolve contacts that are received from Microsoft Purview to their full profile details, which are available in Microsoft Entra ID. With complete profile details, stewards can effectively collaborate with data owners and experts as they work to enhance the quality of your master data.

The Profisee MDM Governance dialog is the interface through which data stewards and users interact with governance-level details. This UI renders information that's obtained from Microsoft Purview to users. By using this information, users can review the details behind the data from which the dialog was launched. If the information that's provided in the Governance dialog is insufficient, users can go directly to the Microsoft Purview user experience.

Data stewards and business users can access three Profisee MDM data asset types via the FastApp portal:

- **Profisee Instance**, which provides the infrastructure properties of the specific instance of the Profisee MDM platform that the user is viewing
- **Profisee Entity**, which provides the properties of the master data entity (the table) that the steward or user is currently viewing
- **Profisee Attribute**, which provides the properties of the attribute (such as the field or column) in which the user is interested

The following figure illustrates where users who are working in the FastApp portal can view governance details for each of these asset types. You can find instance-level details on the Help menu. You can access entity details from the page zone header, which contains an entity grid. For attribute details, go to the form that's associated with the entity grid. Access the details from the labels that are associated with the attribute.
:::image type="content" source="_images/example-portal-view-governance.png" alt-text="Screenshot of the Profisee portal. Information about customers is visible. On the Help menu, Governance instance is highlighted." lightbox="_images/example-portal-view-governance.png":::

To see summary information, hover over the governance icon, such as Microsoft Purview. Select the icon to display the full governance dialog:

:::image type="content" source="_images/governance-summary-view.png" alt-text="Screenshot of the Profisee portal. On the Customer page, a dialog provides detailed information about the date of birth attribute." lightbox="_images/governance-summary-view.png":::

To go to the full Microsoft Purview user experience, select the governance icon in the dialog header. Selecting the icon takes you to Microsoft Purview in the context of the asset that you're currently viewing. You then can easily move around in Microsoft Purview based on your discovery needs.

## MDM processing

The power of an MDM solution is in the details.

### Data modeling

The heart of your MDM solution is the underlying data model. It represents the definition of master data within your company. Developing a master data model involves the following tasks:

- Identify elements of source data from across your systems landscape that are critical to your company's operations and central to analyzing performance.
- Enrich the model with elements that you obtain from other third-party sources that render the data more useful, accurate, and trustworthy.
- Establish clear ownership and permissions related to the elements of your data model. This practice helps ensure that you factor visibility and change management into your model's design.

Data governance provides a critical foundation of support: Your governance data catalog, dictionary, glossary, and supporting resources are invaluable sources of information to your governance data stewards.
- These resources help stewards determine what to include in your master data model. They also help determine ownership and sensitive data classifications in Microsoft Purview.
- You can reinforce terminology in your model. Through this practice, you can establish an official lexicon for your business. By integrating terminology, your master data model can also translate any esoteric terms that are in use in various source systems to the approved language of the business.
- Third-party systems are often a source of master data that's separate and apart from your line-of-business systems. It's critical to add elements to your model to capture the information that these systems add to your data, and to reflect these sources of information back into your data catalog.
- You can use ownership and data access, as identified in your governance catalog, to enforce access and change management permissions within your MDM solution. As a result, you align your corporate policies and needs with the tools that you use to manage and steward your master data.

### Source data load

Ideally, your disparate line-of-business systems load data into your master data model with little or no change or transformation. The goal is to have a centralized version of the data as it exists in the source system. There should be as little loss of fidelity as possible between the source system and your master data repository.

By limiting the complexity of your loading process, you make lineage simpler. And when you use technology like Data Factory pipelines, your governance solution can inspect the flow. Then your solution can identify the relationships between your source system and your master data model. Specifically, your solution can extract data from source systems by using any of more than 100 pre-built connectors and a REST gateway.

### Data enrichment and standardization

After you load source data into your model, you can extend it by tapping into rich sources of third-party data.
You can use these systems to improve the data that you obtain from your line-of-business systems. You can also use these systems to augment the source data with information that enhances its use for other downstream consumers. For example:

- You can use address-verification services like Bing to correct and improve source system addresses. These services can standardize and add missing information that's crucial to geo-location and mail delivery.
- Third-party information services like Dun & Bradstreet can provide general-purpose or industry-specific data. You can use this data to extend the value of your golden master record. Specifically, you might add information that was unavailable or in conflict in your disparate line-of-business systems.

Profisee's publish/subscribe infrastructure makes it easy to integrate your own third-party sources into your solution as needed.

The ability to understand the sources and meaning behind this data is as critical for third-party data as it is for your internal line-of-business systems. By integrating your master data model into your governance data catalog, you can manage the relationships between internal and external sources of data while enriching your model with governance details.

### Data quality validation and stewardship

After you load and enrich your data, it's important to check it for quality and adherence to standards that you establish through your governance processes. Microsoft Purview can again be a rich source of standards information. You can use Microsoft Purview to drive the data quality rules that your MDM solution enforces.

Profisee MDM can also publish data quality rules as assets to your governance catalog. The rules can be subject to review and approval, which helps you provide top-down oversight of the quality standards that are associated with your master data. Your rules are tied to master data entities and attributes, and those attributes can be traced back to the source system.
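To make attribute-level quality rules concrete, here's a minimal, illustrative sketch in Python. The rule names, fields, and validation logic are hypothetical, not Profisee's actual rule engine; they just show the shape of validating a record against per-attribute rules.

```python
import re

# Hypothetical per-attribute quality rules (real MDM rules are richer and
# are typically authored and governed in the MDM/governance tooling).
RULES = {
    "postal_code": lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
    "email": lambda v: v is not None and "@" in v,
}

def validate(record):
    """Return the list of fields that fail their quality rule."""
    return [f for f, rule in RULES.items() if f in record and not rule(record[f])]

good = {"email": "ana@contoso.com", "postal_code": "30301"}
bad = {"email": "no-at-sign", "postal_code": "303"}
```

Because each failure is tied to a named attribute, a steward can trace a failing attribute back to the source system that supplied it.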
For these reasons, you can establish the root cause of poor data quality that originates from your line-of-business systems.

Data stewards are experts in their business domain. As stewards address issues that your master data solution reveals, they can use the Microsoft Purview data governance catalog. The catalog helps stewards understand and resolve quality issues as they arise. Backed by the support of data owners and experts, the stewards are prepared to address data quality issues quickly and accurately.

### Matching and survivorship

With enriched, high-quality source data, you're positioned to produce a golden record master that represents the most accurate information across your disparate line-of-business systems. The following figure illustrates how all the steps culminate in high-quality data that's ready to use for business analysis. At any time, you can sync this data across your data estate.

:::image type="content" source="_images/microsoft-purview-microservice-design-maching.png" alt-text="Diagram that shows how survivorship and data lineage factor into a golden record and how data is enriched." lightbox="_images/microsoft-purview-microservice-design-maching.png" border="false":::

The Profisee MDM matching engine produces a golden record master as part of the survivorship process. Survivorship rules selectively populate the golden record with information that you've chosen across all your source systems. The Profisee MDM history and audit tracking subsystem tracks changes that users make. This subsystem also tracks changes that system processes like survivorship make.

Matching and survivorship make it possible to trace the flow of information from your source records to the master. Profisee MDM has a record of the source system that's responsible for a specific source record. You also know how disparate source records populate the golden record.
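The survivorship-plus-lineage idea can be sketched in a few lines of Python. This is an illustrative simplification, not Profisee's engine: the source names and "most trusted source wins" ranking are assumptions, and real survivorship rules can differ per attribute.

```python
# Lower rank = more trusted. Source system names are hypothetical.
SOURCE_RANK = {"CRM": 0, "ERP": 1, "Ecommerce": 2}

def build_golden_record(source_records):
    """Populate each golden-record field from the most trusted source that
    supplies a non-empty value, and record which source won (lineage)."""
    ordered = sorted(source_records, key=lambda r: SOURCE_RANK[r["source"]])
    golden, lineage = {}, {}
    for rec in ordered:
        for field, value in rec.items():
            if field == "source" or value in (None, ""):
                continue
            if field not in golden:  # first (most trusted) non-empty value survives
                golden[field] = value
                lineage[field] = rec["source"]
    return golden, lineage

records = [
    {"source": "ERP", "name": "Acme Corp.", "phone": "555-0100"},
    {"source": "CRM", "name": "Acme Corporation", "phone": None},
]
golden, lineage = build_golden_record(records)
```

Here the golden record takes the name from CRM and the phone from ERP, and the `lineage` map is what lets you trace each mastered value back to the system that supplied it.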
As a result, you can achieve data lineage from your analytics back to the source data that your reports reference.

## MDM use cases

Although there are numerous use cases for MDM, a few use cases cover most real-world MDM implementations. These use cases focus on a single domain, but they're unlikely to be built from only that domain. Even these focused use cases most likely involve multiple domains. In each use case, MDM meets the goal of providing a 360-degree, or unified, view of essential data types.

### Customer data

Consolidating and standardizing customer data for BI analytics is the most common MDM use case. Organizations capture customer data across an increasing number of systems and applications. Duplicate customer data records result. These duplicates are located in and across applications, and they contain inconsistencies and discrepancies. The poor quality of the customer data limits the value of modern analytics solutions. Symptoms include the following challenges:

- It's hard to answer basic business questions like, "Who are our top customers?" and "How many new customers do we have?" Answering these questions requires significant manual effort.
- You have missing and inaccurate customer information, which makes it difficult to roll up or drill down into the data.
- You're unable to uniquely identify or verify a customer across organizational and system boundaries. As a result, you're unable to analyze your customer data across systems or business units.
- You have poor-quality insights from AI and machine learning due to the poor-quality input data.

### Product data

Product data is often spread across multiple enterprise applications, such as enterprise resource planning (ERP), product lifecycle management (PLM), or e-commerce applications. As a result, it's challenging to understand the total catalog of products that have inconsistent definitions for properties, such as the product name, description, and characteristics.
Different definitions of reference data complicate this situation. Symptoms include the following challenges:

- You're unable to support different alternative hierarchical roll-up and drill-down paths for product analytics.
- With finished goods or material inventory, you have difficulty evaluating product inventory and established vendors. You also have duplicate products, which leads to excess inventory.
- It's hard to rationalize products due to conflicting definitions. This situation leads to missing or inaccurate information in analytics.

### Reference data

In the context of analytics, reference data exists as numerous lists of data. These lists are often used to further describe other sets of master data. For example, reference data includes lists of countries/regions, currencies, colors, sizes, and units of measure. Inconsistent reference data leads to obvious errors in downstream analytics. Symptoms include:

- Multiple representations of the same value. For example, the state of Georgia is listed as GA and Georgia, which makes it difficult to consistently aggregate and drill down into data.
- Difficulty streamlining data across systems due to an inability to crosswalk, or map, reference data values between systems. For example, the color red is represented by R in the ERP system and Red in the PLM system.
- Difficulty tying numbers across organizations due to differences in established reference data values that are used for data categorization.

### Financial data

Financial organizations rely heavily on data for critical activities, such as monthly, quarterly, and annual reporting. Organizations that have multiple finance and accounting systems often have financial data across multiple general ledgers that needs to be consolidated to produce financial reports. MDM can provide a centralized hub to map and manage accounts, cost centers, business entities, and other financial datasets. Through the centralized hub, MDM provides a consolidated view of these datasets.
Symptoms include the following challenges:

- Difficulty aggregating financial data across multiple systems into a consolidated view
- Lack of process for adding and mapping new data elements in financial systems
- Delays in producing end-of-period financial reports

## Considerations

These considerations implement the pillars of the Azure Well-Architected Framework, which is a set of guiding tenets that you can use to improve the quality of a workload. For more information, see Microsoft Azure Well-Architected Framework. Consider these factors when you choose a data management solution for your organization.

### Reliability

Reliability ensures that your application can meet the commitments that you make to your customers. For more information, see Overview of the reliability pillar.

Profisee runs natively on Azure Kubernetes Service (AKS) and Azure SQL Database. Both services offer out-of-the-box capabilities to support high availability.

### Security

Security provides assurances against deliberate attacks and the abuse of your valuable data and systems. For more information, see Overview of the security pillar.

Profisee authenticates users by using OpenID Connect, which implements an Open Authorization (OAuth) 2.0 authentication flow. Most organizations configure Profisee MDM to authenticate users against Microsoft Entra ID, which ensures that you can apply and enforce your enterprise policies for authentication.

### Cost optimization

Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies. For more information, see Overview of the cost optimization pillar.

Running costs consist of a software license and Azure consumption. For more information, contact Profisee.

### Performance efficiency

Performance efficiency is the ability of your workload to scale to meet the demands placed on it by users in an efficient manner. For more information, see Performance efficiency pillar overview.

Profisee MDM runs natively on AKS and SQL Database.
You can configure AKS to scale Profisee MDM up, down, and across your business functions. You can deploy SQL Database in numerous configurations to balance performance, scalability, and costs.

Dynamic scaling is inherent in the cloud-native architecture of Profisee, which uses microservices and containers. If you run Profisee in your cloud tenant via Kubernetes, you can dynamically scale up and out based on your load. With the Profisee SaaS service that runs on AKS, you can configure large node pools for your pods. These pools scale dynamically based on the load on the system across the multitenant infrastructure.

For detailed information about how to deploy Profisee and Microsoft Purview on AKS, see Microsoft Purview - Profisee MDM integration.

## Deploy this scenario

Profisee MDM is a packaged Kubernetes service. You can deploy Profisee MDM as a platform as a service (PaaS) in your Azure tenant, in any other cloud tenant, or on-premises. You can also deploy Profisee MDM as a SaaS offering that Profisee hosts and manages.

## Contributors

This article is maintained by Microsoft. It was originally written by the following contributor.

Principal author:

- Gaurav Malhotra | Principal Group PM Manager

To see non-public LinkedIn profiles, sign in to LinkedIn.

## Next steps

- Understand the capabilities of the REST copy connector in Data Factory.
- Learn more about Profisee running natively in Azure.
- Learn how to deploy Profisee to Azure by using an Azure Resource Manager template (ARM template).
- View Profisee Data Factory templates.

## Related resources

Architecture guides:

- Extract, transform, load (ETL)
- Choose a batch processing technology in Azure

Reference architectures:

- Master data management with Profisee and Azure Data Factory
- Analytics end-to-end with Azure Synapse
- Modern analytics architecture with Azure Databricks
- Big data analytics with enterprise-grade security using Azure Synapse
- Automated enterprise BI
- Optimize marketing with machine learning
- Enterprise business intelligence
The architecture described in this article demonstrates how you can use Teradata VantageCloud Enterprise together with Azure Data Factory to develop data integration pipelines with a low-code or no-code approach. It shows how to quickly ingest or extract Vantage data over an enhanced-security connection by using Data Factory.

Apache®, Hadoop, and the flame logo are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries. No endorsement by the Apache Software Foundation is implied by the use of these marks.

## Architecture

The following diagram illustrates a version of the architecture that uses virtual network peering connectivity. It uses a self-hosted integration runtime (IR) to connect to the analytics database. Teradata's VMs are deployed with only private IP addresses.

:::image type="content" source="_images/teradata-azure-data-factory-vnet-peering.png" alt-text="Diagram that shows a version of the architecture that uses virtual network peering connectivity." lightbox="_images/teradata-azure-data-factory-vnet-peering.png" border="false":::

Download a Visio file of this architecture.

The following diagram illustrates a version of the architecture that uses Azure Private Link connectivity.

:::image type="content" source="_images/teradata-vantage-azure-data-factory-private-link.png" alt-text="Diagram that shows a version of the architecture that uses Private Link connectivity." lightbox="_images/teradata-vantage-azure-data-factory-private-link.png" border="false":::

Download a Visio file of this architecture.

VantageCloud Enterprise on Azure is a fully managed service that's deployed in a Teradata-owned Azure subscription. You deploy cloud services in your own Azure subscription, which is then connected to the Teradata-managed subscription via one of the approved connectivity options.
Teradata supports the following types of connectivity between your Azure subscription and VantageCloud Enterprise on Azure:

- Virtual network peering
- Private Link
- Azure Virtual WAN

If you plan to use virtual network peering, work with Teradata support or your Teradata account team to ensure that the required security group settings are in place to initiate traffic from the self-hosted IR to the database via the virtual network peering link.

### Components

To implement this architecture, you need to be familiar with Data Factory, Azure Blob Storage, Teradata VantageCloud Enterprise, and Teradata Tools and Utilities (TTU). These components and versions are used in the integration scenarios:

- Teradata VantageCloud Enterprise 17.20, hosted on Azure
- Azure Data Factory
- Azure Blob Storage
- TTU 17.20
- Teradata ODBC Driver 17.20.12
- Teradata Studio 17.20

#### Teradata Vantage

Vantage provides what Teradata calls Pervasive Data Intelligence. Users across your organization can use it to get real-time, intelligent answers to their questions. In this architecture, Vantage on Azure is used as a source or destination for data integration tasks. Vantage Native Object Storage (NOS) is used to integrate with data in Blob Storage.

#### Data Factory

Data Factory is a serverless cloud extract, transform, load (ETL) service. You can use it to orchestrate and automate data movement and transformation. It provides a code-free user interface for data ingestion, intuitive authoring, and single-pane-of-glass monitoring and management. You can use Data Factory to create and schedule data-driven workflows (called pipelines) that can ingest data from various data stores. You can create complex ETL processes that visually transform data by using dataflows that run on Spark or compute services like Azure Batch, Azure Machine Learning, Apache Spark, SQL, Azure HDInsight with Hadoop, and Azure Databricks.
Working with Data Factory involves the following layers, listed from the highest level of abstraction to the software that's closest to the data:

- Pipelines are graphical interfaces that contain activities and data paths.
- Activities perform operations on data.
- Sources and sinks are activities that specify where data comes from and where it goes.
- Datasets are well-defined sets of data that Data Factory ingests, loads, and transforms.
- Linked services enable Data Factory to access connection information for specific external data sources.
- The integration runtime (IR) provides a gateway between Data Factory and data or compute resources.

#### Self-hosted IR

The self-hosted IR can perform copy operations between cloud data stores and private-network data stores. It can also dispatch transformation activities to compute resources in an on-premises network or an Azure virtual network. You need a local computer or virtual machine on your private network to install the self-hosted IR. For more information, see Considerations for using a self-hosted IR. This article describes how to use the self-hosted IR to connect to VantageCloud and extract data to load into Azure Data Lake Storage.

#### Teradata connector

In this architecture, Data Factory uses the Teradata connector to connect to Vantage. The Teradata connector supports:

- Teradata versions 14.10, 15.0, 15.10, 16.0, 16.10, and 16.20.
- Copying data by using basic, Windows, or LDAP authentication.
- Parallel copying from a Teradata source. For more information, see Parallel copy from Teradata.

This article describes how to set up linked services and datasets for the Data Factory Copy Data activity, which ingests data from Vantage and loads it into Data Lake Storage.
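The way these layers reference one another can be sketched with simplified resource definitions. The snippet below mirrors the rough shape of Data Factory's JSON resource model in Python dictionaries; the resource names are placeholders, and the real schemas (consult the Data Factory documentation) have many more required properties.

```python
# A pipeline activity references datasets, and each dataset references the
# linked service that holds connection information. Names are hypothetical.
linked_service = {
    "name": "TeradataLinkedService",
    "properties": {
        "type": "Teradata",
        "connectVia": {"referenceName": "SelfHostedIR",
                       "type": "IntegrationRuntimeReference"},
    },
}

dataset = {
    "name": "GreenTaxiTripData",
    "properties": {
        "type": "TeradataTable",
        "linkedServiceName": {"referenceName": "TeradataLinkedService",
                              "type": "LinkedServiceReference"},
    },
}

pipeline = {
    "name": "CopyVantageToBlob",
    "properties": {
        "activities": [{"name": "CopyFromVantage", "type": "Copy",
                        "inputs": [{"referenceName": "GreenTaxiTripData",
                                    "type": "DatasetReference"}]}],
    },
}

def referenced_linked_service(pipeline, datasets):
    """Walk pipeline -> dataset -> linked service, mirroring the layer hierarchy."""
    ds_name = pipeline["properties"]["activities"][0]["inputs"][0]["referenceName"]
    return datasets[ds_name]["properties"]["linkedServiceName"]["referenceName"]
```

Walking from the pipeline's Copy activity down to its linked service resolves to `TeradataLinkedService`, which is the same chain of references you build in the UI steps that follow.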
## Scenario details

This article describes three scenarios:

- Data Factory pulling data from VantageCloud Enterprise and loading it into Blob Storage
- Data Factory loading data into VantageCloud Enterprise from Blob Storage
- Using Vantage NOS functionality to access data transformed and loaded into Blob Storage by Data Factory

### Scenario 1: Load data into Blob Storage from VantageCloud

This scenario describes how to use Data Factory to extract data from VantageCloud Enterprise, perform some basic transformations, and then load the data into a Blob Storage container. The scenario highlights the native integration between Data Factory and Vantage and how easily you can build an enterprise ETL pipeline to integrate data in Vantage. To complete this procedure, you need to have a Blob Storage container in your subscription, as shown in the architecture diagrams.

To create a native connector to Vantage, in your data factory, select the Manage tab, select Linked services, and then select New:

:::image type="content" source="_images/create-linked-service.png" alt-text="Screenshot that shows the New button in Linked services." lightbox="_images/create-linked-service.png":::

Search for Teradata and then select the Teradata connector. Then select Continue:

:::image type="content" source="_images/teradata-linked-service.png" alt-text="Screenshot that shows the Teradata connector." lightbox="_images/teradata-linked-service.png":::

Configure the linked service to connect to your Vantage database. This procedure shows how to use a basic authentication mechanism with a user ID and password. Alternatively, depending on your security needs, you can choose a different authentication mechanism and set other parameters accordingly. For more information, see Teradata connector linked service properties.

You'll use a self-hosted IR. For more information, see the instructions for deploying a self-hosted IR. Deploy it in the same virtual network as your data factory.
Use the following values to configure the linked service:

- Name: Enter a name for your linked service connection.
- Connect via integration runtime: Select SelfHostedIR.
- Server name: If you're connecting via virtual network peering, provide the IP address of a VM in the Teradata cluster. You can connect to the IP address of any VM in the cluster. If you're connecting via Private Link, provide the IP address of the private endpoint that you created in your virtual network to connect to the Teradata cluster via Private Link.
- Authentication type: Choose an authentication type. This procedure shows how to use basic authentication.
- User name and Password: Provide the credentials.

Select Test connection, and then select Create. Be sure that interactive authoring is enabled for your IR so that the test connection functionality works.

:::image type="content" source="_images/teradata-linked-service-configuration.png" alt-text="Screenshot that shows the configuration for the Teradata connector." lightbox="_images/teradata-linked-service-configuration.png":::

For testing, you can use a test database in Vantage that's called NYCTaxiADFIntegration. This database has a single table named Green_Taxi_Trip_Data. You can download the data from NYC OpenData. The following CREATE TABLE statement can help you understand the schema of the table.
```sql
CREATE MULTISET TABLE NYCTaxiADFIntegration.Green_Taxi_Trip_Data,
    FALLBACK,
    NO BEFORE JOURNAL,
    NO AFTER JOURNAL,
    CHECKSUM = DEFAULT,
    DEFAULT MERGEBLOCKRATIO,
    MAP = TD_MAP1
    (
      VendorID BYTEINT,
      lpep_pickup_datetime DATE FORMAT 'YY/MM/DD',
      lpep_dropoff_datetime DATE FORMAT 'YY/MM/DD',
      store_and_fwd_flag VARCHAR(1) CHARACTER SET LATIN CASESPECIFIC,
      RatecodeID BYTEINT,
      PULocationID SMALLINT,
      DOLocationID SMALLINT,
      passenger_count BYTEINT,
      trip_distance FLOAT,
      fare_amount FLOAT,
      extra DECIMAL(18,16),
      mta_tax DECIMAL(4,2),
      tip_amount FLOAT,
      tolls_amount DECIMAL(18,16),
      ehail_fee BYTEINT,
      improvement_surcharge DECIMAL(3,1),
      total_amount DECIMAL(21,17),
      payment_type BYTEINT,
      trip_type BYTEINT,
      congestion_surcharge DECIMAL(4,2)
    )
NO PRIMARY INDEX;
```

Next, you create a simple pipeline to copy the data from the table, perform some basic transformation, and then load the data into a Blob Storage container. As noted at the start of this procedure, you should have already created the Blob Storage container in your subscription.

First, create a linked service to connect to the container, which is the sink that you'll copy the data into. Select the Manage tab in your data factory, select Linked services, and then select New:

:::image type="content" source="_images/new-linked-service.png" alt-text="Screenshot that shows the New button." lightbox="_images/new-linked-service.png":::

Search for Azure Blob, select the Azure Blob Storage connector, and then select Continue:

:::image type="content" source="_images/blob-stroage-connector.png" alt-text="Screenshot that shows the Blob Storage linked service." lightbox="_images/blob-stroage-connector.png":::

Configure the linked service to connect to the Blob Storage account:

- Name: Enter a name for your linked service connection.
- Connect via integration runtime: Select AutoResolveIntegrationRuntime.
- Authentication type: Select Account key.
- Azure subscription: Enter your Azure subscription ID.
- Storage account name: Enter your Azure storage account name.

Select Test connection to verify the connection, and then select Create.

:::image type="content" source="_images/blob-storage-connector-configuration.png" alt-text="Screenshot that shows the configuration of the Blob Storage linked service." lightbox="_images/blob-storage-connector-configuration.png":::

Create a Data Factory pipeline:

1. Select the Author tab.
1. Select the + button.
1. Select Pipeline.
1. Enter a name for the pipeline.

:::image type="content" source="_images/azure-data-factory-pipeline.png" alt-text="Screenshot that shows the steps for creating a pipeline." lightbox="_images/azure-data-factory-pipeline.png":::

Create two datasets:

1. Select the Author tab.
1. Select the + button.
1. Select Dataset.
1. Create a dataset for the Green_Taxi_Trip_Data Teradata table:
   - Select Teradata as the Data Store.
   - Name: Enter a name for the dataset.
   - Linked service: Select the linked service that you created for Teradata in steps 2 and 3.
   - Table name: Select the table from the list.
   - Select OK.

   :::image type="content" source="_images/teradata-datasets.png" alt-text="Screenshot that shows the properties for the Teradata table." lightbox="_images/teradata-datasets.png":::

1. Create an Azure Blob dataset:
   - Select Azure Blob as the Data Store.
   - Select the format of your data. Parquet is used in this demonstration.
   - Linked service: Select the linked service that you created in step 6.
   - File path: Enter the file path of the blob file.
   - Import schema: Select None.
   - Select OK.

   :::image type="content" source="_images/azure-blob-dataset.png" alt-text="Screenshot that shows the properties for the Azure Blob Storage dataset." lightbox="_images/azure-blob-dataset.png":::

Drag a Copy Data activity onto the pipeline.

> [!NOTE]
> The Teradata connector doesn't currently support the Data Flow activity in Data Factory. If you want to perform transformation on the data, we recommend that you add a Data Flow activity after the Copy activity.
Configure the Copy Data activity:

- On the Source tab, under Source dataset, select the Teradata table dataset that you created in the previous step. For Use query, select Table. Use the default values for the other options.

  :::image type="content" source="_images/copy-data-activity-source.png" alt-text="Screenshot that shows the steps for creating a copy data activity." lightbox="_images/copy-data-activity-source.png":::

- On the Sink tab, under Sink dataset, select the Azure Blob dataset that you created in the previous step. Use the default values for the other options.

  :::image type="content" source="_images/copy-data-activity-sink.png" alt-text="Screenshot that shows the configuration for the sink dataset." lightbox="_images/copy-data-activity-sink.png":::

Select Debug. The pipeline copies the data from the Teradata table to a Parquet file in Blob Storage.

### Scenario 2: Load data into VantageCloud from Blob Storage

This scenario describes how to use an ODBC connector to connect to Vantage via the self-hosted IR VM to load data. Because the IR needs to be installed and configured with the Teradata ODBC driver, this option works only with a Data Factory self-hosted IR.

You can also use TTU, Data Factory custom activities, and Azure Batch to load data into Vantage and transform it. For more information, see Connect Teradata Vantage to Azure Data Factory Using Custom Activity Feature. We recommend that you evaluate both options for performance, cost, and management considerations and choose the option that's best suited to your requirements.

Start by preparing the self-hosted IR that you created in the previous scenario. You need to install the Teradata ODBC driver on it. This scenario uses a Windows 11 VM for the self-hosted IR.

1. Use RDP to connect to the VM.
1. Download and install the Teradata ODBC driver.
1. If the Java JRE isn't already on the VM, download and install it.
1. Create a 64-bit system DSN for the Teradata database by adding an ODBC data source.
Be sure to use the 64-bit DSN window. Select the Teradata Database ODBC Driver, as shown in the following screenshot. Select Finish to open the driver setup window.

:::image type="content" source="_images/teradata-odbc-driver.png" alt-text="Screenshot that shows the steps for creating a data source.":::

Configure the DSN properties:

- Name: Enter a name for the DSN.
- Under Teradata Server Info, in Name or IP address: If you're connecting via virtual network peering, provide the IP address of a VM in the Teradata cluster. You can connect to the IP address of any VM in the cluster. If you're connecting via Private Link, provide the IP address of the private endpoint that you created in your virtual network to connect to the Teradata cluster via Private Link.
- Optionally, provide the Username and select Test. You're prompted to enter the credentials. Select OK and ensure that the connection succeeds. Note that you'll provide the user name and password in Data Factory when you create the ODBC linked service that's used to connect to the Teradata database from Data Factory.
- Leave the other fields blank. Select OK.

:::image type="content" source="_images/odbc-driver-configuration.png" alt-text="Screenshot that shows the configuration for the driver." lightbox="_images/odbc-driver-configuration.png":::

The ODBC Data Source Administrator window will look like the one in the following screenshot. Select Apply. You can now close the window. Your self-hosted IR is now ready to connect to Vantage by using ODBC.

:::image type="content" source="_images/odbc-driver-configuration-2.png" alt-text="Screenshot that shows the ODBC Data Source Administrator window." lightbox="_images/odbc-driver-configuration-2.png":::

In Data Factory, create a linked service connection. Choose ODBC as the data store:

:::image type="content" source="_images/odbc-linked-service.png" alt-text="Screenshot that shows the ODBC linked service." lightbox="_images/odbc-linked-service.png":::

Configure the linked service with the IR that you configured in the previous steps:

- Name: Provide a name for the linked service.
- Connect via integration runtime: Select SelfhostedIR.
- Connection string: Enter the DSN connection string with the name of the DSN that you created in the previous steps.
- Authentication type: Select Basic. Enter the user name and password for your Teradata ODBC connection.

Select Test connection, and then select Create.

:::image type="content" source="_images/teradata-linked-service-configuration-2.png" alt-text="Screenshot that shows the configurations for the linked service." lightbox="_images/teradata-linked-service-configuration-2.png":::

Complete the following steps to create a dataset with ODBC as the data store. Use the linked service that you created earlier.

1. Select the Author tab.
1. Select the + button.
1. Select Dataset.
1. Create a dataset for the Green_Taxi_Trip_DataIn Teradata table:
   - Select ODBC as the data store, and then select Continue.
   - Name: Provide a name for the dataset.
   - Linked service: Select the ODBC linked service that you created in the previous steps.
   - Table name: Select the table from the list.
   - Select OK.

> [!TIP]
> When you load the data, use a staging table with generic data types to avoid data-type mismatch errors. For example, instead of using the Decimal data type for columns, use Varchar. You can then perform data-type transformations in the Vantage database.

:::image type="content" source="_images/odbc-dataset.png" alt-text="Screenshot that shows the properties for the Teradata table." lightbox="_images/odbc-dataset.png":::

Create an Azure Blob connection to the source file that you want to load into Vantage by following steps 4 through 6 and step 8 in the first scenario. Note that you're creating this connection for the source file, so the path of the file will be different.

Create a pipeline that contains a Copy Data activity, as described in scenario 1.
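As a rough sketch, the DSN-based connection string used by the ODBC linked service can be assembled as follows. The DSN name is an assumption (it's whatever you entered in the ODBC Data Source Administrator), and in Data Factory you'd normally supply the credentials through the linked service's Basic authentication fields rather than embedding them in the string.

```python
def odbc_connection_string(dsn, uid=None, pwd=None):
    """Build a standard key=value;... ODBC connection string around a system DSN.

    `UID`/`PWD` are the conventional ODBC keywords for credentials; they're
    optional here because Data Factory can inject them separately.
    """
    parts = [f"DSN={dsn}"]
    if uid:
        parts.append(f"UID={uid}")
    if pwd:
        parts.append(f"PWD={pwd}")
    return ";".join(parts)
```

For the linked service's Connection string field, `DSN=<your-dsn-name>` alone is sufficient when the credentials come from the Basic authentication settings.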
Drag a Copy Data activity onto the pipeline.

> [!NOTE]
> The Teradata ODBC connector doesn't currently support the Data Flow activity in Data Factory. If you want to perform transformation on the data, we recommend that you create a Data Flow activity before the Copy Data activity.

Configure the Copy Data activity:

- On the Source tab, select the file dataset that you want to load into Teradata. Use the default values for the other options.

  :::image type="content" source="_images/copy-data-source.png" alt-text="Screenshot that shows the steps for creating a Copy Data activity." lightbox="_images/copy-data-source.png":::

- On the Sink tab, under Sink dataset, select the Teradata table dataset that you created through the ODBC connection. Use the default values for the other options.

  :::image type="content" source="_images/copy-data-sink.png" alt-text="Screenshot that shows the properties for the sink dataset." lightbox="_images/copy-data-sink.png":::

Select Debug. The pipeline copies the data from the Parquet file to Vantage.

### Scenario 3: Access data in Blob Storage from VantageCloud

This scenario describes how to use the Vantage Native Object Store (NOS) functionality to access data that's in Blob Storage. The previous scenario is ideal when you want to load data into Vantage on a continual or scheduled basis. This scenario describes how to access data in a one-off manner from Blob Storage, with or without loading the data into Vantage.

> [!NOTE]
> You can also use NOS to export data to Blob Storage.

You can use the following query to read, from Vantage, data that's been transformed and loaded into Blob Storage via Data Factory, without loading the data into Vantage. You can use Teradata SQL Editor to run queries. To access the data that's in the blob, you supply the storage account name and access key in the Access_ID and Access_Key fields. The query also returns a field called Location that specifies the path of the file that the record was read from.
```sql
SELECT * FROM (
  LOCATION='/AZ/yourstorageaccount.blob.core.windows.net/vantageadfdatain/NYCGreenTaxi/'
  AUTHORIZATION='{"ACCESS_ID":"yourstorageaccountname","ACCESS_KEY":"yourstorageaccesskey"}'
) AS GreenTaxiData;
```

:::image type="content" source="_images/query-blob.png" alt-text="Screenshot that shows a query for reading data." lightbox="_images/query-blob.png":::

Here's another example of querying data in place. It uses the READ_NOS table operator.

:::image type="content" source="_images/query-blob-2.png" alt-text="Screenshot that shows another example of querying data in place." lightbox="_images/query-blob-2.png":::

You can also query data in place or load data into a Vantage database by creating a foreign table in the object store. You first need to create an authorization object that uses the storage account name and access key in the USER and PASSWORD fields, respectively, as shown in the following syntax. You can use this object to create your foreign table so that you don't need to provide the keys when you create the table.

```sql
CREATE AUTHORIZATION DefAuth3 AS DEFINER TRUSTED
USER 'YOUR-STORAGE-ACCOUNT-NAME'
PASSWORD 'YOUR-ACCESS-KEY';
```

You can now create the foreign table to access the data. The following query creates the table for the Green Taxi data. It uses the authorization object.

[!Note] When you load the Parquet file, be sure to map the data types correctly. For help with matching the data types, you can use the READ_NOS command to preview the Parquet schema.
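The LOCATION and AUTHORIZATION string literals in the query above follow a fixed shape, which can be assembled programmatically. The Python helpers below are hypothetical; only the `/AZ/<account>.blob.core.windows.net/<container>/<path>` form and the ACCESS_ID/ACCESS_KEY JSON fields come from the query shown above:

```python
import json

# Hypothetical helpers that build the two string literals used in the
# NOS query above.
def nos_location(account, container, path):
    # NOS addresses Azure Blob Storage with an /AZ/ prefix.
    return f"/AZ/{account}.blob.core.windows.net/{container}/{path}"

def nos_authorization(access_id, access_key):
    # The AUTHORIZATION clause takes a JSON object with these two keys.
    return json.dumps({"ACCESS_ID": access_id, "ACCESS_KEY": access_key})

loc = nos_location("yourstorageaccount", "vantageadfdatain", "NYCGreenTaxi")
auth = nos_authorization("yourstorageaccountname", "yourstorageaccesskey")
```

Building the literals this way avoids hand-editing quotes and escaping when you script many queries against different containers.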
```sql
CREATE FOREIGN TABLE NYCTaxiADFIntegration.GreenTaxiForeignTable
, EXTERNAL SECURITY DEFINER TRUSTED DefAuth3
(
  VendorID INT,
  lpep_pickup_datetime TIMESTAMP,
  lpep_dropoff_datetime TIMESTAMP,
  store_and_fwd_flag VARCHAR(40) CHARACTER SET UNICODE CASESPECIFIC,
  RatecodeID INT,
  PULocationID INT,
  DOLocationID INT,
  passenger_count INT,
  trip_distance FLOAT,
  fare_amount FLOAT,
  extra DECIMAL(38,18),
  mta_tax DECIMAL(38,18),
  tip_amount FLOAT,
  tolls_amount DECIMAL(38,18),
  ehail_fee INT,
  improvement_surcharge DECIMAL(38,18),
  total_amount DECIMAL(38,18),
  payment_type INT,
  trip_type INT,
  congestion_surcharge DECIMAL(38,18)
)
USING (
  LOCATION('/AZ/adfvantagestorageaccount.blob.core.windows.net/vantageadfdatain/NYCGreenTaxi/')
  STOREDAS('PARQUET')
)
NO PRIMARY INDEX
, PARTITION BY COLUMN;
```

You can now query the data from the foreign table just as you can query any other table:

:::image type="content" source="_images/query-blob-3.png" alt-text="Screenshot that shows how to query the data from the foreign table." lightbox="_images/query-blob-3.png":::

You've seen how to query data in object storage in place. However, you might want to load the data permanently into a table in the database for better query performance. You can load data from Blob Storage into a permanent table by using the following statements. Some options might work only for certain data file formats. For details, see the Teradata documentation. For sample code, see Loading External Data into a Database.
| Method | Description |
|---|---|
| CREATE TABLE AS…WITH DATA | Accesses table definitions and data from an existing foreign table and creates a new permanent table in the database |
| CREATE TABLE AS…FROM READ_NOS | Accesses data directly from the object store and creates a permanent table in the database |
| INSERT SELECT | Stores values from external data in a persistent database table |

The following samples show how to create a permanent table from GreenTaxiData:

```sql
CREATE MULTISET TABLE NYCTaxiADFIntegration.GreenTaxiNosPermanent AS
(
  SELECT D.PULocationID AS PickupSite,
         SUM(fare_amount) AS TotalFarebyPickupLocation
  FROM NYCTaxiADFIntegration.GreenTaxiForeignTable AS D
  GROUP BY 1
) WITH DATA
NO PRIMARY INDEX;
```

```sql
INSERT INTO NYCTaxiADFIntegration.GreenTaxiNosPermanent
SELECT D.PULocationID AS PickupSite,
       SUM(fare_amount) AS TotalFarebyPickupLocation
FROM NYCTaxiADFIntegration.GreenTaxiForeignTable AS D
GROUP BY 1;
```

Best practices

- Follow the connector performance tips and best practices that are described in Teradata as source.
- Be sure the self-hosted IR is sized correctly for your volume of data. You might want to scale out the IR to get better performance. For more information, see this self-hosted IR performance guide.
- Use the Copy activity performance and scalability guide to fine-tune Data Factory pipelines for performance.
- Use the Data Factory Copy Data tool to quickly set up a pipeline and run it on a schedule.
- Consider using an Azure VM with a self-hosted IR to manage the cost of running pipelines. For example, if you run pipelines twice per day, you can start the VM before each run and shut it down afterward.
- Consider using CI/CD in Data Factory to implement Git-enabled continuous integration and development practices.
- Optimize your pipeline activity count. Unnecessary activities increase costs and make pipelines complex.
- Consider using mapping data flows to transform Blob Storage data visually with no-code and low-code processes to prepare Vantage data for uses like Power BI reporting.
In addition to using schedule triggers, consider using a mix of tumbling window and event triggers to load Vantage data into destination locations. Reduce unnecessary triggers to lower costs. Use Vantage NOS for ad hoc querying to easily supply data for upstream applications.

Contributors

This article is maintained by Microsoft. It was originally written by the following contributors.

Principal authors:

- Sunil Sabat | Principal Program Manager
- Divyesh Sah | Director, WW Cloud Architecture
- Jianlei Shen | Senior Program Manager

Other contributors:

- Mick Alberts | Technical Writer
- Emily Chen | Principal PM Manager
- Wee Hyong Tok | Partner Director PM
- Bunty Ranu | Senior Director, Worldwide Cloud Architecture

To see non-public LinkedIn profiles, sign in to LinkedIn.

Next steps

- Teradata Vantage on Azure
- Teradata Tools and Utilities 17.20
- Data Factory
- Azure virtual network peering
- Private Link service
- Data Factory Teradata connector
- Self-hosted IR
- Blob Storage documentation

Related resources

- Big data architectures
- Choose an analytical data store in Azure
[!INCLUDE header_file] This article presents a solution for automating data analysis and visualization using artificial intelligence (AI). Core components in the solution are Azure Functions, Azure Cognitive Services, and Azure Database for PostgreSQL.

Architecture

Download a Visio file of this architecture.

Dataflow

1. An Azure Function activity allows you to trigger an Azure Functions app in the Azure Data Factory pipeline. You create a linked service connection and use the linked service with an activity to specify the Azure Function that you want to execute.
2. Data comes from multiple sources, including Azure Storage and Azure Event Hubs for high-volume data. When the pipeline receives new data, it triggers the Azure Functions app.
3. The Azure Functions app calls the Cognitive Services API to analyze the data.
4. The Cognitive Services API returns the results of the analysis in JSON format to the Azure Functions app.
5. The Azure Functions app stores the data and the results from the Cognitive Services API in Azure Database for PostgreSQL.
6. Azure Machine Learning uses custom machine learning algorithms to provide further insights into the data. If you're approaching the machine learning step with a no-code perspective, you can implement further text analytics operations on the data, like feature hashing, Word2Vector, and n-gram extraction. If you prefer a code-first approach, you can run an open-source natural language processing (NLP) model as an experiment in Machine Learning studio.
7. The PostgreSQL connector for Power BI makes it possible to explore human-interpretable insights in Power BI or a custom web application.

Components

- Azure App Service provides a fully managed platform for quickly building, deploying, and scaling web apps and APIs.
- Functions is an event-driven serverless compute platform. For information about how to use an activity to run a function as part of a Data Factory pipeline, see Azure Function activity in Azure Data Factory.
- Event Hubs is a fully managed big data streaming platform.
- Cognitive Services provides a suite of AI services and APIs that you can use to build cognitive intelligence into apps.
- Azure Database for PostgreSQL is a fully managed relational database service. It provides high availability, elastic scaling, patching, and other management capabilities for PostgreSQL.
- Azure Machine Learning is a cloud service that you can use to train, deploy, and automate machine learning models. The studio supports code-first and no-code approaches.
- Power BI is a collection of software services and apps that display analytics information and help you derive insights from data.

Scenario details

The automated pipeline uses the following services to analyze the data:

- Cognitive Services uses AI for question answering, sentiment analysis, and text translation.
- Azure Machine Learning supplies machine learning tools for predictive analytics.

To store data and results, the solution uses Azure Database for PostgreSQL. The PostgreSQL database supports unstructured data, parallel queries, and declarative partitioning. This support makes Azure Database for PostgreSQL an effective choice for highly data-intensive AI and machine learning tasks.

The solution automates the delivery of the data analysis. A connector links Azure Database for PostgreSQL with visualization tools like Power BI.

The architecture uses an Azure Functions app to ingest data from multiple data sources. It's a serverless solution that offers the following benefits:

- Infrastructure maintenance: Azure Functions is a managed service that allows developers to focus on innovative work that delivers value to the business.
- Scalability: Azure Functions provides compute resources on demand, so function instances scale as needed. As requests fall, resources and application instances drop off automatically.

Potential use cases

Azure Database for PostgreSQL is a cloud-based solution.
As a result, this solution isn't recommended for mobile applications. It's more appropriate for downstream analysis in the following industries and others:

- Transportation: Maintenance prediction
- Finance: Risk assessment and fraud detection
- E-commerce: Customer churn prediction and recommendation engines
- Telecommunications: Performance optimization
- Utilities: Outage prevention

Considerations

These considerations implement the pillars of the Azure Well-Architected Framework, which is a set of guiding tenets that can be used to improve the quality of a workload. For more information, see Microsoft Azure Well-Architected Framework.

For most features, the Cognitive Service for Language API has a maximum size of 5,120 characters for a single document. For all features, the maximum request size is 1 MB. For more information about data and rate limits, see Service limits for Azure Cognitive Service for Language.

In Azure Database for PostgreSQL, your ingress volume and velocity determine your selection of service and deployment mode. Two services are available:

- Azure Database for PostgreSQL
- Azure Cosmos DB for PostgreSQL, which was formerly known as Hyperscale (Citus) mode

If you mine large workloads of customer opinions and reviews, use Azure Cosmos DB for PostgreSQL. Within Azure Database for PostgreSQL, two modes are available: single server and flexible server. To understand when to use each deployment mode, see What is Azure Database for PostgreSQL?.

Previous versions of this solution used the Cognitive Services Text Analytics API. Azure Cognitive Service for Language now unifies three individual language services in Cognitive Services: Text Analytics, QnA Maker, and Language Understanding (LUIS). You can easily migrate from the Text Analytics API to the Cognitive Service for Language API. For instructions, see Migrate to the latest version of Azure Cognitive Service for Language.
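Because most Language API features cap a single document at 5,120 characters, callers typically split longer text before submitting it. The following Python sketch shows the idea; the helper is hypothetical, and a production version would split on sentence boundaries rather than at a fixed offset:

```python
# Hypothetical helper: split text into pieces that respect the
# per-document character limit of the Language API.
def chunk_document(text, limit=5120):
    return [text[i:i + limit] for i in range(0, len(text), limit)]

pieces = chunk_document("x" * 12000)
# 12,000 characters become three documents of 5120, 5120, and 1760 characters.
```

Each piece can then be submitted as its own document, and the per-document results aggregated afterward.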
Security

Security provides assurances against deliberate attacks and the abuse of your valuable data and systems. For more information, see Overview of the security pillar.

All data in Azure Database for PostgreSQL is automatically encrypted and backed up. You can configure Microsoft Defender for Cloud for further mitigation of threats. For more information, see Enable Microsoft Defender for open-source relational databases and respond to alerts.

DevOps

You can configure GitHub Actions to connect to an Azure Database for PostgreSQL database by using its connection string and setting up a workflow. For more information, see Quickstart: Use GitHub Actions to connect to Azure PostgreSQL.

You can also automate your machine learning lifecycle by using Azure Pipelines. For information about how to implement an MLOps workflow and build a CI/CD pipeline for your project, see the GitHub repo MLOps with Azure ML.

Cost optimization

Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies. For more information, see Overview of the cost optimization pillar.

Cognitive Service for Language offers various pricing tiers. The number of text records that you process affects your cost. For more information, see Cognitive Service for Language pricing.

Next steps

- Azure Functions overview
- Azure Function activity in Azure Data Factory
- Azure Event Hubs—A big data streaming platform and event ingestion service
- What is Azure Cognitive Services?
- What is Azure Cognitive Service for Language?
- How to use Language service features asynchronously
- Azure Cognitive Services for Language API testing console
- Use DirectQuery to link PostgreSQL to Power BI
- Create an Azure Cosmos DB for PostgreSQL cluster in the Azure portal
- Tutorial: Consume Azure Machine Learning models in Power BI
- Extract insights from text with the Language service
- Microsoft Certified: Azure AI Engineer Associate

Related resources

- Intelligent apps using Azure Database for MySQL
title: Dashboards to visualize Azure Databricks metrics
description: Learn how to set up a Grafana dashboard to monitor performance of Azure Databricks jobs. Azure Databricks is an Apache Spark-based analytics service.
author: mssaperla
ms.author: saperla
categories: azure
ms.date: 07/25/2022
ms.topic: conceptual
ms.service: architecture-center
ms.subservice: azure-guide
azureCategories:
  - databases
  - management-and-governance
products:
  - azure-databricks
  - azure-monitor

Use dashboards to visualize Azure Databricks metrics

[!NOTE] This article relies on an open source library hosted on GitHub at: https://github.com/mspnp/spark-monitoring. The original library supports Azure Databricks Runtimes 10.x (Spark 3.2.x) and earlier. Databricks has contributed an updated version to support Azure Databricks Runtimes 11.0 (Spark 3.3.x) and above on the l4jv2 branch at: https://github.com/mspnp/spark-monitoring/tree/l4jv2. Please note that the 11.0 release is not backwards compatible due to the different logging systems used in the Databricks Runtimes. Be sure to use the correct build for your Databricks Runtime. The library and GitHub repository are in maintenance mode. There are no plans for further releases, and issue support will be best-effort only. For any additional questions regarding the library or the roadmap for monitoring and logging of your Azure Databricks environments, please contact azure-spark-monitoring-help@databricks.com.

This article shows how to set up a Grafana dashboard to monitor Azure Databricks jobs for performance issues.

Azure Databricks is a fast, powerful, and collaborative Apache Spark–based analytics service that makes it easy to rapidly develop and deploy big data analytics and artificial intelligence (AI) solutions. Monitoring is a critical component of operating Azure Databricks workloads in production. The first step is to gather metrics into a workspace for analysis. In Azure, the best solution for managing log data is Azure Monitor.
Azure Databricks does not natively support sending log data to Azure Monitor, but a library for this functionality is available in GitHub. This library enables logging of Azure Databricks service metrics as well as Apache Spark structured streaming query event metrics. Once you've successfully deployed this library to an Azure Databricks cluster, you can deploy a set of Grafana dashboards as part of your production environment.

Prerequisites

Configure your Azure Databricks cluster to use the monitoring library, as described in the GitHub readme.

Deploy the Azure Log Analytics workspace

To deploy the Azure Log Analytics workspace, follow these steps:

1. Navigate to the perftools/deployment/loganalytics directory.
2. Deploy the logAnalyticsDeploy.json Azure Resource Manager template. For more information about deploying Resource Manager templates, see Deploy resources with Resource Manager templates and Azure CLI.

The template has the following parameters:

- location: The region where the Log Analytics workspace and dashboards are deployed.
- serviceTier: The workspace pricing tier. See here for a list of valid values.
- dataRetention (optional): The number of days the log data is retained in the Log Analytics workspace. The default value is 30 days. If the pricing tier is Free, the data retention must be seven days.
- workspaceName (optional): A name for the workspace. If not specified, the template generates a name.

```azurecli
az deployment group create --resource-group <resource-group-name> --template-file logAnalyticsDeploy.json --parameters location='East US' serviceTier='Standalone'
```

This template creates the workspace and also creates a set of predefined queries that are used by the dashboard.

Deploy Grafana in a virtual machine

Grafana is an open source project you can deploy to visualize the time series metrics stored in your Azure Log Analytics workspace using the Grafana plugin for Azure Monitor.
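The dataRetention constraint above (the Free tier must use seven-day retention) is easy to get wrong when scripting deployments. As an illustration only, a deployment wrapper could validate the parameters before calling the template; the function below is hypothetical and is not part of the monitoring library:

```python
# Hypothetical pre-flight check mirroring the template's documented rule:
# the Free pricing tier requires a data retention of exactly 7 days.
def validate_retention(service_tier, data_retention_days=30):
    if service_tier == "Free" and data_retention_days != 7:
        raise ValueError("The Free pricing tier requires 7-day data retention.")
    return data_retention_days

validate_retention("Standalone", 30)  # paid tiers accept the 30-day default
```

Running such a check before `az deployment group create` surfaces the misconfiguration locally instead of failing partway through the deployment.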
Grafana executes on a virtual machine (VM) and requires a storage account, virtual network, and other resources. To deploy a virtual machine with the bitnami-certified Grafana image and associated resources, follow these steps:

1. Use the Azure CLI to accept the Azure Marketplace image terms for Grafana.

   ```azurecli
   az vm image terms accept --publisher bitnami --offer grafana --plan default
   ```

2. Navigate to the spark-monitoring/perftools/deployment/grafana directory in your local copy of the GitHub repo.
3. Deploy the grafanaDeploy.json Resource Manager template as follows:

   ```azurecli
   export DATA_SOURCE="https://raw.githubusercontent.com/mspnp/spark-monitoring/master/perftools/deployment/grafana/AzureDataSource.sh"
   az deployment group create \
       --resource-group <resource-group-name> \
       --template-file grafanaDeploy.json \
       --parameters adminPass='<password>' dataSource=$DATA_SOURCE
   ```

Once the deployment is complete, the bitnami image of Grafana is installed on the virtual machine.

Update the Grafana password

As part of the setup process, the Grafana installation script outputs a temporary password for the admin user. You need this temporary password to sign in. To obtain the temporary password, follow these steps:

1. Log in to the Azure portal.
2. Select the resource group where the resources were deployed.
3. Select the VM where Grafana was installed. If you used the default parameter name in the deployment template, the VM name is prefaced with sparkmonitoring-vm-grafana.
4. In the Support + troubleshooting section, click Boot diagnostics to open the boot diagnostics page.
5. Click Serial log on the boot diagnostics page.
6. Search for the following string: "Setting Bitnami application password to".
7. Copy the password to a safe location.

Next, change the Grafana administrator password by following these steps:

1. In the Azure portal, select the VM and click Overview.
2. Copy the public IP address.
3. Open a web browser and navigate to the following URL: http://<IP address>:3000.
At the Grafana login screen, enter admin for the user name, and use the Grafana password from the previous steps. Once logged in, select Configuration (the gear icon), select Server Admin, select the admin login on the Users tab, and then update the password.

Create an Azure Monitor data source

1. Create a service principal that allows Grafana to manage access to your Log Analytics workspace. For more information, see Create an Azure service principal with Azure CLI.

   ```azurecli
   az ad sp create-for-rbac --name <service-principal-name> \
       --role "Log Analytics Reader" \
       --scopes /subscriptions/mySubscriptionID
   ```

2. Note the values for appId, password, and tenant in the output from this command:

   ```json
   {
       "appId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
       "displayName": "azure-cli-2019-03-27-00-33-39",
       "name": "<service-principal-name>",
       "password": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
       "tenant": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
   }
   ```

3. Log into Grafana as described earlier. Select Configuration (the gear icon) and then Data Sources.
4. In the Data Sources tab, click Add data source.
5. Select Azure Monitor as the data source type.
6. In the Settings section, enter a name for the data source in the Name textbox.
7. In the Azure Monitor API Details section, enter the following information:
   - Subscription Id: Your Azure subscription ID.
   - Tenant Id: The tenant ID from earlier.
   - Client Id: The value of "appId" from earlier.
   - Client Secret: The value of "password" from earlier.
8. In the Azure Log Analytics API Details section, check the Same Details as Azure Monitor API checkbox.
9. Click Save & Test. If the Log Analytics data source is correctly configured, a success message is displayed.

Create the dashboard

Create the dashboards in Grafana by following these steps:

1. Navigate to the perftools/dashboards/grafana directory in your local copy of the GitHub repo.
2. Run the following script:

   ```bash
   export WORKSPACE=<your Log Analytics workspace ID>
   export LOGTYPE=SparkListenerEvent_CL

   sh DashGen.sh
   ```

The output from the script is a file named SparkMonitoringDash.json.
Return to the Grafana dashboard and select Create (the plus icon). Select Import, click Upload .json File, and select the SparkMonitoringDash.json file created in step 2. In the Options section, under ALA, select the Azure Monitor data source created earlier, and then click Import.

Visualizations in the dashboards

Both the Azure Log Analytics and Grafana dashboards include a set of time-series visualizations. Each graph is a time-series plot of metric data related to an Apache Spark job, the stages of the job, and the tasks that make up each stage. The visualizations are:

Job latency

This visualization shows execution latency for a job, which is a coarse view of the overall performance of a job. It displays the job execution duration from start to completion. Note that the job start time is not the same as the job submission time. Latency is represented as percentiles (10%, 30%, 50%, 90%) of job execution, indexed by cluster ID and application ID.

Stage latency

This visualization shows the latency of each stage per cluster, per application, and per individual stage. This visualization is useful for identifying a particular stage that is running slowly.

Task latency

This visualization shows task execution latency. Latency is represented as a percentile of task execution per cluster, stage name, and application.

Sum Task Execution per host

This visualization shows the sum of task execution latency per host running on a cluster. Viewing task execution latency per host identifies hosts that have much higher overall task latency than other hosts. This may mean that tasks have been inefficiently or unevenly distributed to hosts.

Task metrics

This visualization shows a set of the execution metrics for a given task's execution. These metrics include the size and duration of a data shuffle, the duration of serialization and deserialization operations, and others. For the full set of metrics, view the Log Analytics query for the panel.
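The latency panels above report percentiles rather than averages, which keeps a handful of very slow jobs from being hidden by the mean. The Log Analytics queries compute these percentiles server-side; the Python nearest-rank sketch below is only an illustration of what a percentile panel reports, and the sample data is hypothetical:

```python
import math

# Illustrative nearest-rank percentile: the smallest sample such that
# at least p percent of the samples are less than or equal to it.
def nearest_rank_percentile(samples, p):
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

latencies = list(range(1, 101))  # 100 hypothetical job durations
p50 = nearest_rank_percentile(latencies, 50)  # median duration
p90 = nearest_rank_percentile(latencies, 90)  # tail latency
```

A widening gap between the 50% and 90% lines on the panel indicates a growing latency tail, often the first sign of skewed or unevenly distributed work.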
This visualization is useful for understanding the operations that make up a task and identifying the resource consumption of each operation. Spikes in the graph represent costly operations that should be investigated.

Cluster throughput

This visualization is a high-level view of work items indexed by cluster and application, representing the amount of work done per cluster and application. It shows the number of jobs, tasks, and stages completed per cluster, application, and stage in one-minute increments.

Streaming Throughput/Latency

This visualization is related to the metrics associated with a structured streaming query. The graph shows the number of input rows per second and the number of rows processed per second. The streaming metrics are also represented per application. These metrics are sent when the OnQueryProgress event is generated as the structured streaming query is processed, and the visualization represents streaming latency as the amount of time, in milliseconds, taken to execute a query batch.

Resource consumption per executor

Next is a set of visualizations for the dashboard that show how a particular type of resource is consumed per executor on each cluster. These visualizations help identify outliers in resource consumption per executor. For example, if the work allocation for a particular executor is skewed, resource consumption will be elevated in relation to other executors running on the cluster. This can be identified by spikes in the resource consumption for an executor.

Executor compute time metrics

Next is a set of visualizations for the dashboard that show the ratio of executor serialize time, deserialize time, CPU time, and Java virtual machine time to overall executor compute time. This demonstrates visually how much each of these four metrics contributes to overall executor processing.

Shuffle metrics

The final set of visualizations shows the data shuffle metrics associated with a structured streaming query across all executors.
These include shuffle bytes read, shuffle bytes written, shuffle memory, and disk usage in queries where the file system is used.

Next steps

[!div class="nextstepaction"] Troubleshoot performance bottlenecks

Related resources

- Monitoring Azure Databricks
- Send Azure Databricks application logs to Azure Monitor
- Modern analytics architecture with Azure Databricks
- Ingestion, ETL (extract, transform, load), and stream processing pipelines with Azure Databricks