Latest AD0-E106 Free Practice Questions - Adobe Experience Manager Dev/Ops Engineer
A client is submitting a form that contains a CSRF token passed in the CSRF-Token HTTP header.
The header is printed in the web server access logs, but the value is not present in the request on the AEM instance.
What should the DevOps Engineer configure to make the value available on the AEM instance?
Answer: A
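A common cause of this symptom is that the Dispatcher forwards only a whitelist of request headers to the render (AEM) instance, controlled by the /clientheaders section of dispatcher.any. A minimal sketch of such a configuration, assuming the custom header name CSRF-Token from the question and an illustrative farm name:

```
# dispatcher.any (fragment) - headers not listed here are dropped
# before the request reaches the AEM instance
/farms
  {
  /myfarm
    {
    /clientheaders
      {
      "CSRF-Token"      # custom header from the question; must be listed explicitly
      "referer"
      "user-agent"
      "authorization"
      }
    }
  }
```

With the header added to /clientheaders, the Dispatcher passes it through and it becomes visible to the AEM instance.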
When configuring agents, under which path of the repository are the agents stored for the AEM author instance?
Answer: D
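For reference, replication agents on an AEM author instance live under /etc/replication/agents.author in the repository. One way to inspect them is the Sling GET servlet's JSON rendering; a sketch, assuming a local author instance on port 4502 with default credentials:

```
# List the replication agents configured on the author instance
# (assumes a local author on port 4502; adjust credentials and host as needed)
curl -u admin:admin "http://localhost:4502/etc/replication/agents.author.1.json"
```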
A DevOps Engineer notices that the disk usage for an AEM author instance is constantly going up.
Upon further investigation, it is revealed:
- Online Compaction is unable to complete revision cleanup task during the daily maintenance window
- Segment Store Size has gone up from 8 GB to 13 GB
- The following message is logged in the error.log file
Canceling RevisionGC. The task was either stopped by the user or the Maintenance Window reached its end
- The Lucene Binaries Cleanup task completed successfully during the daily maintenance window
What additional step can be performed during the weekly maintenance window to reduce the overall disk usage?
Answer: A
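When online revision cleanup cannot finish inside the daily window, offline revision cleanup (offline compaction) with oak-run is the usual fallback for a longer weekly window. A sketch of the documented steps, assuming a TarMK segment store at the default crx-quickstart path and an oak-run version matching the instance's Oak version; AEM must be stopped first:

```
# 1. List checkpoints (old checkpoints can pin revisions and block cleanup)
java -jar oak-run.jar checkpoints crx-quickstart/repository/segmentstore

# 2. Remove unreferenced checkpoints
java -jar oak-run.jar checkpoints crx-quickstart/repository/segmentstore rm-unreferenced

# 3. Run offline compaction to reclaim disk space
java -Dtar.memoryMapped=true -jar oak-run.jar compact crx-quickstart/repository/segmentstore
```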
A DevOps Engineer notices that existing pages are not updated through a package installation. The updated pages are present only after deleting the pages that are not updating and reinstalling the package.
What is the source of the problem?
Answer: A
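This behavior is characteristic of a package whose workspace filter uses an import mode that does not overwrite existing nodes: in FileVault's META-INF/vault/filter.xml, mode="merge" only adds nodes that do not yet exist, while the default mode="replace" overwrites existing content. A sketch, assuming a hypothetical /content/mysite content root:

```xml
<!-- META-INF/vault/filter.xml -->
<workspaceFilter version="1.0">
    <!-- mode="merge": new nodes are added, but pages that already
         exist in the repository are left untouched on install -->
    <filter root="/content/mysite" mode="merge"/>
</workspaceFilter>
```

Deleting the pages first makes them "new" to the importer, which is why reinstalling the package then succeeds.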
Recently published content is not visible on the search results on the public website.
* All results show on the author environment
* Some results show on the publish environment
* The LastIndexedTime metric is not updated when checking the Async Indexer stats MBean
* The user is trying to find the page by the title
The following line appears regularly in the log:
08.01.2019 01:22:04.474 *INFO* [pool-9-thread-2] org.apache.jackrabbit.oak.plugins.index.IndexUpdate Reindexing will be performed for following indexes [/oak:index/damFileSize, /oak:index/lucene, /oak:index/cqLastModified]
How can the DevOps Engineer gather more information about the root cause of this issue?
Answer: A
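In general, more detail on a stalled async indexer can be gathered by raising the log level for the Oak indexing packages and inspecting the async IndexStats MBean (for example via /system/console/jmx) for a failing flag and the latest error. A sketch of a dedicated Sling logger, assuming deployment as an OSGi factory configuration file (the file name suffix "oakindex" is illustrative):

```
# org.apache.sling.commons.log.LogManager.factory.config-oakindex.config
org.apache.sling.commons.log.level="debug"
org.apache.sling.commons.log.file="logs/oak-index.log"
org.apache.sling.commons.log.names=["org.apache.jackrabbit.oak.plugins.index"]
```

The resulting log shows which index is being reindexed, how far traversal has progressed, and why the async update keeps restarting.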
An HTML page is published and is accessible through the Dispatcher. A client is trying to access the updated page but is not getting the updated content.
* A rule /type "allow" /glob "*.html" is present in the dispatcher /cache /rules configuration
* The HTTP header "Cache-Control: max-age=3600" is set for all requests with the html extension
* The /invalidate section is not set in the dispatcher configuration
* A flush agent is configured on the publish instance
* A CDN is serving all static content, including html files, and was flushed manually
Why does the content fail to update on the client side?
Answer: A
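Two mechanisms are in play here. On the Dispatcher side, the /invalidate section of dispatcher.any controls which cached files are auto-invalidated when a flush request arrives; on the client side, Cache-Control: max-age=3600 allows browsers and the CDN to keep serving a cached copy for up to an hour regardless of any Dispatcher flush. A sketch of a typical /invalidate rule set that auto-invalidates cached HTML:

```
# dispatcher.any (fragment) - auto-invalidate cached HTML on content updates
/invalidate
  {
  /0000 { /glob "*" /type "deny" }
  /0001 { /glob "*.html" /type "allow" }
  }
```

Without such a section (and with a long max-age), a flushed publish/Dispatcher tier can still leave stale pages in browser and CDN caches until the max-age expires.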