Performance
- Clustering
- Standalone Solr
- Gzip compression and caching of static pages
- Memory
- Sizing
- Database Indexes
- Large number of users
- Robots.txt
- Statistics
- Document Cache
- Cache Macro
- LESS CSS Performances
- Rendering cache
- Merge the CSS files
- Set up NginX
- Local resource access
- Backlinks
- Versioning
- Custom Mapping
- LDAP
- Performance tree
- Navigation Tree
- Legacy
Here are some tips to increase XWiki's performance.
Clustering
If you need high availability or if the load on your XWiki instance is too high you can configure XWiki in a cluster to spread the load.
Standalone Solr
By default XWiki uses an embedded instance of Solr for ease of use, but if you struggle with very slow searches, you should try an external Solr instance.
You can add debug=true to the URL of the search to see how much time is spent inside Solr, and verify whether Solr is taking a long time or the issue is somewhere else, for example in the XWiki UI.
See Performance Guide in Solr module documentation.
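For example, a debug search URL could look like this (hypothetical host and search page name; adapt both to your wiki):

```
https://wiki.example.com/xwiki/bin/view/Main/Search?text=report&debug=true
```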
Gzip compression and caching of static pages
HTTP compression is a capability that can be built into web servers and web clients to improve transfer speed and bandwidth utilization. HTTP data is compressed before it is sent from the server: compliant browsers announce which methods they support before downloading the correct format; browsers that do not support a compliant compression method download uncompressed data.
Many application servers (Tomcat, etc.) and HTTP proxies (Apache HTTPd, Nginx, etc.) support it.
In Apache HTTP Server
The recommended solution is to set up an Apache Web Server in front of your servlet container and install/configure the following modules:
- mod_deflate
- mod_expires
- mod_proxy_ajp (note that this depends on mod_proxy, which you also need to install)
Modify your Apache configuration file to load the different modules:
LoadModule deflate_module /usr/lib/apache2/modules/mod_deflate.so
LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
# Depends: proxy
LoadModule proxy_ajp_module /usr/lib/apache2/modules/mod_proxy_ajp.so
Alternatively you can run the following commands as root (or with sudo):
a2enmod deflate
a2enmod proxy_ajp
a2enmod expires
and configure your different modules as described below:
Mod Deflate Configuration
<Location />
# Insert filter
SetOutputFilter DEFLATE
# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
# Don't compress images
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
# Make sure proxies don't deliver the wrong content
#Header append Vary User-Agent env=!dont-vary
</Location>
On Debian, the Apache2 deflate configuration file is located at /etc/apache2/mods-enabled/deflate.conf.
Mod Expire Configuration
<Location /xwiki/skins>
ExpiresActive on
ExpiresDefault "access plus 1 day"
</Location>
<Location /xwiki/bin/skin>
ExpiresActive on
ExpiresDefault "access plus 1 day"
</Location>
Mod Proxy AJP Configuration
<Proxy *>
Order deny,allow
Allow from all
</Proxy>
ProxyPreserveHost On
ProxyPass /xwiki ajp://192.168.1.181:8009/xwiki
where ajp://192.168.1.181:8009/xwiki is the internal address of your Servlet container where XWiki is running.
If you use Tomcat (e.g. Tomcat 9) you need to enable the AJP connector in /etc/tomcat9/server.xml. Comment out the HTTP connector by wrapping it in <!-- -->:
<!--
<Connector port="8080"
           protocol="HTTP/1.1"
           connectionTimeout="20000"
           URIEncoding="UTF-8"
           redirectPort="8443" />
-->
Then uncomment the AJP connector by removing the surrounding <!-- --> and add URIEncoding="UTF-8" to it:
<Connector port="8009"
           protocol="AJP/1.3"
           redirectPort="8443"
           URIEncoding="UTF-8" />
Memory
You need to configure your Servlet container so that XWiki has enough memory. This is done in the /etc/default/tomcat9 configuration file (or /etc/default/tomcat8 for Tomcat 8, etc). You'll need to tune the value to suit your needs. For possible memory issues you can check the logs to see if there are any "out of memory" errors.
Here are some good default values:
- Small and medium installs: A minimum of 1024MB (-Xmx1024m)
- Large installs: 2048MB or beyond (-Xmx2048m)
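For example, on Debian with Tomcat 9 you could append the heap setting to JAVA_OPTS in /etc/default/tomcat9 (a sketch; the exact file and variable name may differ on your distribution and Tomcat version):

```
JAVA_OPTS="${JAVA_OPTS} -Xmx1024m"
```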
For your information here are the values used for the xwiki.org site:
Sizing
To give you an idea about what you need to run XWiki on, XWiki SAS has the following configuration for its cloud instances:
- 2GB of RAM (see XWiki memory needs)
- 2 cores (AMD Opteron 6386 SE, 2800 MHz, 2048 KB cache)
- 16GB of disk size by default
Database Indexes
Make sure you've set Database indexes. This is especially important when you start having lots of documents.
Large number of users
When you have a large number of users it's recommended to turn on the implicit "all group" mode, i.e. to consider that all users are members of XWiki.XWikiAllGroup by default. This is achieved by editing the xwiki.cfg file and setting:
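The original snippet is missing here; as a sketch, the implicit all-group property in xwiki.cfg looks like this (verify the exact property name against the comments in your version's xwiki.cfg):

```
xwiki.authentication.group.allgroupimplicit=1
```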
Then you should remove all the XObjects from the XWikiAllGroup page, but keep the page itself, since otherwise you won't be able to set permissions for this group. This prevents XWiki from having to load all of that page's XObjects representing users (thousands of them if you have thousands of users).
Also make sure that the XWikiAllGroup is listed in the xwiki.users.initialGroups property (it's there by default if you haven't touched that property):
# xwiki.users.initialGroups=XWiki.XWikiAllGroup
Robots.txt
If your wiki is open on the Internet, it'll be crawled by search robots (like GoogleBot, etc). They will call all the URLs, especially the resource-hungry ones like exports (PDF/RTF). You need to protect against this. To do so, configure a robots.txt file and put it in your webserver configuration.
An example:
# Prevent bots from executing all actions except "view" since:
# 1) we don't want bots to execute stuff in the wiki!
# 2) we don't want bots to consume CPU and memory
# (for example to perform exports)
# Note: You may want to allow /download/ if you wish to have
# attachments indexed.
# Note2: Using * instead of /bin/ to also match path-based
# subwikis like "/xwiki/wiki/wikialias/view/Space/Page"
User-agent: *
Disallow: /xwiki/*/viewattachrev/
Disallow: /xwiki/*/viewrev/
Disallow: /xwiki/*/pdf/
Disallow: /xwiki/*/tex/
Disallow: /xwiki/*/edit/
Disallow: /xwiki/*/create/
Disallow: /xwiki/*/inline/
Disallow: /xwiki/*/preview/
Disallow: /xwiki/*/save/
Disallow: /xwiki/*/saveandcontinue/
Disallow: /xwiki/*/rollback/
Disallow: /xwiki/*/deleteversions/
Disallow: /xwiki/*/cancel/
Disallow: /xwiki/*/delete/
Disallow: /xwiki/*/deletespace/
Disallow: /xwiki/*/undelete/
Disallow: /xwiki/*/reset/
Disallow: /xwiki/*/register/
Disallow: /xwiki/*/propupdate/
Disallow: /xwiki/*/propadd/
Disallow: /xwiki/*/propdisable/
Disallow: /xwiki/*/propenable/
Disallow: /xwiki/*/propdelete/
Disallow: /xwiki/*/objectadd/
Disallow: /xwiki/*/commentadd/
Disallow: /xwiki/*/commentsave/
Disallow: /xwiki/*/objectsync/
Disallow: /xwiki/*/objectremove/
Disallow: /xwiki/*/attach/
Disallow: /xwiki/*/upload/
Disallow: /xwiki/*/download/
Disallow: /xwiki/*/temp/
Disallow: /xwiki/*/downloadrev/
Disallow: /xwiki/*/dot/
Disallow: /xwiki/*/svg/
Disallow: /xwiki/*/delattachment/
Disallow: /xwiki/*/skin/
Disallow: /xwiki/*/jsx/
Disallow: /xwiki/*/ssx/
Disallow: /xwiki/*/login/
Disallow: /xwiki/*/loginsubmit/
Disallow: /xwiki/*/loginerror/
Disallow: /xwiki/*/logout/
Disallow: /xwiki/*/charting/
Disallow: /xwiki/*/lock/
Disallow: /xwiki/*/redirect/
Disallow: /xwiki/*/admin/
Disallow: /xwiki/*/export/
Disallow: /xwiki/*/import/
Disallow: /xwiki/*/get/
Disallow: /xwiki/*/distribution/
Disallow: /xwiki/*/imagecaptcha/
Disallow: /xwiki/*/unknown/
# Note: In addition, this matches both old /xwiki/bin/webjars/
# and new /xwiki/webjars paths.
Disallow: /xwiki/*/webjars/
# Don't index additional UI-related resources.
Disallow: /xwiki/resources/
# Don't index sandbox content since it's sample content
Disallow: /xwiki/*/view/Sandbox/
# Don't index Admin space since it contains Admin stuff.
# Note that the Admin space is protected by permissions
# anyway but this acts as a safety net to not have private
# info leaked on the internet ;)
Disallow: /xwiki/*/view/Admin/
# Don't index Stats data (just because it's not useful and
# those pages are a bit CPU intensive)
Disallow: /xwiki/*/view/Stats/
# Don't index Panels data (because we don't want it
# indexed on the internet)
Disallow: /xwiki/*/view/Panels/
# Don't index the search page.
Disallow: /xwiki/*/Main/Search
# Don't index the REST API.
Disallow: /xwiki/rest/
# These are just UI elements which can cause infinite loops in
# web crawlers. See https://jira.xwiki.org/browse/XWIKI-16915
Disallow: /xwiki/*?*xpage=*
Another example:
# It can also be useful to block certain spaces from crawling,
# especially if those spaces don't provide new content
Disallow: /xwiki/bin/view/Main/
Disallow: /xwiki/bin/view/XWiki/
# On the other hand, you may want your recent (public) changes included
Allow: /xwiki/bin/view/Main/Dashboard
Note:
For Tomcat6 the placement of the robots.txt file should be within the $TOMCAT/webapps/ROOT folder and should have permission 644 applied.
Indexing JS and CSS
Google officially recommends that you do not disallow crawling of JS and CSS files, which are now actually rendered by the crawler bot and used to better index your content. Also see this short video on the topic.
In this case, you might want to make sure to remove the following Disallow entries from your robots.txt file:
Disallow: /xwiki/*/jsx/
Disallow: /xwiki/*/ssx/
Disallow: /xwiki/*/webjars/
Disallow: /xwiki/*/resources/
Indexing images
For images uploaded as attachments inside wiki pages, you should add the following Allow entries for the /download/ action:
Allow: /xwiki/*/download/*.jpg$
Allow: /xwiki/*/download/*.jpeg$
Allow: /xwiki/*/download/*.gif$
To test whether your robots.txt file is accessible and working as desired, use a robots.txt checker.
Statistics
The statistics module is off by default since it's quite database intensive. If you don't need it you should turn it off.
The current recommendation is to use the Matomo extension for statistics instead.
Document Cache
You can tune the Document cache in the xwiki.cfg configuration file. The value depends on how much memory you have. The higher the better (but of course it's not very useful to allocate more than the total number of documents you have).
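As a sketch, the document cache is sized with the cache capacity property in xwiki.cfg (the value below is illustrative; verify the property name against your version's xwiki.cfg):

```
xwiki.store.cache.capacity=500
```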
Cache Macro
It's possible to perform selective content caching by using the Cache Macro.
LESS CSS Performances
LESS is a preprocessor used to generate CSS files for skins and skin extensions. See the Performances section of the LESS module documentation to learn more about how to optimize its cache for performances, and to set the appropriate number of simultaneous compilations your server can handle.
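In xwiki.properties the number of simultaneous compilations is a dedicated LESS setting; a sketch (verify the property name against your version's xwiki.properties):

```
#-# Maximum number of simultaneous LESS compilations
# lessc.maximumSimultaneousCompilations=4
```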
Rendering cache
Some pages are complex to render (they may aggregate outside data, for example, or perform complex and slow queries). For these pages you can use the rendering cache. When doing so, changes to the data displayed by these pages won't be visible to users until the cache expires. Note that when a page itself is modified, its cache is refreshed.
The configuration is done in xwiki.properties with the following configuration options:
#-# Indicate if the rendering cache is enabled.
#-# Default value is false.
# core.renderingcache.enabled=true
#-# [Since 2.4M1]
#-# A list of Java regex patterns matching full documents reference.
# core.renderingcache.documents=wiki:Space\.Page
# core.renderingcache.documents=wiki:Space\..*
#-# [Since 2.4M1]
#-# The time (in seconds) after which data should be removed from the cache when not used.
#-# Default value is 300 (5 min).
# core.renderingcache.duration=300
#-# [Since 2.4M1]
#-# The size of the rendering cache. Note that it's not the number of cached documents but the number of cached results.
#-# (For a single document several cache entries are created, because each action, language and request query string
#-# produces a unique rendering result)
#-# Default value is 100.
# core.renderingcache.size=100
You can force a page to refresh using refresh=1 in the URL.
It's also possible to programmatically refresh any document cache using com.xpn.xwiki.internal.cache.rendering.RenderingCache component:
@Inject
private RenderingCache renderingCache;
...
// Flush the whole rendering cache
renderingCache.flushWholeCache();
// Flush the cache entries of a single document
renderingCache.flushCache(new DocumentReference("xwiki", "MySpace", "MyCachedDocument"));
Merge the CSS files
In order to reduce the number of requests and files that are downloaded from the browser or client, it could help to merge all XWiki CSS files into a single one. See the Merge CSS Script.
Set up NginX
If you experience heavy loads on your wiki, you could try using NginX.
NginX is used to fetch static content: images, javascript, styles, etc, but it can also be used as a reverse-proxy to pass requests down to the web container (e.g. Tomcat on port 8080).
Unlike Apache, which may dedicate a process or thread to each connection, NginX serves all static content from a small set of event-driven worker processes, and thus gives you extra performance "for free".
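As a sketch, a setup with static resources served directly by NginX and everything else proxied to the servlet container could look like this (all paths and ports below are assumptions to adapt to your installation):

```
# Serve XWiki's static resources directly from disk
location /xwiki/resources/ {
    alias /var/lib/tomcat9/webapps/xwiki/resources/;
    expires 1d;
}

# Everything else goes to the servlet container (e.g. Tomcat on 8080)
location /xwiki {
    proxy_pass http://127.0.0.1:8080/xwiki;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```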
For more info on setting up NginX check this guide.
Local resource access
Backlinks
While a pretty neat feature, keeping track of the backlinks has a medium impact on the document saving time and a minor impact on the document loading time. If you feel that your wiki does not need backlinks, you can safely disable them with the following line in xwiki.cfg:
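The configuration line is missing here; as a sketch, the backlinks flag in xwiki.cfg looks like this (verify the exact property name against your version's xwiki.cfg):

```
xwiki.backlinks=0
```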
Versioning
One of the key features of any wiki system, versioning greatly affects the database size and the document update time. If you are sure your wiki does not need to keep track of all the changes and you will never need to revert documents to a previous version, then you can add the following line in xwiki.cfg:
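The configuration line is missing here; as a sketch, the versioning flag in xwiki.cfg looks like this (verify the exact property name against your version's xwiki.cfg):

```
xwiki.store.versioning=0
```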
Custom Mapping
In some cases you may not want to rely on XWiki's generic database schema for storing XClass data and instead you'd like to provide your own optimized table. For these use cases you can use Custom Mapping.
LDAP
Disable LDAP sub groups search
By default, when loading an LDAP group, each member is searched and loaded to figure out whether it's a group or not (and then load the sub-group members, etc). If you know there are no sub-groups in your LDAP groups, you can disable this and speed up the handling of big groups quite a lot, using the xwiki.authentication.ldap.group_sync_resolve_subgroups property in the xwiki.cfg configuration file.
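For example, in xwiki.cfg (assuming 0 disables the sub-group resolution; check the value semantics for your version):

```
xwiki.authentication.ldap.group_sync_resolve_subgroups=0
```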
Performance tree
Since 7.1 it's possible to directly get a tree of time spent in each step of the request by using debug mode.
Navigation Tree
The Navigation Panel and other navigation trees can have some small performance issues under very high volumes. Here are some baseline measurements taken on developer laptops, to give you an idea of the performance you should expect:
Measure set 1:
DB | Levels | Spaces | Pages | Requests | Time |
---|---|---|---|---|---|
MySQL | 5 | 6887 | 4049 | 31 | 180ms |
HSQLDB | 5 | 6762 | 4063 | 27 | 138ms |
MySQL | 1 | 2514 | 4962 | 27 | 331ms |
HSQLDB | 1 | 2377 | 4718 | 21 | 3.15s |
Measure set 2:
DB | Levels | Spaces | Pages | Requests | Time |
---|---|---|---|---|---|
Oracle | 5 | 6943 | 4106 | 20 | 119ms |
Oracle | 1 | 2493 | 4982 | 20 | 153ms |
Measure set 3:
DB | Levels | Spaces | Pages | Requests | Time |
---|---|---|---|---|---|
Oracle | 1 | 2485 | 4991 | 20 | 151ms |
PostgreSQL | 1 | 2494 | 4991 | 20 | 125ms |
Measure set 4 (XWiki 10.5, Intel i7 CPU, SSD Storage):
DB | Levels | Spaces | Pages | Requests | Time |
---|---|---|---|---|---|
PostgreSQL 10 | 5 | 6847 | 4052 | 25 | 74ms |
Legacy
Slow random number generation on UNIX
The library used for random number generation in Oracle's JVM relies on /dev/random by default for UNIX platforms.
Although /dev/random is more secure, it's possible to use /dev/urandom instead of the default JVM configuration.
To determine if your operating system exhibits this behavior, try displaying a portion of the file from a shell prompt:
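The original command is elided here; one common way to check is to read a few bytes from the device (an illustration, not the original's exact command):

```shell
# Read 16 bytes from the blocking entropy pool and print them as hex.
# If this command hangs, /dev/random is starved on your system and the
# JVM will stall too when it reads from it.
head -c 16 /dev/random | od -An -tx1
```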
If the command returns immediately, you can keep /dev/random as the default generator for Oracle's JVM. If the command does not return immediately, use one of the following solutions to switch to /dev/urandom:
JVM setup
- Open the $JAVA_HOME/jre/lib/security/java.security file in a text editor.
- Change the line:
securerandom.source=file:/dev/random
to read:
securerandom.source=file:/dev/urandom
- Save your change and exit the text editor.
Command line parameter
The same effect can be obtained using -Djava.security.egd=file:/dev/./urandom in the Java command line (usually in the application server configuration).
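For example with Tomcat on Debian, the flag can be appended to JAVA_OPTS (the file path is an assumption for your setup):

```
JAVA_OPTS="${JAVA_OPTS} -Djava.security.egd=file:/dev/./urandom"
```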
Monitor plugin
More of a developer-oriented feature: XWiki can monitor its own code, reporting the time spent in each sub-component activated during a request. While the monitoring code isn't time consuming, it increases memory consumption a bit, and the create/start/stop/log/destroy calls are spread all around the code, so you will save a lot of method calls by disabling it. You can do that by setting the following line in xwiki.cfg:
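The configuration line is missing here; as a sketch, the monitor flag in xwiki.cfg looks like this (verify the exact property name against your version's xwiki.cfg):

```
xwiki.monitor=0
```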
1.0 rendering cache using velocity in document content itself
You can add the following to a document's content to cache it after it is rendered. Note that the cache entry is refreshed whenever the content of the document changes, and the cache takes the URL into account, so it is pretty safe to set a long cache duration for all documents that don't contain scripts gathering data from the wiki. For example, to cache the rendered content for 60 seconds you would add:
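The snippet itself is elided here; in the legacy 1.0 era this was a Velocity call in the document content, along these lines (verify the API against your version):

```
$context.setCacheDuration(60)
```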
Since 1.5M2, you can set the default rendering cache duration for all pages in xwiki.cfg:
xwiki.rendering.defaultCacheDuration=3600
Setting the default cache duration to a large value, and manually disabling the cache in dynamic pages would really speed up the wiki, since the rendering is one of the most time consuming processes.
Wiki syntax features for XWiki Syntax 1.0
If you're using XWiki Syntax 1.0 and don't plan to use all of the markup features, like the strikethrough filter, the automatic HTTP links filter, or the SVG, Laszlo or style macros, you can disable them in xwiki-core-*.jar/META-INF/services/com.xpn.xwiki.render.*. The wiki rendering is the most costly operation in the rendering process, so every disabled feature counts.
Note that this will have no effect if you're using another syntax, like XWiki Syntax 2.x.