Performance


Here are some tips to increase XWiki's performance.

Clustering

If you need high availability, or if the load on your XWiki instance is too high, you can configure XWiki as a cluster to spread the load.

Standalone Solr

By default XWiki uses an embedded instance of Solr for ease of use, but if you struggle with very slow searches, you should try an external Solr instance.
You can use debug=true in the URL of the search to see how much time is spent inside Solr, which helps you verify whether Solr is taking a long time or the issue is somewhere else, for example in the XWiki UI.
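For example, on a hypothetical wiki reachable at yourwiki.example.com, such a URL could look like this (host and search terms are placeholders):

https://yourwiki.example.com/xwiki/bin/view/Main/Search?text=mysearch&debug=true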

See the Performance Guide in the Solr module documentation.

Gzip compression and caching of static pages

HTTP compression is a capability that can be built into web servers and web clients to improve transfer speed and bandwidth utilization. HTTP data is compressed before it is sent from the server: compliant browsers announce which compression methods they support before downloading content in the correct format; browsers that do not support a compliant compression method download the data uncompressed.

Many application servers (Tomcat, etc.) and HTTP proxies (Apache HTTPd, Nginx, etc.) support it.

In Apache HTTP Server

The recommended solution is to set up an Apache Web Server in front of your servlet container and to install/configure the following modules.

Modify your Apache configuration file to load them:

LoadModule expires_module /usr/lib/apache2/modules/mod_expires.so
LoadModule deflate_module /usr/lib/apache2/modules/mod_deflate.so
LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
# Depends: proxy
LoadModule proxy_ajp_module /usr/lib/apache2/modules/mod_proxy_ajp.so

Alternatively you can run the following commands as root (sudo):

a2enmod deflate
a2enmod proxy_ajp
a2enmod expires

and configure your different modules as described below:

Mod Deflate Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/deflate
<Location />
   # Insert filter
   SetOutputFilter DEFLATE

   # Netscape 4.x has some problems...
   BrowserMatch ^Mozilla/4 gzip-only-text/html

   # Netscape 4.06-4.08 have some more problems
   BrowserMatch ^Mozilla/4.0[678] no-gzip

   # MSIE masquerades as Netscape, but it is fine
   # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

   # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
   # the above regex won't work. You can use the following
   # workaround to get the desired effect:
   BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

   # Don't compress images
   SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

   # Make sure proxies don't deliver the wrong content
   #Header append Vary User-Agent env=!dont-vary
</Location>

On Debian, the Apache2 config file for deflate is located at /etc/apache2/mods-enabled/deflate.conf.

Mod Expire Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/expires
<Location /xwiki/skins>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

<Location /xwiki/bin/skin>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

Mod Proxy AJP Configuration

ProxyRequests Off
<Proxy *>
    Order deny,allow
    Allow from all
</Proxy>
ProxyPreserveHost On
ProxyPass /xwiki ajp://192.168.1.181:8009/xwiki

where ajp://192.168.1.181:8009/xwiki is the internal address of your Servlet container where XWiki is running.

If you use Tomcat 9 you need to enable the AJP connector in /etc/tomcat9/server.xml. Comment out the default HTTP connector by wrapping it in <!-- -->:

    <!-- Disable to use ajp connector instead
    <Connector port="8080"
               protocol="HTTP/1.1"
               connectionTimeout="20000"
               URIEncoding="UTF-8"
               redirectPort="8443">
    </Connector>
    -->

Uncomment the following connector by removing the <!-- --> around it and add URIEncoding="UTF-8" to it:

    <!-- Activate ajp connector for apache proxy_ajp -->
    <Connector port="8009"
               protocol="AJP/1.3"
               redirectPort="8443"
               URIEncoding="UTF-8">
    </Connector>
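Note that starting with Tomcat 9.0.31 the AJP connector only listens on the loopback address and requires a shared secret by default. If Apache and Tomcat run on separate hosts you may need to relax this, as in the following sketch (the address value is an assumption to adapt to your network; configuring a secret is preferable to disabling it):

    <Connector port="8009"
               protocol="AJP/1.3"
               address="0.0.0.0"
               secretRequired="false"
               redirectPort="8443"
               URIEncoding="UTF-8">
    </Connector>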

Memory

You need to configure your Servlet container so that XWiki has enough memory. This is done in the /etc/default/tomcat9 configuration file (or /etc/default/tomcat8 for Tomcat 8, etc.). You'll need to tune the value to suit your needs. To spot memory issues, check the logs for "out of memory" errors.

Here are some good default values:

  • Small and medium installs: A minimum of 1024MB (-Xmx1024m)
  • Large installs: 2048MB or beyond (-Xmx2048m)
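For example, for a small install on the Debian/Ubuntu Tomcat packaging, the setting can look like this (a sketch; the exact variable name and file depend on your distribution and Tomcat version):

JAVA_OPTS="${JAVA_OPTS} -Xmx1024m"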

You should not increase the memory beyond what you need: a larger heap means more objects in memory at any given time, so the automatic JVM Garbage Collector has to work harder, which can result in performance degradation in XWiki (since a full GC pauses the application for a longer time).

If you use HSQLDB as the wiki database, be aware that the full content of the database is stored in memory and thus the memory requirements are higher. See HSQLDB installation page for more details.

For your information here are the values used for the xwiki.org site:

CATALINA_OPTS="-server -Xms1080m -Xmx1600m -Dfile.encoding=utf-8 -Djava.awt.headless=true --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.io=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED"

Sizing

To give you an idea about what you need to run XWiki on, XWiki SAS has the following configuration for its cloud instances:

  • 2GB of RAM (See XWiki memory needs),
  • 2 cores
    AMD Opteron(tm) Processor 6386 SE
    cpu MHz         : 2800.000
    cache size      : 2048 KB
  • 16GB disk size by default

Database Indexes

Make sure you've set up the database indexes. This is especially important when you start having lots of documents.

Large number of users

When you have a large number of users it's recommended to turn on the implicit "all group", i.e. to consider that all users are members of XWiki.XWikiAllGroup by default. This is achieved by editing the xwiki.cfg file and setting:

xwiki.authentication.group.allgroupimplicit=1

Then you should remove all the XObjects from the XWikiAllGroup page, but keep the page itself since otherwise you won't be able to set permissions for this group. This prevents XWiki from having to load all of that page's XObjects representing the users (thousands of them if you have thousands of users).

Also make sure that the XWikiAllGroup is listed in the xwiki.users.initialGroups property (it's there by default if you haven't touched that property):

#-# List of groups that a new user should be added to by default after registering. Comma-separated list of group
#-# document names.
# xwiki.users.initialGroups=XWiki.XWikiAllGroup

Robots.txt

If your wiki is open on the Internet, it'll be crawled by search robots (GoogleBot, etc.). They will call all the URLs, especially the resource-hungry ones such as exports (PDF/RTF). You need to protect against this by configuring a robots.txt file and serving it from your web server.

An example:

User-agent: *
# Prevent bots from executing all actions except "view" since:
# 1) we don't want bots to execute stuff in the wiki!
# 2) we don't want bots to consume CPU and memory
# (for example to perform exports)
# Note: You may want to allow /download/ if you wish to have
# attachments indexed.
# Note2: Using * instead of /bin/ to also match path-based
# subwikis like "/xwiki/wiki/wikialias/view/Space/Page"
Disallow: /xwiki/*/viewattachrev/
Disallow: /xwiki/*/viewrev/
Disallow: /xwiki/*/pdf/
Disallow: /xwiki/*/tex/
Disallow: /xwiki/*/edit/
Disallow: /xwiki/*/create/
Disallow: /xwiki/*/inline/
Disallow: /xwiki/*/preview/
Disallow: /xwiki/*/save/
Disallow: /xwiki/*/saveandcontinue/
Disallow: /xwiki/*/rollback/
Disallow: /xwiki/*/deleteversions/
Disallow: /xwiki/*/cancel/
Disallow: /xwiki/*/delete/
Disallow: /xwiki/*/deletespace/
Disallow: /xwiki/*/undelete/
Disallow: /xwiki/*/reset/
Disallow: /xwiki/*/register/
Disallow: /xwiki/*/propupdate/
Disallow: /xwiki/*/propadd/
Disallow: /xwiki/*/propdisable/
Disallow: /xwiki/*/propenable/
Disallow: /xwiki/*/propdelete/
Disallow: /xwiki/*/objectadd/
Disallow: /xwiki/*/commentadd/
Disallow: /xwiki/*/commentsave/
Disallow: /xwiki/*/objectsync/
Disallow: /xwiki/*/objectremove/
Disallow: /xwiki/*/attach/
Disallow: /xwiki/*/upload/
Disallow: /xwiki/*/download/
Disallow: /xwiki/*/temp/
Disallow: /xwiki/*/downloadrev/
Disallow: /xwiki/*/dot/
Disallow: /xwiki/*/svg/
Disallow: /xwiki/*/delattachment/
Disallow: /xwiki/*/skin/
Disallow: /xwiki/*/jsx/
Disallow: /xwiki/*/ssx/
Disallow: /xwiki/*/login/
Disallow: /xwiki/*/loginsubmit/
Disallow: /xwiki/*/loginerror/
Disallow: /xwiki/*/logout/
Disallow: /xwiki/*/charting/
Disallow: /xwiki/*/lock/
Disallow: /xwiki/*/redirect/
Disallow: /xwiki/*/admin/
Disallow: /xwiki/*/export/
Disallow: /xwiki/*/import/
Disallow: /xwiki/*/get/
Disallow: /xwiki/*/distribution/
Disallow: /xwiki/*/imagecaptcha/
Disallow: /xwiki/*/unknown/
# Note: In addition, this matches both old /xwiki/bin/webjars/
# and new /xwiki/webjars paths.
Disallow: /xwiki/*/webjars/
# Don't index additional UI-related resources.
Disallow: /xwiki/resources/
# Don't index sandbox content since it's sample content
Disallow: /xwiki/*/view/Sandbox/
# Don't index Admin space since it contains Admin stuff.
# Note that the Admin space is protected by permissions
# anyway but this acts as a safety net to not have private
# info leaked on the internet ;)
Disallow: /xwiki/*/view/Admin/
# Don't index Stats data (just because it's not useful and
# those pages are a bit CPU intensive)
Disallow: /xwiki/*/view/Stats/
# Don't index Panels data (because we don't want it
# indexed on the internet)
Disallow: /xwiki/*/view/Panels/
# Don't index the search page.
Disallow: /xwiki/*/Main/Search
# Don't index the REST API.
Disallow: /xwiki/rest/
# These are just UI elements which can cause infinite loops in
# web crawlers. See https://jira.xwiki.org/browse/XWIKI-16915
Disallow: /xwiki/*?*xpage=*

Another example:

[...]
# It can also be useful to block certain spaces from crawling,
# especially if these spaces don't provide new content
Disallow: /xwiki/bin/view/Main/
Disallow: /xwiki/bin/view/XWiki/
# On the other hand you would like to have your recent (public) changes included
Allow: /xwiki/bin/view/Main/Dashboard

Note:

For Tomcat 6, the robots.txt file should be placed in the $TOMCAT/webapps/ROOT folder with permission 644 applied:

-rw-r--r--  1 root  www  1478 Jan  8 15:52 robots.txt
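If Apache sits in front of your servlet container as described above, you can also serve the file directly from Apache; a minimal sketch, assuming the file is stored at /var/www/robots.txt and mod_alias is enabled:

Alias /robots.txt /var/www/robots.txt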

Indexing JS and CSS

Google officially recommends not disallowing the crawling of JS and CSS files, which are now actually rendered by the crawler bot and used to better index your content. Also see this short video on the topic.

In this case, you might want to make sure to remove the following Disallow entries from your robots.txt file:

Disallow: /xwiki/*/skin/
Disallow: /xwiki/*/jsx/
Disallow: /xwiki/*/ssx/
Disallow: /xwiki/*/webjars/
Disallow: /xwiki/*/resources/

Indexing images

For images uploaded as attachments inside wiki pages, you should add the following Allow entries for the /download/ action:

Allow: /xwiki/*/download/*.png$
Allow: /xwiki/*/download/*.jpg$
Allow: /xwiki/*/download/*.jpeg$
Allow: /xwiki/*/download/*.gif$

To test whether the robots.txt file is accessible and working as desired, use this checker.
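A quick accessibility check can also be done from the command line (the host name is a placeholder):

curl -I https://yourwiki.example.com/robots.txt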

Statistics

The statistics module is off by default since it's quite database intensive; if you don't need it, keep it turned off. Note that the overhead is much lower starting with XE 1.4M2, since statistics are now put on a queue and written to the database in one go from a separate thread, reducing the overhead to a minimum.
The current recommendation is to use the Matomo extension for statistics instead.

Document Cache

You can tune the document cache in the xwiki.cfg configuration file. The right value depends on how much memory you have: the higher the better, although it's not useful to allocate more entries than the total number of documents you have.

xwiki.store.cache.capacity=1000

Cache Macro

It's possible to perform selective content caching by using the Cache Macro.
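A sketch of what this can look like inside a page (the timeToLive value, in seconds, is an example; check the macro documentation for its exact parameters):

{{cache timeToLive="300"}}
{{velocity}}
## some expensive query or remote data fetch
{{/velocity}}
{{/cache}}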

LESS CSS Performance

LESS is a preprocessor used to generate CSS files for skins and skin extensions. See the Performances section of the LESS module documentation to learn more about how to optimize its cache for performance, and how to set the appropriate number of simultaneous compilations your server can handle.

Rendering cache

Some pages are complex to render (they may aggregate outside data, for example, or run complex and slow queries). For these pages you can use the rendering cache. When doing so, changes in the data such pages display won't be visible to users until the cache expires; note however that when a page itself is modified, its cache entry is refreshed.

The configuration is done in xwiki.properties with the following configuration options:

#-# [Since 2.4M1]
#-# Indicate if the rendering cache is enabled.
#-# Default value is false.
# core.renderingcache.enabled=true

#-# [Since 2.4M1]
#-# A list of Java regex patterns matching full document references.
# core.renderingcache.documents=wiki:Space\.Page
# core.renderingcache.documents=wiki:Space\..*

#-# [Since 2.4M1]
#-# The time (in seconds) after which data should be removed from the cache when not used.
#-# Default value is 300 (5 min).
# core.renderingcache.duration=300

#-# [Since 2.4M1]
#-# The size of the rendering cache. Note that it's not the number of cached documents but the number of cached results.
#-# (For a single document several cache entries are created, because each action, language and request query string
#-# produces a unique rendering result)
#-# Default value is 100.
# core.renderingcache.size=100

You can force a page to refresh using refresh=1 in the URL.

It's also possible to programmatically refresh any document cache using the com.xpn.xwiki.internal.cache.rendering.RenderingCache component:

import javax.inject.Inject;
import org.xwiki.model.reference.DocumentReference;
import com.xpn.xwiki.internal.cache.rendering.RenderingCache;

@Inject
private RenderingCache renderingCache;

...

// Flush the whole cache, or only the entries of a specific document:
renderingCache.flushWholeCache();
renderingCache.flushCache(new DocumentReference("xwiki", "MySpace", "MyCachedDocument"));

Merge the CSS files

In order to reduce the number of requests and files downloaded by the browser, it can help to merge all XWiki CSS files into a single one. See the Merge CSS Script.

Set up NginX

If you experience heavy loads on your wiki, you could try using NginX.

NginX is used to serve static content (images, JavaScript, styles, etc.), but it can also be used as a reverse proxy to pass requests down to the servlet container (e.g. Tomcat on port 8080).

Unlike Apache's traditional model, which dedicates a worker process or thread to each connection, NginX serves all static content from a small fixed set of event-driven processes, and thus gives you extra performance "for free".

For more info on setting up NginX check this guide.
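To give an idea of the shape of such a setup, here is a minimal sketch (server name, paths and the Tomcat address are assumptions to adapt to your installation):

server {
    listen 80;
    server_name yourwiki.example.com;

    # Serve static resources directly from the exploded webapp
    location /xwiki/resources/ {
        root /var/lib/tomcat9/webapps;
        expires 1d;
    }

    # Pass everything else to the servlet container
    location /xwiki {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}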

Local resource access

See URL API.

Backlinks

While a pretty neat feature, keeping track of the backlinks has a medium impact on the document saving time and a minor impact on the document loading time. If you feel that your wiki does not need backlinks, you can safely disable them with the following line in xwiki.cfg:

xwiki.backlinks=0

Versioning

One of the key features of any wiki system, versioning greatly affects the database size and the document update time. If you are sure your wiki does not need to keep track of all the changes and you will never need to revert documents to a previous version, then you can add the following line in xwiki.cfg:

xwiki.store.versioning=0

Custom Mapping

In some cases you may not want to rely on XWiki's generic database schema for storing XClass data and instead you'd like to provide your own optimized table. For these use cases you can use Custom Mapping.

LDAP

Disable LDAP sub groups search

By default, when loading an LDAP group, each member is searched and loaded to figure out whether it is a group or not (and if it is, its sub-group members are loaded in turn, etc.). If you know there are no sub-groups in your LDAP groups, you can disable this lookup and speed up the handling of big groups quite a lot using the xwiki.authentication.ldap.group_sync_resolve_subgroups property in the xwiki.cfg configuration file.
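For example, in xwiki.cfg (assuming the value 0 disables the sub-group lookup):

xwiki.authentication.ldap.group_sync_resolve_subgroups=0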

Performance tree

Since 7.1 it's possible to directly get a tree of the time spent in each step of the request by using the debug mode.
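As with the Solr search above, the debug mode is triggered from the URL; for example (hypothetical host and page):

https://yourwiki.example.com/xwiki/bin/view/Main/WebHome?debug=true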

Navigation Tree

The Navigation Panel and other navigation trees can have some small performance issues under very high volumes. Here are some baseline measures taken on developer laptops, to give you an idea of the performance you should expect:

Measure set 1:

DB         | Levels | Spaces | Pages | Requests | Time
MySQL      | 5      | 6887   | 40493 | 11       | 80ms
HSQLDB     | 5      | 6762   | 40632 | 7        | 138ms
MySQL      | 12     | 5144   | 9622  | 7        | 331ms
HSQLDB     | 12     | 3774   | 7182  | 1        | 3.15s

Measure set 2:

DB     | Levels | Spaces | Pages | Requests | Time
Oracle | 5      | 6943   | 41062 | 0        | 119ms
Oracle | 12     | 4934   | 9822  | 0        | 153ms

Measure set 3:

DB         | Levels | Spaces | Pages | Requests | Time
Oracle     | 12     | 4854   | 9912  | 0        | 151ms
PostgreSQL | 12     | 4944   | 9912  | 0        | 125ms

Measure set 4 (XWiki 10.5, Intel i7 CPU, SSD Storage):

DB            | Levels | Spaces | Pages | Requests | Time
PostgreSQL 10 | 5      | 6847   | 40522 | 5        | 74ms

Legacy

Slow random number generation on UNIX

Since Java 8, the default random source used by the JVM on UNIX systems is generally /dev/urandom, which should always be fast; but in case your setup differs, this section can be interesting.

The library used for random number generation in Oracle's JVM relies on /dev/random by default for UNIX platforms.

Although /dev/random is more secure, it's possible to use /dev/urandom instead of the default JVM configuration.

To determine if your operating system exhibits this behavior, try displaying a portion of the file from a shell prompt:

head -n 1 /dev/random

If the command returns immediately, you can use /dev/random as the default generator for Oracle's JVM. If the command does not return immediately, use one of the following solutions to switch to /dev/urandom:

JVM setup

  1. Open the $JAVA_HOME/jre/lib/security/java.security file in a text editor.
  2. Change the line:
        securerandom.source=file:/dev/random
        to read:
        securerandom.source=file:/dev/urandom
  3. Save your change and exit the text editor.

Command line parameter

The same effect can be obtained using -Djava.security.egd=file:/dev/./urandom in the Java command line (usually in the application server configuration).
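For Tomcat this can sit next to the memory settings shown earlier, for example (a sketch, assuming the Debian packaging's CATALINA_OPTS variable):

CATALINA_OPTS="$CATALINA_OPTS -Djava.security.egd=file:/dev/./urandom"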

Monitor plugin

More of a developer-oriented feature, XWiki can monitor its own code, reporting the time spent in each sub-component activated during a request. While the monitoring code isn't time consuming, it increases memory consumption a bit, and the create/start/stop/log/destroy calls are spread all around the code, so you will save a lot of method calls by disabling it. You can do that by setting the following line in xwiki.cfg:

xwiki.monitor=0

1.0 rendering cache using velocity in document content itself

You can add the following to a document's content to cache it after it is rendered. The document is refreshed whenever its content changes, and the cache takes the URL into account, so it is pretty safe to set a long cache duration for all documents that don't contain scripts gathering data from the wiki. For example, to cache the rendered content for 60 seconds you would add:

$context.setCacheDuration(60)

Since 1.5M2, you can set the default rendering cache duration for all pages in xwiki.cfg:

## cache all rendered documents for one hour
xwiki.rendering.defaultCacheDuration=3600

Setting the default cache duration to a large value and manually disabling the cache in dynamic pages can really speed up the wiki, since rendering is one of the most time-consuming processes.

Wiki syntax features for XWiki Syntax 1.0

If you're using XWiki Syntax 1.0 and you don't plan to use all of the markup features (like the strikethrough filter, the automatic HTTP links filter, or the SVG, Laszlo and style macros), you can disable them in xwiki-core-*.jar/META-INF/services/com.xpn.xwiki.render.*. Wiki rendering is the most costly operation in the rendering process, so every disabled feature counts.

Note that this has no effect if you're using another syntax, like XWiki Syntax 2.x.
