Performance

Version 63.1 by Thomas Mortagne on 2015/05/21

Here are some tips to increase XWiki's performance.

Clustering

If you need high availability or if the load on your XWiki instance is too high you can configure XWiki in a cluster to spread the load.

Standalone Solr

Solr engine management: configuration, indexing, listeners, script service, etc. This module does not handle the search queries.

Type: JAR
Developed by: XWiki Development Team
License: GNU Lesser General Public License 2.1
Bundled with: XWiki Standard
Installable with the Extension Manager

Description

Check out the Solr Core to understand what information is indexed.

Configuration

The following properties can be configured in the xwiki.properties file for the Solr API:

#-------------------------------------------------------------------------------------
# Solr Search
#-------------------------------------------------------------------------------------

#-# [Since 4.5M1]
#-# The Solr server type. Currently accepted values are "embedded" (default) and "remote".
# solr.type=embedded

#-# [Since 4.5M1]
#-# The location where the embedded Solr instance home folder is located.
#-# The default is the subfolder "store/solr" inside folder defined by the property "environment.permanentDirectory".
# solr.embedded.home=/var/local/xwiki/store/solr

#-# [Since 12.2]
#-# The URL of the Solr server (the root server and not the URL of a core).
#-# The default value assumes that the remote Solr server is started in a different process on the same machine, using the default port.
# solr.remote.baseURL=http://localhost:8983/solr

#-# [Since 5.1M1]
#-# Elements to index are not sent to the Solr server one by one but in batches, to improve performance.
#-# It's possible to configure this behavior with the following properties:
#-#
#-# The maximum number of elements sent at the same time to the Solr server
#-# The default is 50.
# solr.indexer.batch.size=50
#-# The maximum number of characters in the batch of elements to send to the Solr server.
#-# The default is 10000.
# solr.indexer.batch.maxLength=10000

#-# [Since 5.1M1]
#-# The maximum number of elements in the background queue of elements to index/delete
#-# The default is 100000.
# solr.indexer.queue.capacity=100000

#-# [Since 6.1M2]
#-# Indicates whether a synchronization between the SOLR index and the XWiki database should be run at startup.
#-# Synchronization can be started from search administration.
#-# The default is true.
# solr.synchronizeAtStartup=false

#-# [Since 12.5RC1]
#-# Indicates which wiki synchronization to perform when the "solr.synchronizeAtStartup" property is set to true.
#-# Two modes are available:
#-#   - WIKI: indicate that the synchronization is performed when each wiki is accessed for the first time.
#-#   - FARM: indicate that the synchronization is performed once for the full farm when XWiki is started.
#-# For large farms and in order to spread the machine's indexing load, the WIKI value is recommended, especially if
#-# some wikis are not used.
#-# The default is FARM.
# solr.synchronizeAtStartupMode=FARM

#-# [Since 17.2.0RC1]
#-# [Since 16.10.5]
#-# [Since 16.4.7]
#-# Indicates the batch size for the synchronization between SOLR index and XWiki database. This defines how many
#-# documents will be loaded from the database and Solr in each step. Higher values lead to fewer queries and thus
#-# better performance but increase the memory usage. The expected memory usage is around 1KB per document, but
#-# depends highly on the length of the document names.
#-# The default is 1000.
# solr.synchronizeBatchSize=1000
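Using the documented ~1KB-per-document estimate, the memory held by each synchronization step is roughly the batch size times that per-document cost:

```shell
# Rough memory held per synchronization step at the default settings:
# batch size (1000 documents) x ~1KB per document name
echo "$((1000 * 1)) KB"
```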

Set up a remote Solr server

Solr is not great at backward compatibility when it comes to the core schema, so it's safer to use the same Solr version that your version of XWiki embeds. Here is a compatibility matrix to help with the choice:

XWiki version     Solr version
11.4 to 11.5      7.7.x (XWiki embeds 7.7.1)
11.6 to 12.2      8.1.x (XWiki embeds 8.1.1)
12.3 to 13.0      8.5.x (XWiki embeds 8.5.1)
13.1 to 14.7      8.8.x (XWiki embeds 8.8.0)
14.8 to 16.1.0    8.11.x (XWiki embeds 8.11.2)
16.2.0+           9.4.x (XWiki embeds 9.4.1)

Download and install Solr. Warning: starting with XWiki 16.6.0 you will need to enable the analysis-extras module.
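On Solr 9 that module can be enabled in bin/solr.in.sh before starting the server (the path is an assumption and depends on how Solr was installed):

```shell
# bin/solr.in.sh — enable the analysis-extras module required by XWiki 16.6.0+
SOLR_MODULES=analysis-extras
```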

Debian based system

If your Solr instance is installed on a Debian/Ubuntu system take a look at InstallationViaAPT.

Manual install

The Solr REST API is unfortunately too limited, so you will need to create several cores on your Solr instance manually. For each one, download the zip file matching your version of XWiki and unzip its content into a new folder, located alongside the other Solr cores, with the following names:

XWiki <16.2.0

Solr8:

Indicate in xwiki.properties file that you want to use a remote Solr instance, and its URL:

solr.type=remote

solr.remote.baseURL=http://solrhost/solr

When using solr.remote.baseURL you can control the name of the search core (and the prefix of the other cores) with the solr.remote.corePrefix property (by default the main core is named "xwiki" and the others are prefixed with "xwiki_").
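A quick way to verify from the XWiki host that the remote Solr server is reachable and the cores exist (the hostname is the placeholder from the example above; expect your cores, e.g. the main "xwiki" core, in the response):

```shell
# List the cores known to the remote Solr server
curl -s 'http://solrhost:8983/solr/admin/cores?action=STATUS&wt=json'
```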

Data transfer upon moving the Solr of an existing instance to a remote Solr

TODO: add a note about how to move data for data cores (ratings & events) from the embedded Solr to the remote Solr

Backup remote Solr data

TODO: add a note about what and how to backup the data from the external Solr server.

Performance

By default XWiki ships with an embedded Solr. This is mostly for ease of use, but the embedded instance is not really recommended by the Solr team, so you might want to externalize it once your wiki starts to have a lot of pages. Solr uses a lot of memory, and a standalone Solr instance is generally faster than the embedded one. The difference is hardly noticeable on a small wiki, but if you start to see memory issues and slow search results you should probably install and set up an external Solr instance using the guide above.

The speed of the drive where the Solr index is located can also matter a lot, because Solr/Lucene is quite filesystem intensive. For example, putting the index on an SSD might give a noticeable boost.

You can also find more Solr-specific performance details on https://wiki.apache.org/solr/SolrPerformanceProblems. Standalone Solr also comes with a very nice UI, along with monitoring and test tools.

Size on disk

It depends on the size of each document, but an instance like the http://www.myxwiki.org farm (mostly standard documents spread over lots of wikis) uses 3.2GB of disk space to store around 180000 documents, i.e. approximately 18KB per document.
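The per-document figure can be checked with a bit of shell arithmetic (integer division, so it's an order-of-magnitude estimate):

```shell
# 3.2GB is about 3355443KB (3.2 x 1024 x 1024); divide by the document count
echo "$((3355443 / 180000)) KB per document"
```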

Prerequisites & Installation Instructions

We recommend using the Extension Manager to install this extension (Make sure that the text "Installable with the Extension Manager" is displayed at the top right location on this page to know if this extension can be installed with the Extension Manager).

You can also use the manual method which involves dropping the JAR file and all its dependencies into the WEB-INF/lib folder and restarting XWiki.

Dependencies

Dependencies for this extension (org.xwiki.platform:xwiki-platform-search-solr-api 17.2.0):

Slow random number generation on UNIX

The library used for random number generation in Sun's JVM relies on /dev/random by default on UNIX platforms.

Although /dev/random is more secure, it can block when the entropy pool runs low; it's possible to configure the JVM to use /dev/urandom instead of the default.

To determine if your operating system exhibits this behavior, try displaying a portion of the file from a shell prompt:

head -n 1 /dev/random

If the command returns immediately, you can use /dev/random as the default generator for SUN's JVM. If the command does not return immediately, use these steps to configure the JVM to use /dev/urandom:

  1. Open the $JAVA_HOME/jre/lib/security/java.security file in a text editor.
  2. Change the line:
        securerandom.source=file:/dev/random
        to read:
        securerandom.source=file:/dev/urandom
  3. Save your change and exit the text editor.
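The same effect can be obtained per process, without editing java.security, by passing a system property to the JVM (the /dev/./urandom spelling works around older JVMs that silently map /dev/urandom back to /dev/random):

```shell
# e.g. in Tomcat's setenv.sh or wherever CATALINA_OPTS is defined
CATALINA_OPTS="$CATALINA_OPTS -Djava.security.egd=file:/dev/./urandom"
```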

Gzip compression and caching of static pages

We're working on making these features part of the XWiki core (see XWIKI-2022). While waiting for this to be natively implemented, the recommended solution is to set up an Apache Web Server in front of your servlet container and install/configure the following modules:

Modify your Apache configuration file to load the different modules:

LoadModule expires_module /usr/lib/apache2/modules/mod_expires.so
LoadModule deflate_module /usr/lib/apache2/modules/mod_deflate.so
LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
# Depends: proxy
LoadModule proxy_ajp_module /usr/lib/apache2/modules/mod_proxy_ajp.so

Alternatively you can run the following commands as root (or with sudo):

a2enmod deflate
a2enmod proxy_ajp
a2enmod expires

and configure your different modules as described below:

Mod Deflate Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/deflate
<Location />
   # Insert filter
   SetOutputFilter DEFLATE

   # Netscape 4.x has some problems...
   BrowserMatch ^Mozilla/4 gzip-only-text/html

   # Netscape 4.06-4.08 have some more problems
   BrowserMatch ^Mozilla/4.0[678] no-gzip

   # MSIE masquerades as Netscape, but it is fine
   # BrowserMatch bMSIE !no-gzip !gzip-only-text/html

   # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
   # the above regex won't work. You can use the following
   # workaround to get the desired effect:
   BrowserMatch bMSI[E] !no-gzip !gzip-only-text/html

   # Don't compress images
   SetEnvIfNoCase Request_URI .(?:gif|jpe?g|png)$ no-gzip dont-vary

   # Make sure proxies don't deliver the wrong content
   #Header append Vary User-Agent env=!dont-vary
</Location>
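Once mod_deflate is enabled you can check from a shell that responses are actually compressed (the URL assumes a wiki running behind your Apache front end):

```shell
# Request gzip and look for the Content-Encoding header in the response
curl -sI -H 'Accept-Encoding: gzip' http://localhost/xwiki/bin/view/Main/WebHome \
  | grep -i '^content-encoding'
```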

On Debian the configuration file for mod_deflate is located at /etc/apache2/mods-enabled/deflate.conf.

Mod Expire Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/expires
<Location /xwiki/skins/>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

<Location /xwiki/bin/skin/>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

Mod Proxy AJP Configuration

ProxyRequests Off
   <Proxy *>
       Order deny,allow
       Allow from all
   </Proxy>
   ProxyPreserveHost On
   ProxyPass /xwiki ajp://192.168.1.181:8009/xwiki

where ajp://192.168.1.181:8009/xwiki is the internal address of your Servlet container where XWiki is running.

If you use Tomcat (e.g. Tomcat 7) you need to enable the AJP connector in /etc/tomcat7/server.xml. Comment out the HTTP connector by wrapping it in <!-- -->, ideally with a short comment explaining why:

    <!-- disable to use ajp connector instead
   <Connector port="8080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               URIEncoding="UTF-8"
               redirectPort="8443" />
   -->

Then uncomment the AJP connector by removing the <!-- --> around it and add URIEncoding="UTF-8" to it, again with a short comment:

<!-- Activate ajp connector for apache proxy_ajp -->
<Connector port="8009" protocol="AJP/1.3" redirectPort="8443" URIEncoding="UTF-8"/>

Memory

You need to configure your Servlet container so that XWiki has enough memory. You'll need to tune the values to your needs; check the logs for "out of memory" errors. Here are some good default values:

  • Small installs: A minimum of 512MB of heap memory and 196MB of permGen (-Xmx512m -XX:MaxPermSize=196m)
  • Medium installs: 1024MB for the heap and 196MB of permGen (-Xmx1024m -XX:MaxPermSize=196m)
  • Large installs: 2048MB (or beyond) for the heap and 196MB of permGen (-Xmx2048m -XX:MaxPermSize=196m).
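Note that on Java 8 and later the permanent generation no longer exists and -XX:MaxPermSize is ignored; the rough equivalent is the Metaspace. A hedged example for a medium install on a modern JVM:

```shell
# Java 8+: PermGen is gone; size the Metaspace instead
CATALINA_OPTS="-Xmx1024m -XX:MaxMetaspaceSize=256m"
```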
Information

You should not increase the memory beyond what you need: a larger heap means more objects in memory at any time, and the automatic JVM Garbage Collector has to work harder, which can result in performance degradation in XWiki (since a full GC will pause the application for a longer time).

Warning

Note that storing attachments with the default (in database) storage mechanism is very memory intensive. See the administrators guide to attachments for more information about memory cost and the alternative filesystem based attachment store.

Also note that uploading a lot of pages can trigger out of memory (OOM) errors due to scheduled watchlist jobs. For example, uploading 1 million pages will trigger OOM errors even when the JVM is configured with 2GB of heap space. For this kind of load we recommend disabling (unscheduling) the Watchlist jobs (in /xwiki/bin/view/Scheduler/) before uploading the pages.

For your information here are the values used for the xwiki.org site:

CATALINA_OPTS="-server -Xms800m -Xmx1480m -XX:MaxPermSize=222m -Dfile.encoding=utf-8 -Djava.awt.headless=true -XX:+UseParallelGC -XX:MaxGCPauseMillis=100"

Database Indexes

Make sure you've set Database indexes. This is especially important when you start having lots of documents.

Panels

Some panels take more resources than others. For example the Navigation panel should NOT be used for wikis with a lot of documents since it displays all documents in the wiki. In the future that panel should be improved for performance but that's not the case right now. Originally this panel was only meant as a starting point. A better approach is to use a "Quick Links Panel" as we've now set up in the default XWiki Enterprise wiki version 1.1 (we've removed the default usage of the Navigation Panel in that version).

Robots.txt

If your wiki is open on the Internet, it'll be crawled by search robots (like GoogleBot, etc). They will call all the URLs and especially the ones that are resource hungry like exports (PDF/RTF). You need to protect against this. To do so configure a robots.txt file like this one and put it in your webserver configuration:

User-agent: *
Disallow: /xwiki/bin/attach/
Disallow: /xwiki/bin/cancel/
Disallow: /xwiki/bin/commentadd/
Disallow: /xwiki/bin/delattachment/
Disallow: /xwiki/bin/delete/
Disallow: /xwiki/bin/dot/
Disallow: /xwiki/bin/download/
Disallow: /xwiki/bin/downloadrev/
Disallow: /xwiki/bin/edit/
Disallow: /xwiki/bin/export/
Disallow: /xwiki/bin/get/
Disallow: /xwiki/bin/inline/
Disallow: /xwiki/bin/lifeblog/
Disallow: /xwiki/bin/login/
Disallow: /xwiki/bin/loginerror/
Disallow: /xwiki/bin/logout/
Disallow: /xwiki/bin/objectremove/
Disallow: /xwiki/bin/pdf/
Disallow: /xwiki/bin/preview/
Disallow: /xwiki/bin/propadd/
Disallow: /xwiki/bin/propdelete/
Disallow: /xwiki/bin/propupdate/
Disallow: /xwiki/bin/register/
Disallow: /xwiki/bin/save/
Disallow: /xwiki/bin/skin/
Disallow: /xwiki/bin/status/
Disallow: /xwiki/bin/upload/
Disallow: /xwiki/bin/viewattachrev/
Disallow: /xwiki/bin/viewrev/
Disallow: /xwiki/bin/xmlrpc/
# It can also be useful to block certain spaces from crawling,
# especially if those spaces don't provide new content
Disallow: /xwiki/bin/view/Main/
Disallow: /xwiki/bin/view/XWiki/
# on the other hand you would like to have your recent (public) changes included
Allow: /xwiki/bin/view/Main/Dashboard

Note:

For Tomcat 6 the robots.txt file should be placed in the $TOMCAT/webapps/ROOT folder and should have permission 644 applied:

-rw-r--r--  1 root  www  1478 Jan  8 15:52 robots.txt
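The copy and permission steps can be sketched as follows (TOMCAT_ROOT is an assumption to adapt to your layout; the temp-dir fallback just makes the sketch runnable as-is):

```shell
# Place robots.txt in Tomcat's ROOT webapp and make it world-readable
TOMCAT_ROOT=${TOMCAT_ROOT:-$(mktemp -d)}
printf 'User-agent: *\nDisallow: /xwiki/bin/export/\n' > "$TOMCAT_ROOT/robots.txt"
chmod 644 "$TOMCAT_ROOT/robots.txt"
ls -l "$TOMCAT_ROOT/robots.txt"
```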

To test whether the robots.txt file is accessible and working as desired, use a robots.txt checker.

Statistics

Information

This is no longer true starting with XE 1.4M2: statistics are now put on a queue and written to the database in a separate thread in one go, greatly reducing the overhead.

The statistics module is off by default since it's quite database intensive. If you don't need it you should turn it off.

Monitoring

More of a developer-oriented feature, XWiki can monitor its own code, reporting the time spent in each sub-component activated during a request. While the monitoring code isn't time consuming, it increases memory consumption a bit, and the create/start/stop/log/destroy calls are spread all around the code, so disabling it saves a lot of method calls. You can do that by setting the following line in xwiki.cfg:

xwiki.monitor=0

Document Cache

You can tune the document cache in the xwiki.cfg configuration file. The best value depends on how much memory you have; the higher the better. A reasonable value is 1000.

xwiki.store.cache.capacity=1000

Cache Macro

It's possible to perform selective content caching by using the Cache Macro.

LESS Cache

LESS is a preprocessor used to generate CSS files for skins and skin extensions. See the Performance section of the LESS module documentation to learn more about how to optimize this cache for performance.

Rendering cache

Some pages are complex to render (they may aggregate outside data, or run complex and slow queries). For these pages you can use the rendering cache.

Configuration based

Pages can be cached (i.e. their rendered content cached) to speed up displaying. The configuration is done in xwiki.properties with the following configuration options:

#-# [Since 2.4M1]
#-# Indicate if the rendering cache is enabled.
#-# Default value is false.
# core.renderingcache.enabled=true

#-# [Since 2.4M1]
#-# A list of Java regex patterns matching full document references.
# core.renderingcache.documents=wiki:Space\.Page
# core.renderingcache.documents=wiki:Space\..*

#-# [Since 2.4M1]
#-# The time (in seconds) after which data should be removed from the cache when not used.
#-# Default value is 300 (5 min).
# core.renderingcache.duration=300

#-# [Since 2.4M1]
#-# The size of the rendering cache. Note that it's not the number of cached documents but the number of cached results.
#-# (For a single document several cache entries are created, because each action, language and request query string
#-# produces a unique rendering result)
#-# Default value is 100.
# core.renderingcache.size=100

You can force a page to refresh using refresh=1 in the URL.

Since 6.2 it's also possible to programmatically refresh any document cache using the com.xpn.xwiki.internal.cache.rendering.RenderingCache component:

import javax.inject.Inject;

import org.xwiki.model.reference.DocumentReference;

import com.xpn.xwiki.internal.cache.rendering.RenderingCache;

@Inject
private RenderingCache renderingCache;

...

// Flush the whole rendering cache
renderingCache.flushWholeCache();
// Flush the cached results of a single document
renderingCache.flushCache(new DocumentReference("xwiki", "MySpace", "MyCachedDocument"));

Enabled using velocity in document content itself (XWiki 1.0 syntax only)

You can add the following statement to a document's content to cache the document after it is rendered. The cache entry is invalidated whenever the content of the document changes, and the cache takes the URL into account, so it is pretty safe to set a long cache duration for all documents that don't contain scripts gathering data from the wiki. For example, to cache the rendered content for 60 seconds you would add:

$context.setCacheDuration(60)

Since 1.5M2, you can set the default rendering cache duration for all pages in xwiki.cfg:

## cache all rendered documents for one hour
xwiki.rendering.defaultCacheDuration=3600

Setting the default cache duration to a large value, and manually disabling the cache in dynamic pages would really speed up the wiki, since the rendering is one of the most time consuming processes.

Merge the CSS files

In order to reduce the number of requests and files downloaded by the browser, it can help to merge all XWiki CSS files into a single one. See the Merge CSS Script.

Set up NginX

If you experience heavy loads on your wiki, you could try using NginX.

NginX is used to serve static content: images, JavaScript, styles, etc., but it can also be used as a reverse proxy to pass requests down to the servlet container (e.g. Tomcat on port 8080).

Unlike Apache, which dedicates a process or thread to each connection, NginX serves all static content from a small, fixed set of worker processes, and thus gives you extra performance "for free".

For more info on setting up NginX check this guide.
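A minimal reverse-proxy sketch (the listen port, backend address and static-resources path are assumptions to adapt to your setup):

```nginx
server {
    listen 80;

    # Serve the skin's static resources directly from disk, with caching
    location /xwiki/resources/ {
        root /var/lib/tomcat/webapps;
        expires 1d;
    }

    # Everything else goes to the servlet container
    location /xwiki {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```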

Backlinks

While a pretty neat feature, keeping track of the backlinks has a medium impact on the document saving time and a minor impact on the document loading time. If you feel that your wiki does not need backlinks, you can safely disable them with the following line in xwiki.cfg:

xwiki.backlinks=0

Versioning

One of the key features of any wiki system, versioning greatly affects the database size and the document update time. If you are sure your wiki does not need to keep track of all the changes and you will never need to revert documents to a previous version, then you can add the following line in xwiki.cfg:

xwiki.store.versioning=0

Custom Mapping

In some cases you may not want to rely on XWiki's generic database schema for storing XClass data and instead you'd like to provide your own optimized table. For these use cases you can use Custom Mapping.

Wiki syntax features for XWiki Syntax 1.0

If you're using XWiki Syntax 1.0 and if you don't plan to use all of the markup features, like the strikethrough filter, the automatic http links filter, the SVG, Laszlo or style macros, you can disable them in xwiki-core-*.jar/META-INF/services/com.xpn.xwiki.render.*. The wiki rendering is the most costly operation in the rendering process, so any disabled feature counts.

Note that this has no effect if you're using another syntax, such as XWiki Syntax 2.x.

Performance tree

Since 7.1 it's possible to directly get a tree of the time spent in each step of a request by using the debug mode.
