This repo contains a Docker Compose setup for running the MediaWiki software.
Clone the repo, then create and start the containers:

- `cd docker-bugsigdb.org`
- copy a database dump to the `__initdb` directory
- copy images to the `bugsigdb.org/_data/mediawiki/images` directory
- copy `.env_example` to `.env` and modify as needed (see the Settings section)
- `docker-compose up`

Wait for the build and initialization process to complete, then access the wiki at http://localhost:8081 in a browser.
Running `sudo docker-compose up` will start the following containers:

- `db` - MySQL container, used as the database backend for MediaWiki.
- `web` - Apache/MediaWiki container with PHP 7.4 and MediaWiki 1.35.0.
- `redis` - Redis, an open-source key-value store that functions as a data structure server.
- `matomo` - Matomo analytics instance.
Settings are in the `docker-compose.yml` file, in the `environment` sections. The `_resources` directory also contains the favicon, logo and styles for the Chameleon skin. `CustomSettings.php` contains settings for MediaWiki core and extensions; if customization is required, change the settings there.
The `db` image was cloned from the official `mysql` container and has the same environment variables. It is preferred over the official image because it can automatically update the database when upgrading the version of MySQL. The only environment variable important for us is `MYSQL_ROOT_PASSWORD`; it specifies the password that will be set for the MySQL `root` superuser account. If you change it, make sure that `MW_DB_INSTALLDB_PASS` in the `web` section is changed too.
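For example, the two variables must stay in sync in `.env` (the values below are placeholders, not real credentials):

```ini
# .env - placeholder values
MYSQL_ROOT_PASSWORD=change_me
# must match the root password above so the installer can connect
MW_DB_INSTALLDB_PASS=change_me
```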
- `MW_SITE_SERVER` - configures `$wgServer`; set this to the server host, including the protocol, e.g. `http://my-wiki:8080`
- `MW_SITE_NAME` - configures `$wgSitename`
- `MW_SITE_LANG` - configures `$wgLanguageCode`
- `MW_DEFAULT_SKIN` - configures `$wgDefaultSkin`
- `MW_ENABLE_UPLOADS` - configures `$wgEnableUploads`
- `MW_USE_INSTANT_COMMONS` - configures `$wgUseInstantCommons`
- `MW_ADMIN_USER` - configures the default administrator username
- `MW_ADMIN_PASS` - configures the default administrator password
- `MW_DB_NAME` - specifies the database name that will be created automatically upon container startup
- `MW_DB_USER` - specifies the database user for access to the database specified in `MW_DB_NAME`
- `MW_DB_PASS` - specifies the database user password
- `MW_DB_INSTALLDB_USER` - specifies the database superuser name used to create the database and the user specified above
- `MW_DB_INSTALLDB_PASS` - specifies the database superuser password; should be the same as `MYSQL_ROOT_PASSWORD` in the `db` section
- `MW_PROXY_SERVERS` - (comma-separated values) configures `$wgSquidServers`; leave empty if no reverse proxy server is used
- `MW_MAIN_CACHE_TYPE` - configures `$wgMainCacheType`; `MW_MEMCACHED_SERVERS` should be provided for `CACHE_MEMCACHED`
- `MW_MEMCACHED_SERVERS` - (comma-separated values) configures `$wgMemCachedServers`
- `MW_AUTOUPDATE` - if `true` (the default), runs the needed maintenance scripts automatically before the web server starts
- `MW_SHOW_EXCEPTION_DETAILS` - if `true` (the default), sets `$wgShowExceptionDetails` to true
- `PHP_LOG_ERRORS` - specifies the `log_errors` parameter in the `php.ini` file
- `PHP_ERROR_REPORTING` - specifies the `error_reporting` parameter in the `php.ini` file; `E_ALL` by default, on production should be changed to `E_ALL & ~E_DEPRECATED & ~E_STRICT`
- `MATOMO_USER` - Matomo admin username
- `MATOMO_PASSWORD` - Matomo admin password
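As an illustration, the site-level variables might be combined in `.env` like this (all values are placeholders; check `.env_example` for the authoritative list):

```ini
MW_SITE_SERVER=http://localhost:8081
MW_SITE_NAME=BugSigDB
MW_SITE_LANG=en
MW_DEFAULT_SKIN=chameleon
MW_ENABLE_UPLOADS=1
MW_ADMIN_USER=admin
MW_ADMIN_PASS=change_me
MW_DB_NAME=mediawiki
MW_DB_USER=mediawiki
MW_DB_PASS=change_me
```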
LocalSettings.php is divided into three parts:
- `LocalSettings.php` - created automatically upon container startup; contains settings specific to the installed MediaWiki instance, such as the database connection, `$wgSecretKey`, etc. Should not be changed.
- `DockerSettings.php` - contains settings specific to the released containers, such as the database server name, paths to programs, installed extensions, etc. Should be changed only if you make changes to the containers.
- `CustomSettings.php` - contains user-defined settings such as user rights, extension settings, etc. For any required customizations, make changes there.
`CustomSettings.php` is placed in the `_resources` folder and will be copied into the container during build.
Data, such as uploaded images and the database files, is stored in the `_data` directory.
Docker containers write files to these directories using internal users; most likely you cannot change or remove these directories until you change the permissions.
Log files are stored in the `_logs` directory.
Make a full backup of the wiki, including both the database and the files. While the upgrade scripts are well-maintained and robust, things could still go awry.
```shell
cd docker-bugsigdb.org
docker-compose exec db /bin/bash -c 'mysqldump --all-databases -uroot -p"$MYSQL_ROOT_PASSWORD" 2>/dev/null | gzip | base64 -w 0' | base64 -d > backup_$(date +"%Y%m%d_%H%M%S").sql.gz
docker-compose exec web /bin/bash -c 'tar -c $MW_VOLUME $MW_HOME/images 2>/dev/null | base64 -w 0' | base64 -d > backup_$(date +"%Y%m%d_%H%M%S").tar
```
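A corrupted dump is usually discovered only when you need it, so it can help to sanity-check the archive right after creating it. A minimal sketch (the `verify_backup` helper is hypothetical, not part of this repo):

```shell
#!/bin/sh
# Check that a gzip-compressed SQL dump is intact.
# gzip -t decompresses to /dev/null and reports CRC errors.
verify_backup() {
    if gzip -t "$1" 2>/dev/null; then
        echo "OK"
    else
        echo "CORRUPT"
    fi
}

# Example: verify_backup backup_20240101_000000.sql.gz
```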
To pick up the latest changes, stop, rebuild and start the containers:

```shell
cd docker-bugsigdb.org
git pull
docker-compose build
docker-compose stop
docker-compose up
```
The upgrade process is fully automated and includes the launch of all necessary maintenance scripts.
By default, Matomo runs on port 8182 (to be shadowed by Nginx) and requires initial setup on the first run. Once installed, modify the `.env` file by adding the `MATOMO_USER` and `MATOMO_PASSWORD` variables, matching the user and password that were used during installation.
Make `import_logs_matomo.sh` run on cron daily, close to midnight, to keep Matomo fed with visit information.
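For example, a crontab entry along these lines (the script and log paths are placeholders for wherever the files live on your host):

```
# m  h  dom mon dow  command
55  23  *   *   *    /path/to/import_logs_matomo.sh >> /var/log/import_logs_matomo.log 2>&1
```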
```nginx
# matomo
location /matomo/ {
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-Uri /matomo;
    proxy_read_timeout 300;
    proxy_pass http://127.0.0.1:8182/;
}
```
Also, once the containers are started, verify the Matomo config matches the settings below (they are intended to be generated automatically, but it's better to check):

```ini
[General]
trusted_hosts[] = "127.0.0.1:8182"
assume_secure_protocol = 1
force_ssl = 0
proxy_uri_header = 1
```
The image is configured to automatically purge the homepage once per hour. You can configure this using the following environment variables:

```ini
MW_CACHE_PURGE_PAUSE=3600
MW_CACHE_PURGE_PAGE=Main_Page
```
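Conceptually, the purge loop boils down to hitting the MediaWiki purge API at the configured interval. A rough sketch of the idea (the endpoint URL and helper function are assumptions for illustration, not the actual script shipped in the image):

```shell
#!/bin/sh
# Defaults mirror the documented environment variables.
MW_CACHE_PURGE_PAUSE="${MW_CACHE_PURGE_PAUSE:-3600}"
MW_CACHE_PURGE_PAGE="${MW_CACHE_PURGE_PAGE:-Main_Page}"

# Build the MediaWiki API purge request URL for a page title.
build_purge_url() {
    echo "http://localhost/api.php?action=purge&titles=$1&format=json"
}

# The actual loop would look something like:
#   while true; do
#       curl -s -X POST "$(build_purge_url "$MW_CACHE_PURGE_PAGE")" > /dev/null
#       sleep "$MW_CACHE_PURGE_PAUSE"
#   done
```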
The repo contains a Python script that can walk the wiki Glossary term pages and update outdated EFO links, replacing them with current ones. Follow the steps below to set it up:

- Install `python` v3 and `pip`
- Run `pip install -r updateEFO.requirements.txt`
- Navigate to `Special:BotPasswords` on the wiki and create a new bot with mass edit permissions
- Run `python updateEFO.py --help` to ensure you have a correct Python version linked as the default binary; you should see a help text
- Run `python updateEFO.py -s www.site.com -uBOT_USERNAME -pBOT_PASSWORD --verbose --dry`
- The script should start working and printing some output; if everything looks good, terminate it with Ctrl+C
- Modify the `updateEFO.cron` file to use the correct credentials and paths to the script and the output log
- Copy the `updateEFO.cron` contents to your `crontab -e` file, or move it to `/etc/cron.weekly/` by executing the following command: `cp updateEFO.cron /etc/cron.weekly/updateEFO && chown root:root /etc/cron.weekly/updateEFO && chmod +x /etc/cron.weekly/updateEFO`

Note: the script may produce extra load on the wiki, so it's recommended to schedule it for night time. Also worth considering: it takes time to process all the pages, so an average script cycle is ~4-8 hours. You can change the sleep timeouts via the `-z` parameter.
To work around T333776, we run `maintenance/updateSpecialPages.php` once a day. This keeps the count of active users on `Special:CreateAccount` up to date.
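A daily cron entry for this could look like the following sketch (the compose file path is a placeholder; `-T` disables TTY allocation so `docker-compose exec` works from cron, and the script path assumes the container's working directory is the MediaWiki root):

```
0 4 * * * cd /path/to/docker-bugsigdb.org && docker-compose exec -T web php maintenance/updateSpecialPages.php
```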
- bugsigdb.org: A Comprehensive Database of Published Microbial Signatures
- BugSigDB issue tracker: Report bugs or feature requests for bugsigdb.org
- BugSigDBExports: Hourly data exports of bugsigdb.org
- Stable data releases: Periodic manually-reviewed stable data releases on Zenodo
- bugsigdbr: R/Bioconductor access to published microbial signatures from BugSigDB
- Curation issues: Report curation issues, request studies to be added
- bugSigSimple: Simple analyses of BugSigDB data in R
- BugSigDBStats: Statistics and trends of BugSigDB
- BugSigDBPaper: Reproduces analyses of the Nature Biotechnology publication
- community-bioc Slack Team: Join #bugsigdb channel