To execute the test suite run

    cd test/php
    UNIT_TEST_DSN='pgsql:dbname=nominatim_unit_tests' phpunit ../

It will read phpunit.xml which points to the library, test path, bootstrap
script and sets other parameters.

It will use (and destroy) a local database 'nominatim_unit_tests'. You can set
a different connection string with e.g. UNIT_TEST_DSN='pgsql:dbname=foo_unit_tests'.
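The DSN is a standard PDO connection string. A minimal sketch, assuming a
hypothetical scratch database named `foo_unit_tests` (the commands are echoed
rather than executed, so the sketch itself is side-effect free):

```shell
# Hypothetical scratch database; the suite will modify and destroy its
# contents, so never point the DSN at a database you care about.
TESTDB=foo_unit_tests
UNIT_TEST_DSN="pgsql:dbname=$TESTDB"

# Run the printed commands from test/php. You may need to create the
# database first if it does not exist yet.
echo "createdb $TESTDB"
echo "UNIT_TEST_DSN='$UNIT_TEST_DSN' phpunit ../"
```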
BDD Functional Tests
====================
To run the tests:

    cd test/bdd
    behave
The tests can be configured with a set of environment variables (`behave -D key=val`):
 * `BUILDDIR` - build directory of Nominatim installation to test
* `TEMPLATE_DB` - name of template database used as a skeleton for
the test databases (db tests)
* `TEST_DB` - name of test database (db tests)
 * `API_TEST_DB` - name of the database containing the API test data (api tests)
 * `DB_HOST` - (optional) hostname of database host
 * `DB_PORT` - (optional) port of database on host
 * `DB_USER` - (optional) username of database login
 * `DB_PASS` - (optional) password for database login
 * `SERVER_MODULE_PATH` - (optional) path on the Postgres server to Nominatim
   module shared library file
* `TEST_SETTINGS_TEMPLATE` - file to write temporary Nominatim settings to
 * `REMOVE_TEMPLATE` - if true, the template database will not be reused during
   the next run. Reusing the base templates speeds up tests.
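The settings above are passed to behave as user data, one `-Dkey=val` option
per setting. A sketch combining several of them (paths and names here are
placeholders, not defaults; the command line is echoed rather than executed):

```shell
# Placeholder values -- substitute your own build tree and database name.
BUILDDIR=$HOME/Nominatim/build
TEST_DB=test_nominatim

# Each variable becomes one -D option on the behave command line:
CMD="behave db -DBUILDDIR=$BUILDDIR -DTEST_DB=$TEST_DB -DREMOVE_TEMPLATE=1"
echo "$CMD"
```

Setting `REMOVE_TEMPLATE=1` forces the template database to be rebuilt from
scratch on the next run instead of being reused.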
These tests are meant to test the different API endpoints and their parameters.
They require a preimported test database, which consists of the import of a
planet extract. A precompiled PBF with the necessary data can be downloaded from
https://www.nominatim.org/data/test/nominatim-api-testdata.pbf

You need at least 2GB of RAM and 10GB of disk space.
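A sketch of the download-and-import step. The `setup.php` invocation is an
assumption based on contemporary Nominatim installs, not something this
document specifies -- check your version's import documentation. The commands
are echoed so the sketch stays side-effect free:

```shell
# Run the printed commands from your Nominatim source directory.
PBF=nominatim-api-testdata.pbf
echo "wget https://www.nominatim.org/data/test/$PBF"
# --all is assumed to run the full import pipeline; verify locally:
echo "./utils/setup.php --osm-file $PBF --all"
```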
The polygons defining the extract can be found in the test/testdb
directory. There is also a reduced set of wikipedia data for this extract,
which you need to import as well. For Tiger tests the data of South Dakota
is required. Get the Tiger files `46*`.
    cd Nominatim/data
    wget https://nominatim.org/data/tiger2018-nominatim-preprocessed.tar.gz
    tar xvf tiger2018-nominatim-preprocessed.tar.gz --wildcards --no-anchored '46*'
    rm tiger2018-nominatim-preprocessed.tar.gz

The official test dataset is derived from the 180924 planet. Newer
planets are likely to work as well but you may see isolated test
failures where the data has changed. To recreate the input data
for the test database run:
    wget https://ftp5.gwdg.de/pub/misc/openstreetmap/planet.openstreetmap.org/pbf/planet-180924.osm.pbf
    osmconvert planet-180924.osm.pbf -B=test/testdb/testdb.polys -o=testdb.pbf
Before importing, make sure to add the following to your local settings:
#### Code Coverage
The API tests also support code coverage tests. You need to install
[PHP_CodeCoverage](https://github.com/sebastianbergmann/php-code-coverage).
On Debian/Ubuntu run:

    apt-get install php-codecoverage php-xdebug
Then run the API tests as follows:

    behave api -DPHPCOV=<coverage output dir>

The output directory must be an absolute path. To generate reports, you can use
the [phpcov](https://github.com/sebastianbergmann/phpcov) tool:
    phpcov merge --html=<report output dir> <coverage output dir>
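Putting the two coverage steps together with made-up absolute paths (the
directories are illustrative; the commands are echoed, not executed):

```shell
# Hypothetical absolute paths for raw coverage data and the HTML report.
COV_DIR=/tmp/nominatim-phpcov
REPORT_DIR=/tmp/nominatim-coverage-report

# Collect per-request coverage, then merge it into one HTML report:
echo "behave api -DPHPCOV=$COV_DIR"
echo "phpcov merge --html=$REPORT_DIR $COV_DIR"
```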