- name: BDD tests
run: |
- python3 -m behave -DREMOVE_TEMPLATE=1 -DBUILDDIR=$GITHUB_WORKSPACE/build --format=progress3
+ export PATH=$GITHUB_WORKSPACE/build/osm2pgsql:$PATH
+ python3 -m behave -DREMOVE_TEMPLATE=1 --format=progress3
working-directory: Nominatim/test/bdd
- name: Install mypy and typechecking info
- name: BDD tests (legacy tokenizer)
run: |
- python3 -m behave -DREMOVE_TEMPLATE=1 -DBUILDDIR=$GITHUB_WORKSPACE/build -DAPI_ENGINE=php -DTOKENIZER=legacy --format=progress3
+ export PATH=$GITHUB_WORKSPACE/build/osm2pgsql:$PATH
+ python3 -m behave -DREMOVE_TEMPLATE=1 -DSERVER_MODULE_PATH=$GITHUB_WORKSPACE/build/module -DAPI_ENGINE=php -DTOKENIZER=legacy --format=progress3
working-directory: Nominatim/test/bdd
- name: BDD tests (php)
run: |
- python3 -m behave -DREMOVE_TEMPLATE=1 -DBUILDDIR=$GITHUB_WORKSPACE/build -DAPI_ENGINE=php --format=progress3
+ export PATH=$GITHUB_WORKSPACE/build/osm2pgsql:$PATH
+ python3 -m behave -DREMOVE_TEMPLATE=1 -DAPI_ENGINE=php --format=progress3
working-directory: Nominatim/test/bdd
* [mkdocstrings](https://mkdocstrings.github.io/) >= 0.25
* [mkdocs-material](https://squidfunk.github.io/mkdocs-material/)
+Please be aware that the tests always run against the globally installed
+osm2pgsql, so you need to have it installed and available on your PATH. If
+you want to test against the vendored version of osm2pgsql, you need to set
+the PATH accordingly, as shown below.
+
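+As a purely illustrative example (the build location below is an assumption,
+adjust it to your own checkout), a locally built osm2pgsql can be put first
+on the PATH like this:
+
+    # hypothetical build directory; use the path of your own build
+    export PATH=$HOME/nominatim/build/osm2pgsql:$PATH
+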
### Installing prerequisites on Ubuntu/Debian
The Python tools should always be run with the most recent version.
The tests can be configured with a set of environment variables (`behave -D key=val`; see the example invocation after the list):
- * `BUILDDIR` - build directory of Nominatim installation to test
* `TEMPLATE_DB` - name of template database used as a skeleton for
the test databases (db tests)
* `TEST_DB` - name of test database (db tests)
* `DB_USER` - (optional) username of database login
* `DB_PASS` - (optional) password for database login
* `SERVER_MODULE_PATH` - (optional) path on the Postgres server to Nominatim
- module shared library file
+ module shared library file (only needed for legacy tokenizer)
* `REMOVE_TEMPLATE` - if true, the template and API database will not be reused
during the next run. Reusing the base templates speeds
up tests considerably but might lead to errors caused by outdated templates
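
For example, running only the DB creation tests with a fresh template and a
custom test database name (the name here is just an illustration) could look
like this:

    # discard any cached template and use a differently named test database
    behave db -DREMOVE_TEMPLATE=1 -DTEST_DB=test_nominatim_db
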
API tests should only be testing the functionality of the website PHP code.
Most tests should be formulated as BDD DB creation tests (see below) instead.
-#### Code Coverage (PHP engine only)
-
-The API tests also support code coverage tests. You need to install
-[PHP_CodeCoverage](https://github.com/sebastianbergmann/php-code-coverage).
-On Debian/Ubuntu run:
-
- apt-get install php-codecoverage php-xdebug
-
-Then run the API tests as follows:
-
- behave api -DPHPCOV=<coverage output dir>
-
-The output directory must be an absolute path. To generate reports, you can use
-the [phpcov](https://github.com/sebastianbergmann/phpcov) tool:
-
- phpcov merge --html=<report output dir> <coverage output dir>
-
### DB Creation Tests (`test/bdd/db`)
These tests check the import and update of the Nominatim database. They do not
expected_count = sum(t.count for t in hnrs)
partials = {t.token: t.addr_count for trange in address
- for t in self.query.get_partials_list(trange)}
+ for t in self.query.get_partials_list(trange)
+ if t.is_indexed}
+
+ if not partials:
+ # can happen when none of the partials is indexed
+ return
if expected_count < 8000:
sdata.lookups.append(dbf.FieldLookup('nameaddress_vector',
lookup_word = row.word_token[1:]
elif rowclass == 'place' and row.type == 'postcode':
ttype = qmod.TokenType.POSTCODE
- lookup_word = row.word_token[1:]
+ lookup_word = row.word
else:
ttype = qmod.TokenType.NEAR_ITEM if row.operator in ('in', 'near')\
else qmod.TokenType.QUALIFIER
def _dump_word_tokens(query: qmod.QueryStruct) -> Iterator[List[Any]]:
- yield ['type', 'token', 'word_token', 'lookup_word', 'penalty', 'count', 'info']
+ yield ['type', 'token', 'word_token', 'lookup_word', 'penalty', 'count', 'info', 'indexed']
for node in query.nodes:
for tlist in node.starting:
for token in tlist.tokens:
t = cast(LegacyToken, token)
yield [tlist.ttype.name, t.token, t.word_token or '',
- t.lookup_word or '', t.penalty, t.count, t.info]
+ t.lookup_word or '', t.penalty, t.count, t.info,
+ 'Y' if t.is_indexed else 'N']
async def create_query_analyzer(conn: SearchConnection) -> AbstractQueryAnalyzer:
Feature: Searches with postcodes
Various searches involving postcodes
+ @v1-api-php-only
Scenario: US 5+4 ZIP codes are shortened to 5 ZIP codes if not found
When sending json search query "36067 1111, us" with address
Then result addresses contain
| postcode |
| 36067 |
+ And results contain
+ | type |
+ | postcode |
Scenario: Postcode search with address
When sending json search query "9486, mauren"
| AD675 |
+ @fail-legacy
Scenario: Different postcodes with the same normalization can both be found
Given the places
| osm | class | type | addr+postcode | addr+housenumber | geometry |
--- /dev/null
+@DB
+Feature: Reverse searches
+ Test results of reverse queries
+
+ @v1-api-python-only
+ Scenario: POI in POI area
+ Given the 0.0001 grid with origin 1,1
+ | 1 | | | | | | | | 2 |
+ | | 9 | | | | | | | |
+ | 4 | | | | | | | | 3 |
+ And the places
+ | osm | class | type | geometry |
+ | W1 | aeroway | terminal | (1,2,3,4,1) |
+ | N1 | amenity | restaurant | 9 |
+ When importing
+ And sending v1/reverse at 1.0001,1.0001
+ Then results contain
+ | osm |
+ | N1 |
+ When sending v1/reverse at 1.0003,1.0001
+ Then results contain
+ | osm |
+ | W1 |
| W1 |
+ @fail-legacy
Scenario Outline: Special cased american states will be found
Given the grid
| 1 | | 2 |
| R1 | boundary | administrative | rel | 8 | (10,11,12,13,10) |
And the places
| osm | class | type | name+name:de |
- | N3 | place | city | pnt |
+ | N3 | place | city | greeny |
And the relations
| id | members |
| 1 | N3:label |
When importing
Then placex contains
| object | linked_place_id | name+_place_name:de |
- | R1 | - | pnt |
+ | R1 | - | greeny |
And placex contains
| object | linked_place_id | name+name:de |
- | N3 | R1 | pnt |
+ | N3 | R1 | greeny |
When updating places
| osm | class | type | name+name:de |
| N3 | place | city | newname |
| R1 | boundary | administrative | rel | 8 | (10,11,12,13,10) |
And the places
| osm | class | type | name |
- | N3 | place | city | pnt |
+ | N3 | place | city | greeny |
And the relations
| id | members |
| 1 | N3:label |
When importing
Then placex contains
| object | linked_place_id | name+_place_name | name+name |
- | R1 | - | pnt | rel |
+ | R1 | - | greeny | rel |
And placex contains
| object | linked_place_id | name+name |
- | N3 | R1 | pnt |
- When sending search query "pnt"
+ | N3 | R1 | greeny |
+ When sending search query "greeny"
Then results contain
| osm |
| R1 |
And placex contains
| object | linked_place_id | name+_place_name:de | name+name |
| R1 | - | depnt | rel |
- When sending search query "pnt"
+ When sending search query "greeny"
Then exactly 0 results are returned
Scenario: Updating linkee extratags keeps linker's extratags
# Copyright (C) 2024 by the Nominatim developer community.
# For a full list of authors see the git log.
from pathlib import Path
+import sys
from behave import *
+sys.path.insert(1, str(Path(__file__, '..', '..', '..', 'src').resolve()))
+
from steps.geometry_factory import GeometryFactory
from steps.nominatim_environment import NominatimEnvironment
-TEST_BASE_DIR = Path(__file__) / '..' / '..'
+TEST_BASE_DIR = Path(__file__, '..', '..').resolve()
userconfig = {
- 'BUILDDIR' : (TEST_BASE_DIR / '..' / 'build').resolve(),
'REMOVE_TEMPLATE' : False,
'KEEP_TEST_DB' : False,
'DB_HOST' : None,
'TEMPLATE_DB' : 'test_template_nominatim',
'TEST_DB' : 'test_nominatim',
'API_TEST_DB' : 'test_api_nominatim',
- 'API_TEST_FILE' : (TEST_BASE_DIR / 'testdb' / 'apidb-test-data.pbf').resolve(),
+ 'API_TEST_FILE' : TEST_BASE_DIR / 'testdb' / 'apidb-test-data.pbf',
'SERVER_MODULE_PATH' : None,
'TOKENIZER' : None, # Test with a custom tokenizer
'STYLE' : 'extratags',
- 'API_ENGINE': 'falcon',
- 'PHPCOV' : False, # set to output directory to enable code coverage
+ 'API_ENGINE': 'falcon'
}
use_step_matcher("re")
+++ /dev/null
-<?php
-/**
- * SPDX-License-Identifier: GPL-2.0-only
- *
- * This file is part of Nominatim. (https://nominatim.org)
- *
- * Copyright (C) 2022 by the Nominatim developer community.
- * For a full list of authors see the git log.
- */
-require_once 'SebastianBergmann/CodeCoverage/autoload.php';
-
-
-function coverage_shutdown($oCoverage)
-{
- $oCoverage->stop();
- $writer = new \SebastianBergmann\CodeCoverage\Report\PHP;
- $writer->process($oCoverage, $_SERVER['PHP_CODE_COVERAGE_FILE']);
-}
-
-$covfilter = new SebastianBergmann\CodeCoverage\Filter();
-if (method_exists($covfilter, 'addDirectoryToWhitelist')) {
- // pre PHPUnit 9
- $covfilter->addDirectoryToWhitelist($_SERVER['COV_PHP_DIR'].'/lib-php');
- $covfilter->addDirectoryToWhitelist($_SERVER['COV_PHP_DIR'].'/website');
- $coverage = new SebastianBergmann\CodeCoverage\CodeCoverage(null, $covfilter);
-} else {
- // since PHP Uit 9
- $covfilter->includeDirectory($_SERVER['COV_PHP_DIR'].'/lib-php');
- $covfilter->includeDirectory($_SERVER['COV_PHP_DIR'].'/website');
- $coverage = new SebastianBergmann\CodeCoverage\CodeCoverage(
- (new SebastianBergmann\CodeCoverage\Driver\Selector)->forLineCoverage($covfilter),
- $covfilter
- );
-}
-
-$coverage->start($_SERVER['COV_TEST_NAME']);
-
-register_shutdown_function('coverage_shutdown', $coverage);
-
-include $_SERVER['COV_SCRIPT_FILENAME'];
# For a full list of authors see the git log.
from pathlib import Path
import importlib
-import sys
import tempfile
import psycopg
from psycopg import sql as pysql
-sys.path.insert(1, str((Path(__file__) / '..' / '..' / '..' / '..'/ 'src').resolve()))
-
from nominatim_db import cli
from nominatim_db.config import Configuration
from nominatim_db.db.connection import Connection, register_hstore, execute_scalar
"""
def __init__(self, config):
- self.build_dir = Path(config['BUILDDIR']).resolve()
self.src_dir = (Path(__file__) / '..' / '..' / '..' / '..').resolve()
self.db_host = config['DB_HOST']
self.db_port = config['DB_PORT']
self.server_module_path = config['SERVER_MODULE_PATH']
self.reuse_template = not config['REMOVE_TEMPLATE']
self.keep_scenario_db = config['KEEP_TEST_DB']
- self.code_coverage_path = config['PHPCOV']
- self.code_coverage_id = 1
self.default_config = Configuration(None).get_os_env()
self.test_env = None
raise RuntimeError(f"Unknown API engine '{config['API_ENGINE']}'")
self.api_engine = getattr(self, f"create_api_request_func_{config['API_ENGINE']}")()
+ if self.tokenizer == 'legacy' and self.server_module_path is None:
+ raise RuntimeError("You must set -DSERVER_MODULE_PATH when testing the legacy tokenizer.")
+
def connect_database(self, dbname):
""" Return a connection to the database with the given name.
Uses configured host, user and port.
dbargs['password'] = self.db_pass
return psycopg.connect(**dbargs)
- def next_code_coverage_file(self):
- """ Generate the next name for a coverage file.
- """
- fn = Path(self.code_coverage_path) / "{:06d}.cov".format(self.code_coverage_id)
- self.code_coverage_id += 1
-
- return fn.resolve()
def write_nominatim_config(self, dbname):
""" Set up a custom test configuration that connects to the given
self.test_env['NOMINATIM_DATADIR'] = str((self.src_dir / 'data').resolve())
self.test_env['NOMINATIM_SQLDIR'] = str((self.src_dir / 'lib-sql').resolve())
self.test_env['NOMINATIM_CONFIGDIR'] = str((self.src_dir / 'settings').resolve())
- self.test_env['NOMINATIM_DATABASE_MODULE_SRC_PATH'] = str((self.build_dir / 'module').resolve())
- self.test_env['NOMINATIM_OSM2PGSQL_BINARY'] = str((self.build_dir / 'osm2pgsql' / 'osm2pgsql').resolve())
if self.tokenizer is not None:
self.test_env['NOMINATIM_TOKENIZER'] = self.tokenizer
if self.import_style is not None:
if self.server_module_path:
self.test_env['NOMINATIM_DATABASE_MODULE_PATH'] = self.server_module_path
- else:
- # avoid module being copied into the temporary environment
- self.test_env['NOMINATIM_DATABASE_MODULE_PATH'] = str((self.build_dir / 'module').resolve())
if self.website_dir is not None:
self.website_dir.cleanup()
def get_test_config(self):
cfg = Configuration(Path(self.website_dir.name), environ=self.test_env)
- cfg.set_libdirs(module=self.build_dir / 'module',
- osm2pgsql=self.build_dir / 'osm2pgsql' / 'osm2pgsql')
+ cfg.set_libdirs(module=self.server_module_path)
return cfg
def get_libpq_dsn(self):
if self.website_dir is not None:
cmdline = list(cmdline) + ['--project-dir', self.website_dir.name]
- cli.nominatim(module_dir='',
- osm2pgsql_path=str(self.build_dir / 'osm2pgsql' / 'osm2pgsql'),
+ cli.nominatim(module_dir=self.server_module_path,
+ osm2pgsql_path=None,
cli_args=cmdline,
environ=self.test_env)
for k, v in context.http_headers.items():
env['HTTP_' + k.upper().replace('-', '_')] = v
- cmd = ['/usr/bin/env', 'php-cgi', '-f']
- if context.nominatim.code_coverage_path:
- env['XDEBUG_MODE'] = 'coverage'
- env['COV_SCRIPT_FILENAME'] = env['SCRIPT_FILENAME']
- env['COV_PHP_DIR'] = context.nominatim.src_dir
- env['COV_TEST_NAME'] = f"{context.scenario.filename}:{context.scenario.line}"
- env['SCRIPT_FILENAME'] = \
- os.path.join(os.path.split(__file__)[0], 'cgi-with-coverage.php')
- cmd.append(env['SCRIPT_FILENAME'])
- env['PHP_CODE_COVERAGE_FILE'] = context.nominatim.next_code_coverage_file()
- else:
- cmd.append(env['SCRIPT_FILENAME'])
+ cmd = ['/usr/bin/env', 'php-cgi', '-f', env['SCRIPT_FILENAME']]
for k,v in params.items():
cmd.append(f"{k}={v}")
def get_osm2pgsql_options(nominatim_env, fname, append):
return dict(import_file=fname,
- osm2pgsql=str(nominatim_env.build_dir / 'osm2pgsql' / 'osm2pgsql'),
+ osm2pgsql='osm2pgsql',
osm2pgsql_cache=50,
osm2pgsql_style=str(nominatim_env.get_test_config().get_import_style_file()),
osm2pgsql_style_path=nominatim_env.get_test_config().config_dir,