runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v4
with:
submodules: true
```bash
update
- ├── europe
- │ ├── andorra
- │ │ └── sequence.state
- │ └── monaco
- │ └── sequence.state
- └── tmp
- └── europe
- ├── andorra-latest.osm.pbf
- └── monaco-latest.osm.pbf
-
+ ├── europe
+ │ ├── andorra
+ │ │ └── sequence.state
+ │ └── monaco
+ │ └── sequence.state
+ └── tmp
+ └── europe
+ ├── andorra-latest.osm.pbf
+ └── monaco-latest.osm.pbf
```
This will fetch diffs from the replication server, import them and index
the database. The default replication server in the
-script([Geofabrik](https://download.geofabrik.de)) provides daily updates.
+script ([Geofabrik](https://download.geofabrik.de)) provides daily updates.
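Each per-country `sequence.state` file records the replication progress. Purely for illustration (assuming the usual osmosis-style `key=value` layout with backslash-escaped colons, which is an assumption about the file format), a minimal parser could look like:

```python
def read_state(text: str) -> dict:
    """Parse an osmosis-style state file: '#' comments, key=value lines."""
    state = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        # Unescape the '\:' sequences used in timestamps.
        state[key] = value.replace('\\:', ':')
    return state

print(read_state("sequenceNumber=3921\ntimestamp=2023-07-01T20\\:21\\:42Z"))
```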
## Using an external PostgreSQL database
This section gives a quick overview of how to configure Apache and Nginx to
serve Nominatim. It is not meant as a full system administration guide on how
to run a web service. Please refer to the documentation of
-[Apache](http://httpd.apache.org/docs/current/) and
+[Apache](https://httpd.apache.org/docs/current/) and
[Nginx](https://nginx.org/en/docs/)
for background information on configuring the services.
### Installing the required packages
The recommended way to deploy a Python ASGI application is to run
-the ASGI runner (uvicorn)[https://uvicorn.org/]
-together with (gunicorn)[https://gunicorn.org/] HTTP server. We use
+the ASGI runner [uvicorn](https://uvicorn.org/)
+together with the [gunicorn](https://gunicorn.org/) HTTP server. We use
Falcon here as the web framework.
Create a virtual environment for the Python packages and install the necessary
packages:
``` sh
sudo apt install virtualenv
virtualenv /srv/nominatim-venv
-/srv/nominatim-venv/bin/pip install SQLAlchemy PyICU psycopg[binary]\
+/srv/nominatim-venv/bin/pip install SQLAlchemy PyICU psycopg[binary] \
psycopg2-binary python-dotenv PyYAML falcon uvicorn gunicorn
```
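ASGI itself is only a calling convention: an async callable that receives a connection scope and two message channels. As an illustration (a framework-free stand-in, not the Nominatim frontend code), a minimal app that uvicorn could serve looks like:

```python
# A bare ASGI application: an async callable taking scope, receive, send.
# Assuming it lives in myapp.py, serve it with:
#   /srv/nominatim-venv/bin/uvicorn myapp:app
async def app(scope, receive, send):
    assert scope['type'] == 'http'
    # Send the response head, then the body, as separate ASGI events.
    await send({'type': 'http.response.start',
                'status': 200,
                'headers': [(b'content-type', b'text/plain')]})
    await send({'type': 'http.response.body', 'body': b'OK'})
```

In production, gunicorn imports the same callable through its uvicorn worker class and manages the worker processes around it.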
* [datrie](https://github.com/pytries/datrie)
When running the PHP frontend:
+
* [PHP](https://php.net) (7.3+)
* PHP-pgsql
* PHP-intl (bundled with PHP)
version update or create the index manually **before** starting the update
using the following SQL:
-```
+```sql
CREATE INDEX IF NOT EXISTS idx_placex_geometry_reverse_lookupPlaceNode
ON placex USING gist (ST_Buffer(geometry, reverse_place_diameter(rank_search)))
WHERE rank_address between 4 and 25 AND type != 'postcode'
```
## Installation
To use the Nominatim library, you need access to a local Nominatim database.
-Follow the [installation and import instructions](../admin/) to set up your
-database.
+Follow the [installation](../admin/Installation.md) and
+[import](../admin/Import.md) instructions to set up your database.
It is not yet possible to install it in the usual way via pip or inside a
virtualenv. To get access to the library you need to set an appropriate
-PYTHONPATH. With the default installation, the python library can be found
+`PYTHONPATH`. With the default installation, the python library can be found
under `/usr/local/share/nominatim/lib-python`. If you have installed
Nominatim under a different prefix, adapt the `/usr/local/` part accordingly.
-You can also point the PYTHONPATH to the Nominatim source code.
+You can also point the `PYTHONPATH` to the Nominatim source code.
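Instead of exporting `PYTHONPATH` in the shell, the path can also be added at the top of a script; a small sketch assuming the default install prefix:

```python
import sys

# Assumed default install location; adapt the prefix if Nominatim
# was installed elsewhere.
NOMINATIM_LIB = '/usr/local/share/nominatim/lib-python'

if NOMINATIM_LIB not in sys.path:
    sys.path.insert(0, NOMINATIM_LIB)
# From here on, `import nominatim.api` resolves against the library.
```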
### A simple search example
is done by creating a Nominatim API object. This object exposes all the
search functions of Nominatim that are also known from its web API.
-This code snippet implements a simple search for the town if 'Brugge':
+This code snippet implements a simple search for the town of 'Brugge':
!!! example
=== "NominatimAPIAsync"
not possible, it tries English and eventually falls back to the default `name`
or `ref`.
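The fallback order can be sketched as follows (a simplified stand-in for illustration, not the library implementation):

```python
def best_name(names: dict, languages: list) -> str:
    # Try the requested languages in order, then English,
    # then fall back to the default 'name' and finally 'ref'.
    for lang in list(languages) + ['en']:
        if f'name:{lang}' in names:
            return names[f'name:{lang}']
    for key in ('name', 'ref'):
        if key in names:
            return names[key]
    return ''

print(best_name({'name': 'Brugge', 'name:fr': 'Bruges'}, ['fr', 'de']))
```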
-The Locale object can be applied to a name dictionary to return the best-matching
+The `Locale` object can be applied to a name dictionary to return the best-matching
name out of it:
``` python
from nominatim.api.results import DetailedResult, ReverseResult, SearchResults
-class NominatimAPIAsync:
+class NominatimAPIAsync: #pylint: disable=too-many-instance-attributes
""" The main frontend to the Nominatim database implements the
functions for lookup, forward and reverse geocoding using
asynchronous functions.
self.config = Configuration(project_dir, environ)
self.query_timeout = self.config.get_int('QUERY_TIMEOUT') \
if self.config.QUERY_TIMEOUT else None
+ self.reverse_restrict_to_country_area = self.config.get_bool('SEARCH_WITHIN_COUNTRIES')
self.server_version = 0
if sys.version_info >= (3, 10):
conn.set_query_timeout(self.query_timeout)
if details.keywords:
await make_query_analyzer(conn)
- geocoder = ReverseGeocoder(conn, details)
+ geocoder = ReverseGeocoder(conn, details,
+ self.reverse_restrict_to_country_area)
return await geocoder.lookup(coord)
coordinate.
"""
- def __init__(self, conn: SearchConnection, params: ReverseDetails) -> None:
+ def __init__(self, conn: SearchConnection, params: ReverseDetails,
+ restrict_to_country_areas: bool = False) -> None:
self.conn = conn
self.params = params
+ self.restrict_to_country_areas = restrict_to_country_areas
self.bind_params: Dict[str, Any] = {'max_rank': params.max_rank}
return _get_closest(address_row, other_row)
- async def lookup_country(self) -> Optional[SaRow]:
+ async def lookup_country_codes(self) -> List[str]:
""" Lookup the country for the current search.
"""
log().section('Reverse lookup by country code')
t = self.conn.t.country_grid
- sql: SaLambdaSelect = sa.select(t.c.country_code).distinct()\
+ sql = sa.select(t.c.country_code).distinct()\
.where(t.c.geometry.ST_Contains(WKT_PARAM))
- ccodes = tuple((r[0] for r in await self.conn.execute(sql, self.bind_params)))
+ ccodes = [cast(str, r[0]) for r in await self.conn.execute(sql, self.bind_params)]
log().var_dump('Country codes', ccodes)
+ return ccodes
+
+
+ async def lookup_country(self, ccodes: List[str]) -> Optional[SaRow]:
+ """ Lookup the country for the current search.
+ """
+ if not ccodes:
+ ccodes = await self.lookup_country_codes()
if not ccodes:
return None
.order_by(sa.desc(inner.c.rank_search), inner.c.distance)\
.limit(1)
- sql = sa.lambda_stmt(_base_query)
+ sql: SaLambdaSelect = sa.lambda_stmt(_base_query)
if self.has_geometries():
sql = self._add_geometry_columns(sql, sa.literal_column('area.geometry'))
row, tmp_row_func = await self.lookup_street_poi()
if row is not None:
row_func = tmp_row_func
- if row is None and self.max_rank > 4:
- row = await self.lookup_area()
- if row is None and self.layer_enabled(DataLayer.ADDRESS):
- row = await self.lookup_country()
+
+ if row is None:
+ if self.restrict_to_country_areas:
+ ccodes = await self.lookup_country_codes()
+ if not ccodes:
+ return None
+ else:
+ ccodes = []
+
+ if self.max_rank > 4:
+ row = await self.lookup_area()
+ if row is None and self.layer_enabled(DataLayer.ADDRESS):
+ row = await self.lookup_country(ccodes)
result = row_func(row, nres.ReverseResult)
if result is not None:
if self.qualifiers:
place_sql = place_sql.where(self.qualifiers.sql_restrict(thnr))
- numerals = [int(n) for n in self.housenumbers.values if n.isdigit()]
+ numerals = [int(n) for n in self.housenumbers.values
+ if n.isdigit() and len(n) < 8]
interpol_sql: SaColumn
tiger_sql: SaColumn
if numerals and \
return f"(?P<{axis}_deg>\\d+\\.\\d+)°?"
def _deg_min(axis: str) -> str:
- return f"(?P<{axis}_deg>\\d+)[°\\s]+(?P<{axis}_min>[\\d.]+)?[′']*"
+ return f"(?P<{axis}_deg>\\d+)[°\\s]+(?P<{axis}_min>[\\d.]+)[′']*"
def _deg_min_sec(axis: str) -> str:
- return f"(?P<{axis}_deg>\\d+)[°\\s]+(?P<{axis}_min>\\d+)[′'\\s]+(?P<{axis}_sec>[\\d.]+)?[\"″]*"
+ return f"(?P<{axis}_deg>\\d+)[°\\s]+(?P<{axis}_min>\\d+)[′'\\s]+(?P<{axis}_sec>[\\d.]+)[\"″]*"
COORD_REGEX = [re.compile(r'(?:(?P<pre>.*?)\s+)??' + r + r'(?:\s+(?P<post>.*))?') for r in (
r"(?P<ns>[NS])\s*" + _deg('lat') + r"[\s,]+" + r"(?P<ew>[EW])\s*" + _deg('lon'),
for oid in (params.get('osm_ids') or '').split(','):
oid = oid.strip()
if len(oid) > 1 and oid[0] in 'RNWrnw' and oid[1:].isdigit():
- places.append(napi.OsmID(oid[0], int(oid[1:])))
+ places.append(napi.OsmID(oid[0].upper(), int(oid[1:])))
if len(places) > params.config().get_int('LOOKUP_MAX_COUNT'):
params.raise_error('Too many object IDs.')
-Subproject commit 27cfb5e37c7bcf8926807baa7c27b1a6e83712c8
+Subproject commit 415de9abdf2d003a5c0a0abe8e8fc139acacc2b5
assert all(geom.name.lower() in r.geometry for r in results)
+def test_very_large_housenumber(apiobj):
+ apiobj.add_placex(place_id=93, class_='place', type='house',
+ parent_place_id=2000,
+ housenumber='2467463524544', country_code='pt')
+ apiobj.add_placex(place_id=2000, class_='highway', type='residential',
+ rank_search=26, rank_address=26,
+ country_code='pt')
+ apiobj.add_search_name(2000, names=[1,2],
+ search_rank=26, address_rank=26,
+ country_code='pt')
+
+ lookup = FieldLookup('name_vector', [1, 2], 'lookup_all')
+
+ results = run_search(apiobj, 0.1, [lookup], [], hnrs=['2467463524544'],
+ details=SearchDetails())
+
+ assert results
+ assert [r.place_id for r in results] == [93, 2000]
+
+
class TestInterpolations:
@pytest.fixture(autouse=True)
import nominatim.api.v1.helpers as helper
-@pytest.mark.parametrize('inp', ['', 'abc', '12 23', 'abc -78.90, 12.456 def'])
+@pytest.mark.parametrize('inp', ['',
+ 'abc',
+ '12 23',
+ 'abc -78.90, 12.456 def',
+ '40 N 60 W'])
def test_extract_coords_no_coords(inp):
query, x, y = helper.extract_coords_from_query(inp)
# ---------------------
#
# Tune the postgresql configuration, which is located in
-# `/etc/postgresql/12/main/postgresql.conf`. See section *Postgres Tuning* in
-# [the installation page](../admin/Installation.md#postgresql-tuning)
+# `/etc/postgresql/12/main/postgresql.conf`. See section *Tuning the PostgreSQL database*
+# in [the installation page](../admin/Installation.md#tuning-the-postgresql-database)
# for the parameters to change.
#
# Restart the postgresql service after updating this config file.
# ---------------------
#
# Tune the postgresql configuration, which is located in
-# `/etc/postgresql/14/main/postgresql.conf`. See section *Postgres Tuning* in
-# [the installation page](../admin/Installation.md#postgresql-tuning)
+# `/etc/postgresql/14/main/postgresql.conf`. See section *Tuning the PostgreSQL database*
+# in [the installation page](../admin/Installation.md#tuning-the-postgresql-database)
# for the parameters to change.
#
# Restart the postgresql service after updating this config file.