X-Git-Url: https://git.openstreetmap.org./nominatim.git/blobdiff_plain/ffb467028e138d174af4b3b52472f4a349ffceb5..44fface92a52e5972cafaf93d7a46e23626de1a0:/docs/admin/Import.md

diff --git a/docs/admin/Import.md b/docs/admin/Import.md
index b31066d3..38cd0b74 100644
--- a/docs/admin/Import.md
+++ b/docs/admin/Import.md
@@ -14,15 +14,15 @@ to a single Nominatim setup: configuration, extra data, etc.
 Create a project directory apart from the Nominatim software and change into the directory:
 
 ```
-mkdir ~/nominatim-planet
-cd ~/nominatim-planet
+mkdir ~/nominatim-project
+cd ~/nominatim-project
 ```
 
 In the following, we refer to the project directory as `$PROJECT_DIR`. To be
 able to copy&paste instructions, you can export the appropriate variable:
 
 ```
-export PROJECT_DIR=~/nominatim-planet
+export PROJECT_DIR=~/nominatim-project
 ```
 
 The Nominatim tool assumes per default that the current working directory is
@@ -75,14 +75,17 @@ This data is available as a binary download. Put it into your project directory:
 
     cd $PROJECT_DIR
     wget https://nominatim.org/data/wikimedia-importance.sql.gz
+    wget -O secondary_importance.sql.gz https://nominatim.org/data/wikimedia-secondary-importance.sql.gz
 
-The file is about 400MB and adds around 4GB to the Nominatim database.
+The files are about 400MB and add around 4GB to the Nominatim database. For
+more information about importance,
+see [Importance Customization](../customize/Importance.md).
 
 !!! tip
     If you forgot to download the wikipedia rankings, then you can also add
     importances after the import. Download the SQL files, then
-    run `nominatim refresh --wiki-data --importance`. Updating
-    importances for a planet will take a couple of hours.
+    run `nominatim refresh --wiki-data --secondary-importance --importance`.
+    Updating importances for a planet will take a couple of hours.
 
 ### External postcodes
 
@@ -153,7 +156,7 @@ if you plan to use the installation only for exports to a
 [photon](https://photon.komoot.io/) database, then you can set up a database
 without search indexes. Add `--reverse-only` to your setup command above.
 
-This saves about 5% of disk space.
+This saves about 5% of disk space; the import itself won't be significantly faster.
 
 ### Filtering Imported Data
 
@@ -228,7 +231,7 @@ to load the OSM data into the PostgreSQL database.
 This step is very demanding in terms of RAM usage. osm2pgsql and PostgreSQL
 are running in parallel at this point. PostgreSQL blocks at least the part
 of RAM that has been configured with the `shared_buffers` parameter during
-[PostgreSQL tuning](Installation.md#postgresql-tuning)
+[PostgreSQL tuning](Installation.md#tuning-the-postgresql-database)
 and needs some memory on top of that. osm2pgsql needs at least 2GB of RAM for
 its internal data structures, potentially more when it has to process very
 large relations. In addition it needs to maintain a cache for node locations. The size
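For convenience, the project-setup commands touched by this patch can be collected into one shell session. This is only a sketch of the post-patch state: the directory name and importance-file URLs are taken from the diff above, and the large downloads plus the optional `nominatim refresh` call are left commented out so the snippet is safe to run as-is.

```shell
#!/bin/sh
set -e

# Create the project directory (name as introduced by this patch)
# and export it, as the docs suggest for copy&paste-able commands.
mkdir -p ~/nominatim-project
cd ~/nominatim-project
export PROJECT_DIR=~/nominatim-project

# Download both importance dumps into the project directory.
# Commented out here because together they are roughly 400MB;
# URLs are exactly those from the patched docs.
# wget https://nominatim.org/data/wikimedia-importance.sql.gz
# wget -O secondary_importance.sql.gz https://nominatim.org/data/wikimedia-secondary-importance.sql.gz

# If the rankings were skipped at import time, they can be added
# afterwards with the refresh command from the patched tip box:
# nominatim refresh --wiki-data --secondary-importance --importance
```

Note that `-O secondary_importance.sql.gz` renames the secondary dump on download, matching the file name the import step expects per the patched documentation.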