Query via wget

where {USERNAME} is a valid account username, {PASSWORD} is the corresponding authentication password and {FILE} is the name of the file to which the output of the query is written. If '-' is used as {FILE}, documents are printed to standard output.

The following example shows how to make an OpenSearch query using Wget. The query searches for all products in the Data Hub archive; the first 25 results are written to a file named query_results.txt:
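As the command itself is not reproduced here, the following is a minimal sketch, assuming the standard Data Hub endpoint https://scihub.copernicus.eu/dhus and hypothetical credentials. The snippet prints the query URL as a dry run; the actual wget call is left commented out so it can be enabled once real credentials are in place:

```shell
#!/bin/sh
# Hypothetical account values; substitute your own.
USERNAME="myuser"
PASSWORD="mypassword"
FILE="query_results.txt"

# OpenSearch query: all products (q=*), first 25 results (rows=25).
QUERY_URL="https://scihub.copernicus.eu/dhus/search?q=*&rows=25"
echo "$QUERY_URL"

# To run the query for real (use FILE='-' to print to standard output):
# wget --no-check-certificate --user="$USERNAME" --password="$PASSWORD" \
#      -O "$FILE" "$QUERY_URL"
```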

The --continue option is very useful when downloads do not complete due to network problems. Wget will automatically try to continue the download from where it left off, and will repeat this until the whole file is retrieved.
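A sketch of a resumable product download follows, keeping the {UUID} placeholder convention of this guide; the wget call is commented out so the snippet runs offline, and the URL is only echoed for inspection:

```shell
#!/bin/sh
# {UUID} is a placeholder in the style of this guide; the trailing "$value"
# is the literal OData suffix, so the dollar sign is escaped for the shell.
PRODUCT_URL="https://scihub.copernicus.eu/dhus/odata/v1/Products('{UUID}')/\$value"
echo "$PRODUCT_URL"

# --continue resumes a partial download of product.zip after a network failure:
# wget --continue --no-check-certificate --user=myuser --password=mypassword \
#      -O product.zip "$PRODUCT_URL"
```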

NOTE: Depending on the programming language used for scripting, the dollar ($) character might be interpreted as a special character (for example, introducing a variable expansion) rather than as a literal string value.
The examples on this page are written in Unix Shell, where the backslash (\) must be used as the escape character for the dollar ($) character.
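For instance (hypothetical UUID; the point is only the escaped dollar in the OData "$value" suffix):

```shell
#!/bin/sh
UUID="2b17b57d-fff4-4645-b539-91f305c27c69"   # hypothetical product UUID
# Inside double quotes the shell would expand $value as an (empty) variable;
# the backslash keeps the literal "$value" required by the OData URI.
URL="https://scihub.copernicus.eu/dhus/odata/v1/Products('${UUID}')/\$value"
echo "$URL"
```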

The following example shows how to download the manifest file of a Sentinel-1 product, identified by its universally unique identifier {UUID}, using Wget and an OData URI:
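A sketch of such a command, assuming the standard scihub endpoint and using {PRODUCT_NAME} as a hypothetical placeholder for the product folder name (the manifest node sits under the product's .SAFE node); the wget call is commented out so the snippet runs offline:

```shell
#!/bin/sh
# {UUID} and {PRODUCT_NAME} are placeholders in the style of this guide.
MANIFEST_URL="https://scihub.copernicus.eu/dhus/odata/v1/Products('{UUID}')/Nodes('{PRODUCT_NAME}.SAFE')/Nodes('manifest.safe')/\$value"
echo "$MANIFEST_URL"

# wget --no-check-certificate --user={USERNAME} --password={PASSWORD} \
#      -O manifest.safe "$MANIFEST_URL"
```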

-t <time in hours> : search for products ingested in the last <time in hours> (integer) from the time of execution of the script (e.g. '-t 24' to search for products ingested in the last 24 hours)

-s <ingestion_date_FROM> : search for products ingested after the date and time specified by <ingestion_date_FROM>. The date format is ISO 8601: YYYY-MM-DDThh:mm:ss.cccZ (e.g. -s 2016-10-02T06:00:00.000Z)

-S <sensing_date_FROM> : search for products with sensing date greater than the date and time specified by <sensing_date_FROM>. The date format is ISO 8601: YYYY-MM-DDThh:mm:ss.cccZ (e.g. -S 2016-10-02T06:00:00.000Z)

-E <sensing_date_TO> : search for products with sensing date less than the date and time specified by <sensing_date_TO>. The date format is ISO 8601: YYYY-MM-DDThh:mm:ss.cccZ (e.g. -E 2016-10-10T12:00:00.000Z)

-f <file> : search for products ingested after the date and time provided through the input <file>. The file is updated at the end of the script execution with the ingestion date of the last successfully downloaded product.

-c <lon1,lat1:lon2,lat2> : coordinates of two opposite vertices of the rectangular area of interest

'manifest' : download the manifest of all products returned by the search

'product' : download all products returned by the search

'all' : download both manifests and products

-O <path/filename> : save the product ZIP file in the specified folder with the specified filename.

-N <1...n> : set the number of wget download retries. The default value is 5. Fatal errors such as 'connection refused' or 'not found' (404) are not retried.

-R <file> : write the list of products that have failed the MD5 integrity check to the specified file. By default the list is written to ./failed_MD5_check_list.txt. The format of the output file is compatible with the -r option.

-D : if specified, remove from disk the products that have failed the MD5 integrity check. By default products are not removed.

-r <file> : download the products listed in an input file written according to the following format:

-L <lock folder> : by default only one instance of dhusget can be executed at a time. This is ensured by the creation of a temporary lock folder $HOME/dhusget_tmp/lock, which is removed at the end of each run. To run more than one dhusget instance at a time, assign a different lock folder to each instance using the -L option (e.g. '-L foldername').

-n <1...n> : number of concurrent downloads (either products or manifest files). The default value is 2; this value does not override the quota limit set on the server side for the user.

-w <1...n> : minutes to wait before retrying the download of an offline product. The default value is 10.
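Putting several of the options above together, a hypothetical invocation might look as follows. Note that -u/-p for credentials and -o for the 'manifest'/'product'/'all' download mode are assumed from the usual dhusget interface and are not defined in this section; the command is only echoed as a dry run:

```shell
#!/bin/sh
# Sketch of a combined dhusget run: products sensed in a time window,
# over a rectangular area, 4 parallel downloads, private lock folder.
CMD="./dhusget.sh -u myuser -p mypassword -S 2016-10-02T06:00:00.000Z -E 2016-10-10T12:00:00.000Z -c 9.5,45.0:10.5,46.0 -o product -n 4 -L \$HOME/dhusget_tmp/lock2"
echo "$CMD"
# eval "$CMD"   # uncomment to actually run the script
```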