diff --git a/README.md b/README.md
index 6e2fba988255d3b1e58d89d6382a4e88e6810165..85e30bc4d5ac3a70489fc98db1b9995743745531 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,8 @@
-# iamap
+#  IAMAP
 
 [Documentation](https://iamap.readthedocs.io/en/latest/)
-[Gitlab repo](https://forge.ird.fr/amap/iamap)
+
+[Gitlab repo (mirror)](https://forge.ird.fr/amap/iamap)
 
 ## Rationale
diff --git a/docs/source/examples.md b/docs/source/examples.md
index e54fcb31b7df09f5ccf273fa2495970e6b38becd..5743c91e94ae642c78dc75aedc10403a2fae3652 100644
--- a/docs/source/examples.md
+++ b/docs/source/examples.md
@@ -106,7 +106,7 @@ Thanks to these two steps the comparison between both maps can be done pixel by
 
 A photo-identification is achieved on 30 randomly selected tiles for each dataset. The variability inherent to each class is accounted for by the identification of 10 polygons per landcover class for the training set and another 10 polygons per landcover class for the validation step. Training and validation sets are spatially separated to avoid spatial auto-correlation.
-The 10 polygons per class resulted in a dataset of XXX points (XXX in training and XXX in test dataset).
+<!-- The 10 polygons per class resulted in a dataset of XXX points (XXX in training and XXX in test dataset). -->
 
 The different landcover classes are the following :
 - Agricultural
@@ -119,10 +119,10 @@ The different landcover classes are the following :
 
 Preliminary test have been done to diffferenciate a variety of homogeneous patchs (such as forest, urban area, low vegetation) using the Haralick Texture metrics (*i.e.* 9 metrics from the R package GLCMTextures) with different sets of parameters and using the resulting features to train a RF classifier but the results were not satisfying, motivating the use of a DL encoder.
-Images are fed through a ViT base DINO encoder with default parameters and the resulting features are used as input for a random forest classifier (XXX hyper-parameters).
+Images are fed through a ViT base DINO encoder with default parameters and the resulting features are used as input for a random forest classifier (ntree=500, mtry=28).
 
 The RF achieves the following accuracy/kappa:
-XXX
+<!-- XXX -->
 
 ### Results
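Reviewer note: the classification step filled in above (random forest on DINO features, `ntree=500`, `mtry=28`) can be sketched as follows. The `ntree`/`mtry` names come from R's `randomForest` and map to scikit-learn's `n_estimators`/`max_features`; the feature matrix here is a synthetic stand-in for the ViT-base DINO embeddings (768-dimensional), not real data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for ViT-base DINO features: 200 samples, 768 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 768))
# 6 landcover classes, as listed in the docs (labels here are random placeholders).
y = rng.integers(0, 6, size=200)

# Mirror the R randomForest settings reported in the diff:
# ntree=500 -> n_estimators, mtry=28 -> max_features (features tried per split).
clf = RandomForestClassifier(n_estimators=500, max_features=28, random_state=0)
clf.fit(X, y)

# Training-set accuracy only; a real evaluation would use the spatially
# separated validation polygons described above.
print(clf.score(X, y))
```

This is a sketch under the assumption that the pipeline hands the classifier a per-pixel (or per-patch) feature matrix; the actual plugin's data loading is not shown here.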