.. _appendix:

Appendix I: Queries
===================

Queries (Bounding Box)
----------------------

- Query 1: "Give me a bbox 10 km long along the shoreline axis of the paralia (beach) of Eresos. To extract the bbox accurately, draw 3 lines between Eresos beach and major cities of Lesvos (points), then draw 3 circles and use the trilateration method to extract the intersection point of the 3 circles. This is the center of the bbox. Reply clearly by providing min_lat, max_lat, min_lon, and max_lon."

- Query 2: "Give me a bbox 10 km long along the shoreline axis of the paralia (beach) of Eresos. To extract the bbox accurately, identify at least 3 cafeterias or restaurants along the coastline of Eresos. Reply clearly by providing min_lat, max_lat, min_lon, and max_lon in 3 lines, not the process."

Queries (Results)
-----------------

- Data used for calculating the Coastal Vulnerability Index (CVI):

  - Landcover (raster): https://eodata.dataspace.copernicus.eu
  - DEM (raster): CopDEM_COG/copernicus-dem-30m
  - Slope (raster): CopDEM_COG/copernicus-dem-30m
  - Coastal erosion: https://coastalhazardwheel.avi.deltares.nl/geoserver/chw2-vector/ows
  - Mean tidal range (shapefile): https://services.arcgis.com/P3ePLMYs2RVChkJx/arcgis/rest/services/Ecological_Coastal_Units__ECU__1km_Segments/FeatureServer
  - Mean Wave Height (raster): Copernicus - cmems_mod_med_wav_anfc_4.2km_PT1H-I
  - Sea Level Rise (raster): Copernicus - c3s_obs-sl_glo_phy-ssh_my_twosat-l4-duacs-0.25deg_P1D

  Can you analyse the accuracy of the results and the inherent uncertainties?

- Based on the accuracy and uncertainty analysis you provided, how good are the results of our CVI score for a coastal area? Can we find better datasets?

- Consider using datasets with higher spatial resolution and accuracy, such as those from the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA5. However, ERA5 has a spatial resolution of more than 20 km. How will this reduce uncertainty? Are there other low-resolution reanalysis datasets?

- Apart from the input datasets used, the way we calculate the CVI is based on transects. We produce transects perpendicular to the shoreline every 100 meters, and the values of the sub-criteria (i.e. elevation, slope, sea level, mean wave height, etc.) are assigned to each transect based on the minimum distance of the transect from a grid point (raster cell), line, or polygon feature. All criteria scores and the final CVI calculation are stored in the transects' attribute table. Is there any other solution to increase accuracy?

- We estimate the Coastal Vulnerability Index as the square root of the product of the criteria scores divided by N, where N equals the number of criteria: CVI = √((a × b × c × d × e × f × g × h × i × j) / 10). For weighted overlay and fuzzy logic, can you provide the equations?

- If I give you the GeoJSON file, you have access to the attribute table, i.e. for each transect the score of each criterion as a separate column. Can you perform the calculation of the CVI using higher weights for the low-uncertainty data and lower weights for the high-uncertainty data?

- It is embedded in the criterion values themselves. We don't know the exact uncertainty; you know better from the analysis you provided. Maybe you can create your own uncertainty scale in order to assign weights?
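
Illustrative Sketches
---------------------

The sketches below illustrate, with placeholder values and assumed file and column names, some of the processing steps referred to in the queries above. They are minimal examples under stated assumptions, not the implementation used to produce the results.

The trilateration step of Query 1 can be sketched as follows. The town coordinates, the distances to Eresos beach, and the east-west orientation of the shoreline axis are rough placeholder assumptions, and the projected CRS (UTM zone 35N, metres) is assumed for the distance calculations.

.. code-block:: python

   import numpy as np
   from pyproj import Transformer


   def trilaterate(p1, p2, p3, r1, r2, r3):
       """Return the common intersection point of three circles (centres, radii)."""
       (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
       # Subtracting the circle equations pairwise gives a 2x2 linear system in (x, y).
       A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                     [2 * (x3 - x1), 2 * (y3 - y1)]])
       b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                     r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
       return np.linalg.solve(A, b)


   # Placeholder UTM 35N coordinates (metres) for three Lesvos towns and assumed
   # distances (metres) from each town to Eresos beach.
   towns = [(461000.0, 4329000.0), (431000.0, 4345000.0), (446000.0, 4316000.0)]
   radii = [54500.0, 27500.0, 42500.0]

   cx, cy = trilaterate(*towns, *radii)

   # Bounding box 10 km long along the (assumed east-west) shoreline axis and
   # 2 km wide across it, centred on the trilaterated point.
   half_len, half_width = 5000.0, 1000.0
   min_x, max_x = cx - half_len, cx + half_len
   min_y, max_y = cy - half_width, cy + half_width

   # Convert the corners back to WGS84 to report min/max latitude and longitude.
   to_wgs84 = Transformer.from_crs("EPSG:32635", "EPSG:4326", always_xy=True)
   min_lon, min_lat = to_wgs84.transform(min_x, min_y)
   max_lon, max_lat = to_wgs84.transform(max_x, max_y)
   print(f"min_lat={min_lat:.5f}, max_lat={max_lat:.5f}, "
         f"min_lon={min_lon:.5f}, max_lon={max_lon:.5f}")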
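The minimum-distance assignment of sub-criteria to transects described in the transect query can be sketched as below. The file names and attribute columns (``transects_100m.geojson``, ``erosion_class``, ``slope.tif``) are assumptions; vector criteria use a nearest spatial join, raster criteria are sampled at the transect midpoints.

.. code-block:: python

   import geopandas as gpd
   import rasterio

   # Assumed inputs: 100 m transect lines and a coastal-erosion vector layer,
   # both in the same projected CRS so that distances are in metres.
   transects = gpd.read_file("transects_100m.geojson")
   erosion = gpd.read_file("coastal_erosion.gpkg")

   # Nearest-feature join for a vector criterion (e.g. coastal erosion class).
   transects = gpd.sjoin_nearest(
       transects, erosion[["erosion_class", "geometry"]],
       how="left", distance_col="dist_to_erosion",
   )

   # Sample a raster criterion (e.g. slope) at each transect midpoint.
   midpoints = transects.geometry.interpolate(0.5, normalized=True)
   with rasterio.open("slope.tif") as src:
       coords = [(p.x, p.y) for p in midpoints.to_crs(src.crs)]
       transects["slope"] = [v[0] for v in src.sample(coords)]

   # Store the criterion values in the transect attribute table.
   transects.to_file("transects_with_criteria.geojson", driver="GeoJSON")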
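For reference, the product form of the CVI used in the queries, together with commonly used weighted-overlay and fuzzy-gamma overlay forms, is written out below. The symbols w_k (criterion weights), x_k (criterion scores), mu_k (fuzzy membership values) and gamma are introduced here for illustration only and do not appear in the original queries.

.. math::

   \mathrm{CVI} = \sqrt{\frac{a \times b \times c \times d \times e \times f \times g \times h \times i \times j}{10}}

   \mathrm{CVI}_{w} = \frac{\sum_{k=1}^{10} w_k \, x_k}{\sum_{k=1}^{10} w_k}

   \mu_{\gamma} = \left( 1 - \prod_{k=1}^{10} \left( 1 - \mu_k \right) \right)^{\gamma}
                  \left( \prod_{k=1}^{10} \mu_k \right)^{1-\gamma}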
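The uncertainty-weighted CVI requested in the GeoJSON query can be sketched as follows. The criterion column names and the 1-5 uncertainty scores are assumptions (1 = low uncertainty, 5 = high); weights are the normalised inverse of the uncertainty, so low-uncertainty criteria contribute more.

.. code-block:: python

   import geopandas as gpd

   # Assumed input: one row per transect, one column per criterion score (1-5).
   transects = gpd.read_file("transects_with_criteria.geojson")

   # Assumed uncertainty scores per criterion; replace with values from the analysis.
   uncertainty = {
       "landcover": 2, "elevation": 1, "slope": 1, "erosion": 4,
       "tidal_range": 3, "wave_height": 3, "sea_level_rise": 4,
   }
   inverse = {col: 1.0 / u for col, u in uncertainty.items()}
   weights = {col: v / sum(inverse.values()) for col, v in inverse.items()}

   # Weighted-sum CVI per transect (weights already sum to 1).
   transects["cvi_weighted"] = sum(
       w * transects[col] for col, w in weights.items()
   )
   transects.to_file("transects_cvi_weighted.geojson", driver="GeoJSON")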