Summary
Cape Town is a unique location for beach enthusiasts: the coastline is stitched around a peninsula, with beaches facing almost every direction on the N-E-S-W compass. For surfers (and kite-surfers) this provides varied options for catching waves (and wind!) year-round.
Swell and wave conditions depend heavily on the strength of the wind and its prevailing direction relative to the shoreline, the geometry of the bays, and sand-bank alignment.
In this series of posts I trace the spatial and temporal evolution of wind fields in the Cape Town coastal domain.
Data Source
The NOAA Integrated Surface Database offers decades of coverage from more than 20,000 stations worldwide, including several dozen in South Africa.
Using their station-id list and gedit, I searched for local stations (South Africa: SF) and lightly edited the native file into a .csv-style format containing the stations of interest. Here V1 is the station id, V4/V5 are the latitude-longitude coordinates, and V6/V7 mark the beginning and end of the data coverage.
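For reference, the edited station list can be read back into R along these lines (the filename stations-za.csv is illustrative, not the actual file used):

```r
# read the lightly edited ISD station list; the file has no header row,
# so columns arrive as V1 (station id), V3 (name), V4/V5 (lat/lon),
# V6/V7 (start/end of coverage, YYYYMMDD)
stations_CPT <- read.csv("stations-za.csv", header = FALSE,
                         stringsAsFactors = FALSE)
```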
Below is a look at the spatial coverage of the data.
library(leaflet)
stations_CPT
## V1 V2 V3 V4 V5 V6 V7
## 1 689160 99999 CAPE POINT -34.350 18.500 19730101 20161210
## 2 689200 99999 CAPE AGULHAS -34.833 20.017 19490101 20161210
## 3 689140 99999 SIMONSTOWN -34.200 18.433 19490101 19500625
## 4 688150 99999 LANGEBAAN -33.083 18.017 19970701 20161210
## 5 688160 99999 CAPE TOWN INTL -33.965 18.602 19490101 20161210
## 6 688130 99999 ROBBEN ISLAND -33.800 18.367 20041005 20161210
## 7 688140 99999 DASSEN ISLAND -33.433 18.083 19980301 20120919
## 8 689120 99999 SLANGKOP -34.150 18.317 19980301 20161210
# plot locations
leaflet(data = stations_CPT) %>% addTiles() %>%
addMarkers(~V5, ~V4, label = ~as.character(V3))
If one wanted to make a detailed study of climate evolution over the last 60 years, this data would make that possible. I'm going to restrict the analysis period to roughly the last 15 years (2000-2015).
Data Retrieval
I used NOAA's FTP server to download the files, which are bundled individually by station-year pair. A script is the most effective way to do this.
setwd("C:/Users/daniel/Desktop/locstore/portfolio")
library(magrittr)
library(stringr)
stations_CPT <- stations_CPT[-7,] # drop Dassen Island, whose coverage ends in 2012
station_names <- as.vector(stations_CPT$V1) %>% as.character() # extract station IDs
yrs <- rep(2000:2015, each = length(station_names)) # one year entry per station
suf <- ".gz" # file suffix
tmp <- str_c(station_names, yrs, sep = "-99999-") # e.g. 689160-99999-2000
sts_folder <- str_c(tmp, suf)
# build url tags; files are grouped by year on the server
base.url <- "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/"
lead.address <- str_c(base.url, yrs)
full_srch <- str_c(lead.address, sts_folder, sep = "/") # join full string
head(full_srch)
## [1] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/689160-99999-2000.gz"
## [2] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/689200-99999-2000.gz"
## [3] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/689140-99999-2000.gz"
## [4] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/688150-99999-2000.gz"
## [5] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/688160-99999-2000.gz"
## [6] "ftp://ftp.ncdc.noaa.gov/pub/data/noaa/2000/688130-99999-2000.gz"
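As a quick sanity check, full_srch should hold one URL per station-year pair: 7 stations across 16 years gives 112 entries.

```r
length(full_srch) # 7 stations x 16 years
## [1] 112
```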
I then exported the list of URLs to a .txt
file, for later use with wget.
full_srch_DF <- as.data.frame(full_srch) # as data.frame for easy export
write.table(full_srch_DF, "list-cpt.txt",
            row.names = FALSE, col.names = FALSE,
            quote = FALSE) # export table
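With the list exported, wget can read the URLs straight from the file; a typical invocation (the data/ target directory is illustrative) might be:

```shell
# download every station-year file listed in list-cpt.txt into ./data
wget -i list-cpt.txt -P data/
```

The -i flag tells wget to take its URLs from a file, and -P sets the directory the downloads land in.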