That's a neat solution.  I'm guessing there's a way to do this with RCurl, as in this post, which scraped data off Wikipedia.
But as a more general point for discussion: why don't we just use data from the "datasets" package in R?  Then everyone will have the data simply by calling the data() function, and there are built-in datasets that cover most cases.
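
For example, something along these lines gives everyone the same data with no scraping (mtcars is just one arbitrary choice of built-in dataset):

library(datasets)   # attached by default in a standard R session
data()              # list the datasets available in attached packages
data(mtcars)        # load a specific dataset into the workspace
head(mtcars)        # peek at the first few rows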
[Edit]: I was able to do this.  It's clearly more work (i.e. impractical) than your solution.  :)
[Edit 2]: I wrapped this into a function and tried it with another page.  
getSOTable <- function(url, code.block=2, raw=FALSE, delimiter="code") {
  require(RCurl)
  require(XML)
  # Download the page and split it into lines
  webpage <- getURL(url)
  webpage <- readLines(tc <- textConnection(webpage)); close(tc)
  # Parse the HTML into a tree, ignoring parse errors
  pagetree <- htmlTreeParse(webpage, error=function(...){}, useInternalNodes = TRUE)
  # Extract the text of every <code> element and keep the code.block-th one
  x <- xpathSApply(pagetree, paste("//*/", delimiter, sep=""), xmlValue)[code.block]
  if(raw)
    return(strsplit(x, "\n")[[1]])   # return the block's lines unparsed
  else
    # drop the first line, then parse the remaining lines as a table
    return(read.table(textConnection(strsplit(x, "\n")[[1]][-1])))
}
getSOTable("https://stackoverflow.com/questions/1434897/how-do-i-load-example-datasets-in-r")
    site year     peak
1  ALBEN    5 101529.6
2  ALBEN   10 117483.4
3  ALBEN   20 132960.9
8  ALDER    5   6561.3
9  ALDER   10   7897.1
10 ALDER   20   9208.1
15 AMERI    5  43656.5
16 AMERI   10  51475.3
17 AMERI   20  58854.4
getSOTable("https://stackoverflow.com/questions/1428174/quickly-generate-the-cartesian-product-of-a-matrix", code.block=10)
   X1 X2 X3 X4
1   1 11  1 11
2   1 11  2 12
3   1 11  3 13
4   1 11  4 14
5   1 11  5 15
6   1 11  6 16
7   1 11  7 17
8   1 11  8 18
9   1 11  9 19
10  1 11 10 20
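
One caveat: the right code.block index varies from page to page (it's 10 in the second example), so if the parsed result looks wrong, calling the function with raw=TRUE returns the selected block as unparsed lines, which makes it easier to find the index you want.  For example:

getSOTable("https://stackoverflow.com/questions/1428174/quickly-generate-the-cartesian-product-of-a-matrix",
           code.block=10, raw=TRUE)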