I'm talking about performing deep recursion for around 5+ minutes, something you might have a crawler do in order to extract URL links and sub-URL links from pages.
It seems that deep recursion in PHP is not realistic.
e.g.
include 'simple_html_dom.php'; // Simple HTML DOM Parser, needed for find()

getInfo("http://www.example.com");

function getInfo($link){
   // file_get_html() (not file_get_contents) returns a parsed DOM we can call find() on
   $content = file_get_html($link);
   if($con = $content->find('.subCategories', 0)){
      echo "go deeper<br>";
      // recurse into the first link inside the sub-category element
      getInfo($con->find('a', 0)->href);
   }
   else{
      echo "reached deepest<br>";
   }
}
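
For what it's worth, the same traversal can be written iteratively with an explicit stack, which sidesteps call-stack depth (and the 5+ minute recursion concern) entirely. This is just a sketch assuming the same Simple HTML DOM library and the same .subCategories / a selectors as above:

include 'simple_html_dom.php';

function crawl($startLink){
   $stack = array($startLink);   // pending links, replaces the call stack
   while(!empty($stack)){
      $link = array_pop($stack);
      $content = file_get_html($link);
      if($con = $content->find('.subCategories', 0)){
         echo "go deeper<br>";
         $stack[] = $con->find('a', 0)->href;   // push instead of recursing
      }
      else{
         echo "reached deepest<br>";
      }
      $content->clear();   // free the parsed DOM to keep memory flat on long runs
   }
}

crawl("http://www.example.com");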