
Localized cloning of popular sites

The 2 Mbps leased line was once thought to be more than sufficient for communication between the corporate office at Delhi and the regional office at Mumbai, but with video conferencing now in vogue, all other traffic has to be stopped during a video call to save bandwidth. This has irked some bosses, because a few important internal websites on the corporate server are kept open by most of them for most of the day. Some of these sites are quite heavy and take away a lot of bandwidth. The other day our IT man discovered that half of the bandwidth was being spent on a single site viewed by most of the bosses in Mumbai. 'If I could replicate this site locally, all that traffic could be directed to a local server here and my precious bandwidth would be protected,' the IT man sighed in exasperation.

Since then I had been thinking about it. This one site hauls an XML file from another secret, deeply buried corporate server and then rearranges the data into different pages. The page refreshes once per minute. I tried to wget this XML data, and the XML file arrived neatly on my computer. So I set up a cron task on my Linux server: it runs the wget command every minute, and the XML file is fetched and overwritten on my server every minute. So far so good. Now, through a PHP XML parser, it is easy to read the data from the XML file and put it into different PHP pages on the Linux server. I called the IT man and told him to find a way, in his Windows environment, to divert all the traffic for the corporate page to the local page we wanted. Once that diversion is in place, we can protect our bandwidth comfortably.

Here is the cron task, all of it put into the file /etc/cron.d/mycron. The machine connects to the remote server and downloads the XML data file in one go onto the local disk.

# The following lines go into the mycron file.
# Since this runs every minute, log clutter is avoided by sending output to '>/dev/null 2>&1'.
# The first field (0-59) is the minute; the four stars stand for hour, day of month, month and day of week.
# For details, see 'man 5 crontab'.
# Files in /etc/cron.d carry a user name (here 'root') before the command.

# remove the old file
0-59 * * * * root rm -f realviewnew.xml

# now get the remote file
0-59 * * * * root wget http://191.254.198.107/gdams/realviewnew.xml >/dev/null 2>&1

# transfer the file to the destination directory on the document server
0-59 * * * * root cp -rfu /root/realviewnew.xml /srv/www/htdocs/example/cache/

# end of file

The realviewnew.xml file has now arrived in the local directory. Here is the PHP XML parser in a typical web page; test.php goes here. The page refreshes every minute, taking the data fetched from the remote site and rebuilding it locally.

<html>
<head>
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
<META HTTP-EQUIV="REFRESH" CONTENT="60">
</head>
<body bgcolor="FFFFCC">
<?php
$dom = new DOMDocument();
$dom->load('./realviewnew.xml');
foreach ($dom->getElementsByTagName('STATION') as $element) {
    foreach ($element->childNodes as $e) {
        if (is_a($e, 'DOMElement')) {
            // collect the tag contents and transfer them to variables
            if ($e->tagName == 'TIME')   $avar1 = htmlspecialchars($e->textContent);
            if ($e->tagName == 'REGION') $avar2 = htmlspecialchars($e->textContent);
            if ($e->tagName == 'AGDC')   $avar3 = htmlspecialchars($e->textContent);
        }
    }
    if ($avar2 == 'WR') {
        // you can segregate by REGION before putting the values into arrays
        $var1[] = $avar1;
        $var2[] = $avar2;
        $var3[] = $avar3;
        // collect the remaining variables
    }
}
echo $var1[0] . $var2[0] . $var3[0];  // redisplay them now
// rebuild the table and the rest of the page here: put your code, tables etc. here...
?>
</body>
</html>
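For reference, the parser above expects an XML layout roughly like the one sketched below. The element names (STATION, TIME, REGION, AGDC) are taken from the code; the root element, the nesting and the sample values are made up for illustration, since the real layout belongs to the corporate server.

<DATA>
  <STATION>
    <TIME>10:45</TIME>
    <REGION>WR</REGION>
    <AGDC>Mumbai</AGDC>
  </STATION>
  <STATION>
    ...
  </STATION>
</DATA>

Each STATION block then supplies one set of values for the table that the page rebuilds.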

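For the diversion on the Windows desktops, one simple way is a hosts-file entry on each machine, so that the corporate site's name resolves to the local Linux box. This is only a sketch; the IP address and host name below are made up for illustration.

# add to C:\Windows\System32\drivers\etc\hosts on each desktop
192.168.1.10    portal.corporate.example.com

The browsers keep using the same URL, but the pages now come from the local copy; a local DNS server, if there were one, could achieve the same thing centrally.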
After building this page one Sunday, I physically went to each boss's room and redirected the traffic to this local site. 30% of the congestion suddenly vanished from the link. By the next Monday we had rebuilt the page and told our IT man to make the necessary rerouting in the DNS server, so that the path for the remote site would be redirected to this local site. 'Sir, we don't have any DNS server here; we will do the rerouting on the local machines,' the IT man replied. Since then this page has been served from a local Linux box, and the local page opens very promptly these days.

S. Bera, Powai
