Harvesting Google profiles
A few minutes ago I saw an interesting tweet from Mikko H. Hypponen saying he had found out that all (yes, as in ALL – 35,513,445) Google profile addresses can be retrieved from a single XML file. I looked through it and, yep, he was quite right.
Well, all this information is going to be useful somehow, right? Right. In case it gets removed, here is a simple way to harvest it before that happens:
#!/usr/bin/env python
import urllib
from BeautifulSoup import BeautifulStoneSoup as bs

# Parse the sitemap index that lists all the profile sub-sitemaps
xml = bs(urllib.urlopen('http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml').read())

for i in xml.findAll('loc'):
    try:
        # i.text[35:] strips the 35-character prefix
        # 'http://www.gstatic.com/s2/sitemaps/', leaving the bare
        # filename to save to.
        urllib.urlretrieve(i.text, i.text[35:])
        print 'Downloaded %s' % i.text[35:]
    except Exception, err:
        print '%s could not be retrieved' % i.text
print 'All done'
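For readers on Python 3 (the script above is Python 2), here is a rough sketch of the same sitemap-parsing logic. The sample XML is a made-up stand-in for the real profiles-sitemap.xml, which listed sub-sitemap URLs inside `<loc>` elements; the real file may use different element names or namespaces.

```python
# Sketch of the harvesting logic in Python 3. SAMPLE_SITEMAP is a
# hypothetical miniature of the real sitemap index, for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex>
  <sitemap><loc>http://www.gstatic.com/s2/sitemaps/sitemap-00001.txt</loc></sitemap>
  <sitemap><loc>http://www.gstatic.com/s2/sitemaps/sitemap-00002.txt</loc></sitemap>
</sitemapindex>"""

def sitemap_filenames(xml_text):
    """Extract each <loc> URL and strip the 35-character prefix
    'http://www.gstatic.com/s2/sitemaps/' to get a local filename,
    mirroring the i.text[35:] slice in the original script."""
    root = ET.fromstring(xml_text)
    return [loc.text[35:] for loc in root.iter('loc')]

print(sitemap_filenames(SAMPLE_SITEMAP))
# ['sitemap-00001.txt', 'sitemap-00002.txt']
```

To actually fetch each file you would pass the full URL and the sliced filename to `urllib.request.urlretrieve`, which is the Python 3 home of the `urlretrieve` call used above.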
That’s it: save it, run it and wait 🙂 Not that I used it, but I estimate you end up with around 1.7 GB worth of profile links.
Well, the juicy part is obviously harvesting the information from the profiles themselves. People on Twitter are mentioning that Google has been aware of this for a long time, or at least should be. Thoughts on the potential implications of that harvesting, in a blog post to come.
At least they are transparent 😉
For people getting a message about a beautiful soup that is missing:
sudo apt-get install python-beautifulsoup
Just sayin!
Sorry, new to linux. How do I “save” and then “run” the script?
Hey,
Sorry, I had missed your comment somehow. The instructions from daneelrsixth are valid. You will of course need to have Python installed, plus BeautifulSoup (either via easy_install or via your distribution's package manager).
KR – copy the source code, open a text editor, paste it, and save the file as “google.py”. Then open the terminal, go to the directory where you saved the file and type “python google.py”. (PS: I hope you are using a *nix system.)
Btw the script doesn’t work now, they kinda fixed the issue.
Just ran it out of curiosity, and it seems to still work fine. From what I gather, from Google’s perspective this is not an “issue”.
Well silly me, silly silly me… I had some problems with the BeautifulSoup module… I can confirm that the script works. 🙂