Trying to extract domain and TLD from a list of URLs.

I have done a fair amount of searching the threads, but I have not been able to cobble together a solution to my challenge. What I am trying to do is edit a file, line by line, so that only the domain and TLD of each URL in a long list remain. The list looks something like this:

www.google.com
ja.wikipedia.org
bbc.co.uk
fr-fr.facebook.com

and I would like to end up with:

google.com
wikipedia.org
bbc.co.uk
facebook.com

I prefer bash, but am learning Ruby and Perl... though I'm not very good at them yet. I have used Ruby's URI module to extract the input links above. Is there another Ruby function I am overlooking that returns just domain.tld?

Thanks!
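For what it's worth, the tricky part is that the number of labels varies and some registries use two-part suffixes like co.uk, so "keep the last two labels" alone isn't enough. Here's a rough awk sketch; the hardcoded suffix list is just a tiny illustrative sample, not the real public suffix list, so treat it as a starting point:

```shell
#!/bin/sh
# Sketch: keep the last two dot-separated labels, except when those two
# labels form a known two-part suffix (co.uk, com.au, co.jp here), in
# which case keep three. The suffix list is a small illustrative sample,
# NOT the full public suffix list.
extract_domain() {
    awk -F. '
    {
        suffix = $(NF-1) "." $NF
        if (suffix == "co.uk" || suffix == "com.au" || suffix == "co.jp")
            print $(NF-2) "." suffix
        else
            print suffix
    }'
}

# Usage example with the sample URLs from the question:
printf '%s\n' www.google.com ja.wikipedia.org bbc.co.uk fr-fr.facebook.com | extract_domain
```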

cut -d"." -f2,3

First you are going to need to come up with some rules for exactly what you want to strip off. Do you want to just strip off the first period and everything before it?

Check out these sites:

www.iptools.com
en.wikipedia.org/wiki/URI_scheme

Also, the "cut" command example that was posted will strip the first period and everything before it from the bash command line:

$ link="fr-fr.facebook.com"; echo $link | cut -d"." -f2,3
facebook.com
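One caveat worth noting: `-f2,3` assumes each line has exactly three dot-separated labels, so on other shapes it keeps the wrong fields. For example:

```shell
# cut -d"." -f2,3 keeps the 2nd and 3rd labels, which is only right
# when there are exactly three labels on the line:
echo "bbc.co.uk"  | cut -d"." -f2,3   # prints co.uk (bbc is lost)
echo "google.com" | cut -d"." -f2,3   # prints com (only two labels)
```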

Thanks for the quick replies. Straight out of the box, the "cut" command would not work on every line, but a workaround might be to separate my file first into groups by TLD (.com, .co.uk, etc.) and then apply a customized cut command to each. Let me give that a try....
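If you go the split-by-TLD route, one way to sketch it (assuming a hypothetical urls.txt, and that .co.uk is the only two-part suffix in the file) is the rev|cut|rev trick, which keeps the *last* N labels regardless of how many labels a line has:

```shell
# Hypothetical input file urls.txt, one hostname per line.
# rev reverses each line, so cut -f1-N then selects the last N labels;
# a second rev restores the original order.

# Lines ending in .co.uk: keep the last three labels.
grep '\.co\.uk$' urls.txt | rev | cut -d. -f1-3 | rev

# Everything else: keep the last two labels.
grep -v '\.co\.uk$' urls.txt | rev | cut -d. -f1-2 | rev
```

This avoids the "wrong field count" problem entirely, since the fields are counted from the end of the line rather than the start.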