negatively concatenating files

That subject might sound weird. What I have is two files. One looks like:

rf17 pts/59 Jul 10 08:43 (10.72.11.22) 27718 pts/59  0:00 b
rf17 pts/3 Jul 10 10:03 (10.72.11.22) 32278  pts/3  1:43 b 
rf58 pts/29 Jul 10 10:09 (10.72.11.51) 44220 pts/29  0:06 b
rf58 pts/61 Jul 10 08:45 (10.72.11.51) 28542 pts/61  0:18 b
rf7 pts/15 Jul 10 10:43 (10.72.11.89) 52326 pts/15  0:00 b 
rf7 pts/49 Jul 10 08:33 (10.72.11.89) 58982 pts/49  0:06 b 
rf56 pts/28 Jul 10 10:04 (10.72.11.9) 17256 pts/28  0:19 b 
rf56 pts/55 Jul 10 08:36 (10.72.11.9) 35368 pts/55  0:49 b 

The other looks like:

rf17 pts/3 Jul 10 10:03 (10.72.11.22) 32278  pts/3  1:43 b  
rf56 pts/28 Jul 10 10:04 (10.72.11.9) 17256 pts/28  0:19 b  
rf58 pts/29 Jul 10 10:09 (10.72.11.51) 44220 pts/29  0:06 b 
rf7 pts/15 Jul 10 10:43 (10.72.11.89) 52326 pts/15  0:00 b  

I would like to be able to remove the contents of the second file from the first file, giving, for example, this:

rf17 pts/59 Jul 10 08:43 (10.72.11.22) 27718 pts/59  0:00 b
rf56 pts/55 Jul 10 08:36 (10.72.11.9) 35368 pts/55  0:49 b 
rf7 pts/49 Jul 10 08:33 (10.72.11.89) 58982 pts/49  0:06 b 
rf58 pts/61 Jul 10 08:45 (10.72.11.51) 28542 pts/61  0:18 b

The first file's contents are always changing; the second file is derived from the first with sort -uk1,1. It might make more sense if I put it all in context: I am writing a script (cron job) that checks how many duplicate sessions each user has, then kills the oldest duplicates while keeping the newest one alive. I know this is possible; in my experience with Unix (and this board), anything is possible. Can somebody tell me how to do this file operation?
Thank you

Something like this?

grep -v -f secondfile firstfile
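One caveat worth noting: with plain -f, each line of secondfile is treated as a regular expression, so the dots in the IP addresses match any character, and a pattern could also match in the middle of a longer line. Adding -F (fixed strings) and -x (whole-line match) avoids both problems. The sketch below demonstrates this on two simplified sample lines; note that -x requires an exact match, including any trailing whitespace.

```shell
# Build two small demo files (simplified from the sample data above).
cat > firstfile <<'EOF'
rf17 pts/59 Jul 10 08:43 (10.72.11.22) 27718 pts/59 0:00 b
rf17 pts/3 Jul 10 10:03 (10.72.11.22) 32278 pts/3 1:43 b
EOF
cat > secondfile <<'EOF'
rf17 pts/3 Jul 10 10:03 (10.72.11.22) 32278 pts/3 1:43 b
EOF

# -F: treat patterns as fixed strings, not regexes
# -x: only count a pattern as a match if it matches the whole line
grep -v -F -x -f secondfile firstfile
# prints only the pts/59 line
```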

Regards

Excellent, such a simple solution. Thanks!
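For completeness, the whole cron job described in the thread could be sketched roughly as below. The file names, and the assumption that the PID is the 7th whitespace-separated field of each line (as in the sample data), are illustrative, not from the original posts.

```shell
#!/bin/sh
# Sketch of the duplicate-session reaper. In the real job, sessions.txt
# would be produced from who/ps output; here it is stubbed with sample data.
cat > sessions.txt <<'EOF'
rf17 pts/59 Jul 10 08:43 (10.72.11.22) 27718 pts/59 0:00 b
rf17 pts/3 Jul 10 10:03 (10.72.11.22) 32278 pts/3 1:43 b
rf58 pts/29 Jul 10 10:09 (10.72.11.51) 44220 pts/29 0:06 b
rf58 pts/61 Jul 10 08:45 (10.72.11.51) 28542 pts/61 0:18 b
EOF

# Keep one session line per user. Note: which of a user's lines survives
# depends on sort order; to guarantee the *newest* session is kept, you
# would sort by the timestamp fields first.
sort -u -k1,1 sessions.txt > keep.txt

# Everything not kept is a duplicate to be terminated.
grep -v -F -x -f keep.txt sessions.txt > victims.txt

# Kill each duplicate by PID (assumed to be field 7, as in the sample).
while read -r line; do
    pid=$(echo "$line" | awk '{print $7}')
    echo "would kill PID $pid"    # replace echo with: kill "$pid"
done < victims.txt
```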