So, I have a kind of off-the-wall question. I've got 10 computers I inherited from a charter school I did admin work for before it closed. They're not servers, just workstations with Ubuntu Server running on them. I had them all up and running at one point... but crimineys, the load on my electric was way too much for me to justify leaving them up.
I'm running them all headless since I obviously don't need monitors. I'm trying to think of other ways I can save on the electric with these things. I was thinking of pulling unnecessary components like the vid cards and making sure I don't run the snot out of them... but I really don't have any ideas beyond that.
Yes, I could run less than 10... but I'm sure y'all understand. I've got it in my head that I want to do something and darndable if I won't see if I can make it work! :wall: (only board that has that emoticon, how fitting!)
Well, you will have to invest money to save money. You can buy "green" motherboards, low-power PSUs, and small-form-factor video cards (an ATI Rage with 8 MB draws about as much power as my wristwatch). Do not buy "green" disks though, they save power in a manner you won't like in your server :). You can run all of them from a pendrive (as the root drive) and buy a small SAN/NAS solution for the applications and such. You can tell us more about your idea, but I don't think there are many ways to save electricity with old hardware.
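One zero-cost software tweak worth adding to the list: on Linux you can usually shave a bit of power by switching the CPU frequency governor to powersave through sysfs. A minimal sketch, assuming the standard cpufreq layout under /sys/devices/system/cpu (the set_governor helper name is just mine):

```shell
#!/bin/sh
# Minimal sketch: write a cpufreq governor to every CPU's
# scaling_governor file under a given sysfs root.
set_governor() {
    root="$1"
    gov="$2"
    for f in "$root"/cpu[0-9]*/cpufreq/scaling_governor; do
        # only touch files we can actually write (needs root on a real box)
        [ -w "$f" ] && echo "$gov" > "$f"
    done
}

# On a real machine, run as root:
# set_governor /sys/devices/system/cpu powersave
```

How much this saves depends on the CPU generation; on old hardware it mostly just caps the clock, but it costs nothing to try.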
You don't say where you live, but for me, for 6 months of the year there is no difference between having a computer consuming 100 watts running all day and having a 2 kW heater coming on for an hour (alright, for you people who can do the arithmetic: an hour and 12 minutes). You still have to heat the house.
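For anyone who wants to double-check that figure, it's just watt-hours:

```shell
# 100 W running 24 h = 2400 Wh of heat; a 2000 W heater delivers
# that in 2400 / 2000 = 1.2 h, i.e. 72 minutes.
watts=100
hours=24
heater_watts=2000
minutes=$(( watts * hours * 60 / heater_watts ))
echo "$minutes minutes of heater time"   # 72
```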
Thanks, guys. They're workstations, but I'm running them as servers. I'm using them for a myriad of things; right now I'm testing LDAP scripts and pushing updates across OUs. I've been thinking it'd be good to play around with some grid computing too, but I need to read up more on that.
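If you're scripting against the directory anyway, OpenLDAP's ldapsearch is handy for spot-checking an OU before and after pushing changes. Purely illustrative, the server, base DN, and OU name below are placeholders for whatever your tree actually looks like:

```shell
# List the cn of every entry under one OU (placeholders throughout):
#   -x    simple authentication
#   -LLL  plain LDIF output, no comments or version line
#   -H    server URI
#   -b    search base
ldapsearch -x -LLL \
    -H ldap://localhost \
    -b "ou=Workstations,dc=example,dc=com" \
    "(objectClass=*)" cn
```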
Too true, jgt, I live in the frigid north... so unfortunately I understand all too well the cost of running heaters lol.
With the first snow we've gone from <20 kWh a day to a sudden 100 kWh/day :wall: And we're supposed to be well-insulated now! Gotta find the heat leak...
Why not sell off some or all of those old units? Build a new computer with a 4- or 6-core processor, lots of RAM, lots of hard drive space, and multiple LAN cards, and create multiple virtual machines.
Or if the workstations are powerful enough, you could probably set up virtualization on 5 of the machines, with each one running 1 real and 1 virtual server. That's half the hardware but you still get 10 servers.
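If you go the virtualization route, it's worth checking first whether those old workstation CPUs support hardware virtualization at all; the usual quick test is looking for the vmx (Intel VT-x) or svm (AMD-V) flag in /proc/cpuinfo:

```shell
# Without vmx/svm, KVM guests fall back to emulation and crawl.
if grep -q -E 'vmx|svm' /proc/cpuinfo; then
    echo "hardware virtualization available"
else
    echo "no vmx/svm flag: stick to lightweight containers instead"
fi
```

Note the flag can also be present but disabled in the BIOS, so if it shows up and KVM still complains, check there.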
I can think of plenty of things to do with that many servers. Join a distributed computing project, for instance. It will make the electricity bill go up, though, which rather defeats the purpose.
Investing in SSDs is hardly worth it, given that they cost more than your electricity bill for a year or two.
If all else fails, place the servers right under the window sill and see if you can reuse the heat coming off the power supplies and lower the heater setting.