DesignComputerRoom.ppt 7383KB Jun 23 2011 12:15:38 PM

Computer Room
Requirements for High
Density Rack Mounted
Servers
Rhys Newman
Oxford University

Outline
• Why do we need computer rooms?
– Why in the past.
– Why in the future.

• Design of the environment.
– Cooling
– Humidity
– Power

• Proposal at Oxford Physics
• Conclusion

Why do we need them (Past)

• Security
– Equipment is valuable.

• Convenience
– Specialist Knowledge is needed to look after them.
– Networking was relatively difficult.

• Bulk
– A single (useful) installation was large

Why do we need them (Future)
• Specialist Environmental Requirements
– High density makes the equipment more sensitive to its environment.

• Convenience
– The human time cost of software maintenance.

Computer rooms will be needed for the immediate future, but the Grid will reduce the need in the long term.


Cooling - Then
• Rack mounting was designed for high CPU
density – optimising space usage, given the
effort needed to allocate a secure facility.
– Until recently, maximum power usage was
about 2–3 kW per rack.
– Air cooling was sufficient, with cool air taken
directly from under the floor.
– Even conventional air conditioning on the
ceiling was often enough.

Cooling Now: Too Much Success!
• Modern 1U servers are 300 W heaters =>
12 kW per rack (18 kW for blade servers).
• Rule of thumb: 1000 litres/sec of cool air
can handle 12 kW.
– In detail, a Dell 1750 uses 1200 l/min.

• For 40 racks, this is 32,000 l/sec, which in a
typical 600 mm duct is a wind speed of
320 km/hr!
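The airflow numbers above can be checked with a quick back-of-envelope calculation (the 40-servers-per-rack count is an assumption; the 1200 l/min figure is the quoted Dell 1750 value):

```python
# Airflow sanity check for a 40-rack room (slide figures).
servers_per_rack = 40            # assumption: 40 x 1U servers per rack
server_flow_l_s = 1200 / 60      # Dell 1750: 1200 l/min -> 20 l/s
racks = 40

total_flow_l_s = servers_per_rack * server_flow_l_s * racks
print(total_flow_l_s)            # 32000.0 l/s, as on the slide

duct_area_m2 = 0.6 ** 2          # 600 mm square duct cross-section
speed_kmh = (total_flow_l_s / 1000) / duct_area_m2 * 3.6
print(round(speed_kmh))          # ~320 km/h through a single duct
```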

Cooling - Solutions
• Focus on airflow!
– Place racks in rows – hot
aisle, cold aisle.
– Leave doors off the racks.
– Identify hotspots statically,
or dynamically (HP smart
cooling).

• Rule of thumb: air cooling
can manage 1200 W/m².
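At 1200 W/m², the floor area needed for air cooling dwarfs the racks' own footprint – a rough sketch using the rack load from the earlier slide:

```python
# Floor area implied by the 1200 W/m^2 air-cooling rule of thumb.
rack_load_w = 12_000        # 12 kW per rack of 1U servers (slide figure)
racks = 40
area_m2 = racks * rack_load_w / 1200
print(area_m2)              # 400.0 m^2 of cooled floor for 40 racks
```

So each rack, with a footprint of roughly 1 m², effectively claims about 10 m² of cooled floor space.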

Major Problem – No Bang for Buck
• As processor
speeds increase =>
• They get hotter =>
• Fewer can exist per
square metre =>
• Overall CPU power in the
datacentre goes
DOWN.
All this irrespective of how well you design the air cooling systems!

Cooling Solutions II
• Try self-contained
systems.
• Try water-cooled units
(self-contained or
otherwise).
• Use “smarter” systems
which actively manage
hotspots – HP Smart
Cooling claims up to
2.5 kW/m² in this way (unverified).

Humidity
• Computers in a datacentre have tighter
tolerances than humans – 45%–55% relative
humidity (despite manufacturer limits of 8%–80%).
– Too low risks static electricity (the fans in the
computers themselves cause this).
– Too high risks localised condensation, corrosion and
electrical shorts. Note: zinc in floor tiles!

• Air conditioning units must be better than those for
normal offices – how many rooms use
conventional units?
There is no magic bullet of simply importing external air and venting it to the outside!

Power
• All this heat comes
from the power supply:
– 1.2 A per server
– 50 A per rack
– 4000 A for a 40-rack
centre

• Adding the cooling
systems gives a total of
5000 A => 1.25 MW.
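The current figures above are mutually consistent if roughly 250 V mains is assumed (an assumption – the slides do not state the voltage):

```python
# Cross-check of the quoted current figures (assuming ~250 V mains).
mains_v = 250
server_a = 1.2
print(server_a * mains_v)           # ~300 W: the "300 W heater"

servers_per_rack = 40               # assumption: 40 x 1U per rack
print(server_a * servers_per_rack)  # ~48 A: the quoted ~50 A per rack

total_a = 5000                      # servers plus cooling, as quoted
print(total_a * mains_v / 1e6)      # 1.25 MW, matching the slide
```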

Summary so far…
• Modern machines need a well-designed physical
environment to get the most out of them. Most
current facilities are no longer well suited (a
recent development).
– Intel scrapped two chip lines to concentrate on
lower-power chips, rather than simply faster ones.
– Sun (and others) are working on chips with multiple
cores and lower clock speeds (good for internet
servers, not so good for physics!).

• The cost of the surrounding room is a substantial
fraction of the cost of the entire facility.

Example: 40 Racks for Oxford
• We have an ideal location:
– Lots of power.
– Underground (no heat from
the sun, and very secure).
– Lots of headroom (false
floor/ceiling for cooling
systems).
– Basement:
• No floor loading limit.
• Does not use up office
space.

Bottom Line
• The very basic estimate for the room,
given the shell, is £80k.
• Adding fully loaded cooling, UPS, power
conditioning, fire protection, etc. will
probably take this to £400k over time.
• Cost of 40 racks ~ £1.6 million
• Infrastructure costs: 25% of setup and up
to 50% of running costs.
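The "25% of setup" figure follows directly from the numbers above, taking infrastructure cost relative to the rack cost:

```python
# Infrastructure cost as a fraction of the equipment cost.
room_fitout_gbp = 400_000      # shell plus cooling, UPS, fire protection
racks_gbp = 1_600_000          # 40 fully loaded racks
print(room_fitout_gbp / racks_gbp)   # 0.25 -> the quoted 25% of setup
```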


Hang on!
• There are about 50,000
computers already in Oxford
University alone.
• Assume 20,000 of them are usable.
• We already have a major data
centre, with essentially no
infrastructure problems!
• The problem is software –
the Grid will exploit these
resources and thereby save
millions in datacentre costs –
medium term!

Thank you!
• Sun has a detailed paper at:
http://www.sun.com/servers/white-papers/dc-planning-guide.pdf

• APC has a number of useful white papers:
http://www.apc.com/tools/mytools/
