OVH Community, your new community space.

Ovh housing: the concept


Andy
04-04-2008, 11:02
Hello,
We are approaching the opening date of the first part of our Paris datacentre specialized in housing. Given the problems with Telecity / Redbus, we have decided to revise the start-up schedule of the datacentre and to offer a solution to our Redbus customers as a priority. Only afterwards will we offer the remaining capacity to new customers. According to our projections, everything will be leased within 4-6 weeks of the opening date. Indeed, the datacentre runs in N+N, at prices we have not seen on the market for 5 years. For us, it is a first attempt at this kind of offer on our own. If the experience is positive, we will build similar datacentres in the provinces from 2009.

It is therefore time to talk about the environmental innovations we have put in place in this datacentre. These innovations concern the cooling of your equipment. The goal was to offer the maximum electrical power per rack, to ensure cooling right up to the rack, and to avoid wasting power into the environment. In such cases, we often end up with the simplest solutions, which are also the most efficient and therefore the most cost-effective.

In a traditional datacentre, the cooling is done with air-conditioning units that blow cold air into the raised floor. The air then passes through holes in the raised floor into the rack (directly) and around the rack. The air cools the servers and the hot air comes out at the top. The air-conditioning units then take the warm air back from the false ceiling to cool it again. There are several problems with this kind of system:
-- The cold air cools the concrete, the wood and the cables under the raised floor. This is a first significant and unnecessary waste: approximately 20% of the energy.
-- The cold air coming out of the raised floor has to enter the rack through the hole under the rack and through the small holes in the front and rear doors. In reality, the servers block the passage of air through the hole under the rack, so if your host offers racks with doors without small holes, you can be sure of reaching 60°C in the rack. For the last 5 years, all hosts have offered doors with small holes to bring additional cold air into the rack. The problem is that servers take in cold air at the front of the rack and eject hot air at the back of the rack. So 50% of the cold air that tries to enter the rack cannot do so, because it is pushed back by the hot air leaving the rack. It is estimated that only 40% of the cold produced is really used to cool the servers in the rack. The result is simple: no host offers racks with more than 2 kVA of electrical power, because it is impossible to cool more than 2 kVA with this method.
-- Because of the losses from these 2 forms of waste, the air has to be cooled even more to begin with. More energy is therefore needed to cool the infrastructure: approximately 2 times more.
-- The air circuit is very long and requires large fans to create a pressure difference big enough to keep the air moving through every hole and every server. This requires a lot of power.

The advantage:
-- The temperature in the rack and the ambient temperature are almost the same. In marketing terms, when you visit a (new) datacentre, the air is cold and you get the feeling that "it works well". Yes, it works well, because you feel cold while visiting it. It is reassuring to feel cold in a datacentre.

We have worked on the issue of cooling the racks for the housing offers for 5 months, and we have developed an extremely innovative system with very little waste.

We tested the solution in our Roubaix datacentre on an experimental platform for 3 months, which allowed us to validate the solution under load (100 HG servers were hosted in these racks for 3 months). No problems were encountered.

How does it work?
-------------------
You all know the piece of equipment called a "fridge". You all have a fridge in your house or apartment. The principle of a fridge is to create cold air inside it, to keep your food and drinks cool. We have adopted the principle of the fridge for the housing offer, but on an industrial scale:

-- Each rack has 2 dedicated air-conditioning units, fed by 2 cooling systems. The units are physically placed at the top of the rack.
-- Each rack has 4 sides: front, right, back and left. The front door is solid and the rear door has small holes. The doors on the 2 sides of the rack (right and left) are half-open onto a false half-rack. It is into this space that the air-conditioning units blow the cold air. You therefore have a rack of servers, then a space with cold air, then a rack of servers, then a space with cold air, and so on.
-- The air circuit: the air leaving the air-conditioning units enters the enclosed space between the 2 racks on either side of it, then goes directly into the rack, right in front of the servers. The air inlet runs along the 2 sides over the entire height of the rack. All the cold air is actually used to cool the servers. The air cools the servers and comes out at the rear of the rack through the small holes in the rear door. The air conditioning takes the hot air back at the rear of the rack to cool it again. The cooling circuit is very short.
-- Each rack has 2 air-conditioning units, one on each side. If one of the 2 systems fails, everything keeps working with the system that remains in operation.

The advantages:
-- There is very little wasted energy. The air-conditioning units are 1 metre from the servers and work only to cool the servers. The effectiveness of our system is approximately 220% higher than in a traditional datacentre. It is therefore a datacentre that is much more respectful of nature.
-- The air circuit is very short, so with little energy we can create a pressure difference big enough to make a "wind" in the rack.
-- We can cool up to 10 kVA of power per rack (technically speaking), and 6 kVA commercially. This is thanks to the cold being distributed homogeneously over the whole height of the rack on its 2 sides.
-- The cooling system is extremely secure, because there are 2 cooling circuits per rack rather than per datacentre.

The drawbacks:
-- Technically speaking, it is a real challenge to build such a datacentre. The aim is to build 70 fridge-racks with 10 kVA of cooling each and to evacuate all the hot air outside the datacentre.
-- The only black spot is that a customer standing at the rear of the rack will feel hot. They may therefore think that the cooling in our datacentre is not good. In fact, we have put everything into cooling the servers inside the rack and avoiding any waste of energy for marketing reasons. We therefore decided not to install air conditioning at the entrance of the datacentre and in the rooms to cool the ambient air to 19°C. Technically speaking, this is not necessary, but it gives a feeling of cold in the datacentre and thus reassures the customers who visit it: oh, it's cold, so it's a good datacentre. Instead, we took the gamble that customers have taken the environmental and ecological constraints on board and will accept the difference between the ambient air at 24-25°C and the cold air in the rack, which is at 19°C. We hope this is true and that we will not get the same prejudiced reaction we had in 2004, when we announced the work on internal liquid cooling at Ovh and lost customers because we wanted to be ecological. We believe it is important to avoid waste, but we will see whether customers believe it too.

Shall we tell you more ecological stories?

Yours
Octave

oles@ovh.net
04-04-2008, 09:32
Hello,

We are approaching the opening date of the first part of our Paris datacentre specialized in housing. Given the problems with TelecityRedbus, we have decided to revise the start-up schedule of the datacentre and to propose a solution as a priority to our customers at TelecityRedbus. Then we will propose the remaining capacity in the new datacentre to new customers. According to the projection, everything will be leased within 4-6 weeks of the opening date. Indeed, the datacentre works in N+N, with prices that we haven't found on the market for 5 years. At our level it is the first attempt at this kind of offer on our own. If the experience is positive, we will do the same thing in our datacentres in the provinces from 2009.

It is therefore time to talk about the environmental innovations that we have established in this datacentre. These innovations are related to the cooling of your installations. The aim was to offer maximum electric power in the rack, while ensuring maximum cooling to the rack and avoiding wasting power into the environment. In such cases, we often arrive at the simplest and most efficient solution, and therefore the most cost-effective one.

In a traditional datacentre, the cooling is done with air conditioning that blows cold air through the void under the floor. The air then passes through holes in the floor into the rack (directly) and also around the rack. The air cools the servers and exhausts heat at the top of the rack. Then the air conditioning takes the warm air from the ceiling to cool it again. There are several problems with this kind of system:

- Cold air cools the concrete, wood and cables under the floor. This is a first significant and unnecessary waste: approximately 20% of the energy used to cool the air is lost this way.

- Cold air coming out from the floor must enter the rack through the hole below the rack and through the small holes in front of and behind the rack. In reality, servers block the passage of air through the hole in the rack, and therefore if your host offers you racks with doors without small holes, you can be sure to get 60°C in the rack. For the last 5 years, all hosts have offered doors with small holes to bring additional cold air into the rack. The problem is that servers take the cold air in at the front of the rack and eject hot air at the back of the rack. Therefore 50% of the cold air that tries to enter the rack cannot do so, since it is pushed back by the hot air leaving the rack. It is estimated that only 40% of the produced cold is really used to cool the servers in the rack. The result is simple: no host proposes racks with more than 2 kVA of electric power, since it is impossible to cool over 2 kVA with this kind of method.

- As there are losses associated with these 2 forms of waste, we must initially cool the air even more. We therefore need more energy to cool the infrastructure: approximately 2 times more (a rough back-of-envelope sketch follows this list).

- The air circuit is very long and requires large fans to generate a pressure difference sufficient to move air through every hole and every server. This requires a lot of power.
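
To put a figure on this, here is a rough back-of-envelope sketch in Python. It only reuses the approximate percentage quoted above; it is an illustration, not a measurement:

# Back-of-envelope sketch using the rough figure quoted above (illustrative, not measured).
used_fraction = 0.40                  # only ~40% of the produced cold actually cools the servers
overproduction = 1 / used_fraction    # cold to produce per unit really used by the servers
print("cold to produce per unit actually used: ~%.1fx" % overproduction)

The result, about 2.5x, is consistent with the "approximately 2 times more" energy figure above.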

The advantage:

- The temperature in the rack and the ambient temperature are almost the same. In marketing terms, when you visit a (new) datacentre the air is cold and you get the impression that "it works well". Yes, it works well, because you feel cold while visiting it. It is reassuring to feel cold in a datacentre.

We worked on the issue of cooling racks for the housing offers for 5 months and we have developed an extremely innovative system with very little waste.

We have tested the solution in our datacentre in Roubaix on an experimental platform for 3 months, which has permitted us to validate the solution under load (100 HG servers were hosted in these racks for 3 months). No problem was detected.

How does it work?
-------------------
You all know about the piece of equipment called "a fridge". You all have a fridge in your house or flat. The principle of a fridge is to create cold air on the inside to keep your food and drinks cool. We have adopted the principle of the fridge for the housing, but on an industrial scale:

- Each rack has 2 dedicated air conditioners fed by 2 cooling systems. The air conditioners are physically placed at the top of the rack.

- Each rack has 4 sides: the front, right, back and left. The front door is solid and the door at the rear has small holes. The doors on the 2 sides of the rack (right and left) are half-open onto a false half-rack. Into this space, the air conditioners blow the cold air. You therefore have a row with a rack of servers, then a space with cold air, a rack of servers, then a space with cold air, etc.

- The air circuit: the air leaving the air conditioners enters the enclosed space between the 2 racks on either side of it, then goes directly inside the rack, just opposite the servers. The air inlet is located on the 2 sides over the entire height of the rack. All the cold air is actually used to cool the servers. The air cools the servers and leaves at the rear of the rack through the small holes of the rear door. The air conditioning takes the hot air at the rear of the rack, thus cooling it again. The cooling circuit is very short.

- Each rack has 2 air conditioners, one on each side. If one of the 2 systems is down, everything continues to work with the system that remains in operation (a small illustration of this redundancy follows this list).
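
To illustrate the point of having the redundancy per rack, here is a small Python sketch. The 1% unavailability figure is an arbitrary assumption for the example (not an Ovh figure), and the 2 circuits are assumed to fail independently:

# Illustrative only: the 1% figure is an assumed value, not a measured one.
p_circuit_down = 0.01                          # assumed chance that one cooling circuit is down
p_rack_without_cooling = p_circuit_down ** 2   # both independent circuits down at the same time
print("one circuit down: %.2f%%" % (p_circuit_down * 100))
print("both circuits of one rack down: %.4f%%" % (p_rack_without_cooling * 100))

A single failure therefore never leaves a rack without cooling; only a simultaneous double failure on the same rack would.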

The advantages:

- There is very little waste of energy. The air conditioners are 1 meter away from the servers and only work to cool the servers. The effectiveness of our system is approximately 220% higher compared to a traditional datacentre. It is therefore a datacentre that is much more respectful of nature.

- The air circuit is very short and thus with little energy we can create a pressure difference sufficient to make a "wind" in the rack.

- We know how to cool up to 10 kVA of power per rack (technically speaking) and 6 kVA commercially, thanks to the fact that the cold is distributed in a homogeneous way over the whole height of the 2 sides of the rack (a rough airflow estimate follows this list).

- The cooling system is extremely secure because there are 2 cooling circuits per rack, not per datacentre.
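
As mentioned above, here is a rough airflow estimate for 10 kVA in a rack, in Python. Since the only temperature given above is the 19°C of the cold air, the temperature rise across the servers and the air properties below are assumptions; the result is just an order of magnitude:

# Rough estimate of the airflow needed to remove ~10 kW of heat from one rack.
# delta_t and the air properties are assumed values (supply at 19 C, exhaust assumed ~34 C).
power_w = 10000.0     # ~10 kVA of IT load, treated as 10 kW of heat
rho = 1.2             # air density, kg/m^3
cp = 1005.0           # specific heat of air, J/(kg*K)
delta_t = 15.0        # assumed temperature rise of the air across the servers, K

airflow = power_w / (rho * cp * delta_t)   # m^3/s
print("required airflow: ~%.2f m^3/s (~%.0f m^3/h)" % (airflow, airflow * 3600))

With the air conditioners only 1 meter from the servers and a very short circuit, this flow can be produced with a small pressure difference and little fan power.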

The drawbacks:

- Technically speaking, it's a real challenge to build such a datacentre. The aim is to build 70 refrigerated racks with 10 kVA of cooling each and to evacuate all the hot air outside the datacentre.

- The only black spot in the datacentre is that the client who is at the rear of the rack will feel hot. He may therefore think that the cooling system in our datacentre is not good. Indeed, we have put everything into cooling the servers in the rack and avoiding any waste of energy for marketing purposes. We therefore took the decision not to put air conditioning at the entrance of the datacentre and in the rooms to cool the ambient temperature to 19°C. Technically speaking this is not necessary, but it gives a cool feeling in the datacentre and thus reassures customers who visit the datacentre: oh, it's cold, so it's a good datacentre. In contrast, we took the risk that the customer has integrated the environmental and ecological constraint and will accept the difference between the ambient air temperature of 24-25°C and the cold air in the rack, which is at 19°C. We hope that it is true and that we are not going to have the same prejudiced reaction that we had in 2004, when we announced the work on the internal liquid cooling system at Ovh and lost customers because we wanted to be ecological. We believe it is important to avoid waste, but we will see if the customer thinks the same or not.

Should we tell you more ecological stories?

Regards,

Octave