Andy
04-04-2008, 11:02
Hello,
We are approaching the opening date of the first part of our Paris datacentre, which specialises in housing (colocation). Given the problems with Telecity/Redbus, we have decided to revise the datacentre's launch schedule and to offer a solution to our Redbus customers as a priority. Only later will we offer what remains to new customers. At the current rate, everything should be gone within 4-6 weeks of the opening date. Indeed, the datacentre runs with N+N redundancy, at prices that have not been seen on the market for 5 years. For us, this is a first attempt at this kind of offer on our own. If the experience is positive, we will do it again with this type of datacentre in the provinces from 2009.
It is time to talk about the environmental innovations we have put in place in this datacentre. These innovations concern the cooling of the facility. The goal was to offer the maximum power per bay, and therefore to bring the cooling right up to the bay and avoid wasting energy at the environment's expense. In cases like this, the simplest solutions often turn out to be the most effective, and therefore the most efficient.
In a traditional datacentre, cooling is done with air-conditioning units that blow cold air into a raised floor. The air then passes through holes in the raised floor, into the bay (directly) and around the bay. The air cools the servers and exits hot at the top. The AC units then pick up the warm air in the false ceiling to cool it again. There are several problems with this system:
-- The cold air cools the concrete, the wood and the cables in the raised floor. This is a first unnecessary and significant source of waste: roughly 20% of the energy.
-- The cold air leaving the raised floor has to enter the bay through the hole underneath it and through the small holes at the front and rear of the bay. In reality, the servers block the passage of air through the hole in the bay, so if your hosting provider offers bays with unperforated doors, you can be sure of reaching 60 °C inside the bay. For the last 5 years, all hosting providers have offered perforated doors to let extra cold air into the bay. The problem is that servers draw cold air at the front of the bay and eject hot air at the rear. As a result, 50% of the cold air that tries to enter the bay cannot do so, because it is pushed back by the hot air leaving the bay. It is estimated that only 40% of the cold air produced is actually used to cool the servers in the bay. The result is simple: no hosting provider builds bays with more than 2 kVA of electrical power, since it is impossible to cool more than 2 kVA with this method.
-- Since there are losses from these 2 sources of waste, the air has to be cooled even more to begin with. More energy is therefore needed to cool the infrastructure: roughly 2 times more (see the rough calculation after this list).
-- The air's cooling circuit is very long and requires large fans to create a pressure difference sufficient to keep the air moving through every hole and through the servers. This takes a lot of power.
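
To put rough numbers on this, here is a minimal back-of-the-envelope sketch in Python. It only turns the waste figures quoted above into an oversupply factor; the kVA-to-kW shortcut and the simple division model are assumptions for illustration, not the actual accounting.

    # Only ~40% of the cold produced ever cools a server; the ~20%
    # raised-floor loss is part of the remaining 60% that is wasted.
    USEFUL_FRACTION = 0.40

    def cold_to_produce(server_load_kw: float) -> float:
        """kW of cold that must be produced for server_load_kw of useful cooling."""
        return server_load_kw / USEFUL_FRACTION

    load_kw = 2.0  # the ~2 kVA per-bay ceiling mentioned above, treating kVA ~ kW
    print(f"to cool {load_kw:.0f} kW, produce ~{cold_to_produce(load_kw):.1f} kW of cold")
    # -> produce ~5.0 kW, 2.5x the useful load, the same order of
    #    magnitude as the "approximately 2 times more" figure above.

In other words, over-producing cold by a factor of 2 to 3 is built into the raised-floor design itself.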
The advantage:
-- The temperature inside the bay and the ambient temperature are almost the same. In marketing terms, when you visit a (brand-new) datacentre, the air is cold and you get the feeling that "it works well". Yes, it works well, because you feel cold while visiting. Feeling cold in a datacentre is reassuring.
We worked on the problem of cooling the bays for our housing offers for 5 months, and we have developed an extremely innovative system with very little waste. We tested the solution in our Roubaix datacentre on an experimental platform for 3 months, which allowed us to validate it under load (100 HG servers were housed in these bays for 3 months). No problems to report.
How does it work?
-------------------
You all know the piece of equipment called a "fridge". You all have one at home, in your house or apartment. The principle of a fridge is to create cold air on the inside, to preserve your food and keep your drinks cool. We have adapted the fridge principle to housing, but on an industrial scale:
-- Each bay has 2 dedicated AC units, fed by 2 cooling systems. The AC units are physically placed at the top of the bay.
-- Each bay has 4 sides: front, right, rear and left. The front door is solid and the rear door has small holes. The doors on the 2 sides of the bay (right and left) are half-open onto a false half-bay. Into that space, the AC units blow the cold air. So you have a bay of servers, then a space with cold air, then a bay of servers, then a space with cold air, and so on.
-- The air: the air leaving the AC units enters the enclosed space between 2 bays, then passes through the sides directly into the bay, right in front of the servers. The air inlet runs along 2 sides of the bay over its entire height. All of the cold air is actually used to cool the servers. The air cools the servers and exits at the rear of the bay through the small holes in the rear door. The AC unit picks up the hot air at the rear of the bay to cool it again. The cooling circuit is very short (a rough airflow calculation follows this list).
-- Each bay has 2 AC units, one on each of its 2 sides. If one of the 2 systems goes down, the bay keeps working with the system that remains in operation.
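
As a side note, one can sketch the airflow such a short closed loop has to move, using the standard sensible-heat relation P = rho * cp * Q * dT. The 19 °C intake temperature is quoted later in this post; the 35 °C exhaust temperature and the air properties are assumed round values for illustration.

    RHO_AIR = 1.2    # kg/m^3, assumed air density
    CP_AIR = 1005.0  # J/(kg*K), assumed specific heat of air

    def airflow_m3_per_s(load_w: float, t_in_c: float = 19.0,
                         t_out_c: float = 35.0) -> float:
        """Volumetric airflow needed to absorb load_w watts of server heat."""
        delta_t = t_out_c - t_in_c
        return load_w / (RHO_AIR * CP_AIR * delta_t)

    for kva in (2, 6, 10):
        q = airflow_m3_per_s(kva * 1000.0)  # treating kVA roughly as kW
        print(f"{kva:2d} kVA bay: ~{q:.2f} m^3/s (~{q * 3600:.0f} m^3/h)")
    # A 10 kVA bay needs only ~0.5 m^3/s; with the inlet spread over the
    # whole height of the bay, a small pressure difference can drive it.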
The advantages:
-- There is very little wasted energy. The AC units are 1 metre from the servers and work only to cool the servers. The efficiency of our system is approximately 220% higher than that of a traditional datacentre (one possible reading of this figure is sketched after this list). It is therefore a datacentre that is much more respectful of nature.
-- The air circuit is very short, so with little energy we can create a pressure difference sufficient to make a "wind" inside the bay.
-- We can cool up to 10 kVA of power per bay (technically speaking), and 6 kVA in the marketed offer. This is thanks to the cold being distributed homogeneously over the entire height of the bay, from its 2 sides.
-- The cooling system is extremely secure, because there are 2 cooling circuits per bay, rather than 2 for the whole datacentre.
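
One plausible way to read the 220% figure above (the ~90% delivery of the closed loop is an assumption for illustration; the ~40% for a traditional room is the figure quoted earlier):

    TRADITIONAL_USEFUL = 0.40  # from the figures earlier in the post
    CLOSED_LOOP_USEFUL = 0.90  # assumed: almost all cold reaches the servers

    ratio = CLOSED_LOOP_USEFUL / TRADITIONAL_USEFUL
    print(f"traditional design needs ~{ratio:.2f}x the cold for the same useful kW")
    # -> ~2.25x, i.e. roughly 220-225% of the energy of the per-bay loop,
    #    consistent with the "approximately 220%" figure quoted above.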
The drawbacks:
-- Technically speaking, it is a real challenge to build such a datacentre. The aim is to build 70 fridge-bays with 10 kVA of cooling for each rack, and to evacuate all the heat outside the datacentre.
-- The only dark spot in the datacentre is that a client standing at the rear of a bay will feel warm, and may therefore think that the cooling in our datacentre is no good. Indeed, we went all out to cool the servers inside the bay and to avoid wasting any energy on marketing concerns. We therefore decided not to install AC units at the entrance of the datacentre and in the aisles to cool the ambient air to 19 °C. Technically speaking, this is not necessary. But it gives a feeling of cold in the datacentre and thus reassures the customers who visit it: oh, it's cold, so it must be a good datacentre. Instead, we took the gamble that customers have taken the environmental and ecological constraints on board, and that they will accept the difference between the ambient air temperature of 24-25 °C and the cold air in the bay, which is at 19 °C. We hope this is true, and that we will not see the same reaction we had in 2004, when we announced our work on internal liquid cooling at Ovh and lost customers because we wanted to act ecologically. We believe it is important to avoid waste, but we will see whether customers believe it too.
Shall I tell you some more ecology stories?
Yours
Octave