Most, if not all, of our data resides in a data center somewhere.

It is mission-critical that we protect the data center and our data.
In this video, I cover how fire suppression and hot/cold aisles work in our data centers at the level you need for the CISSP exam.

Remember, the CISSP exam is a management-level exam; you need the right point of view to pass it.

You can get all my courses, free study materials, my free CISSP course and much more on https://thorteaches.com/

Transcript:

In this lecture, we’re going to talk about environmental controls, and to start, we’re going to look at HVAC: Heating, Ventilation, and Air Conditioning.
It has for decades been really common to keep our data centers very, very cold.
But keeping them that cold is really not needed.
It doesn’t necessarily have an adverse effect on the equipment, but it can waste hundreds of thousands of dollars every year to keep our data center five or ten degrees colder than it actually needs to be.
If you look at the image over here on the right, you can see some vendor’s recommendations.
These are the optimal ranges for their equipment to function.
And the ranges are somewhere between 68 and 77 Fahrenheit or 20 to 25 degrees Celsius.
And those are the optimal ranges.
But they also have allowable ranges.
That is where the equipment will keep functioning, but it is not good for the equipment; it will deteriorate faster and not work optimally.
Those ranges are between 59 and 90 degrees Fahrenheit or 15 to 32 Celsius.
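As a quick illustration (my own sketch, not something from the lecture), a simple monitoring check against those two ranges might look like this; the Celsius thresholds are the numbers above, and everything else is hypothetical:

```python
# Sketch: classify a data center temperature reading against the
# optimal and allowable ranges from the vendor recommendations above.

OPTIMAL_C = (20.0, 25.0)     # 68-77 F: vendor-recommended operating range
ALLOWABLE_C = (15.0, 32.0)   # 59-90 F: still functions, but deteriorates faster

def classify_temperature(celsius: float) -> str:
    """Return 'optimal', 'allowable', or 'out of range' for one reading."""
    if OPTIMAL_C[0] <= celsius <= OPTIMAL_C[1]:
        return "optimal"
    if ALLOWABLE_C[0] <= celsius <= ALLOWABLE_C[1]:
        return "allowable"   # worth a look, but not yet an emergency
    return "out of range"    # alert: the equipment is at risk

for reading in (22.5, 30.0, 12.0):
    print(reading, "C ->", classify_temperature(reading))
```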
Another negative side effect from keeping our data centers too cold is that it raises the humidity level.
We, of course, have humidity control in our data centers, but keeping the temperature lower than it needs to be raises the relative humidity, so we also spend more money pulling that humidity out of the room.
I have been in major data centers where they kept the temperature somewhere around 60 degrees Fahrenheit or just around 15 degrees Celsius.
On top of wasting a ton of money, a data center that cold is miserable to work in.
Luckily, though, over the last five to ten years, it has become more common knowledge that it is OK to save the money and raise the temperature.
Now, where that sweet spot is really depends on who you are.
But somewhere in the low 20s Celsius, or the low 70s Fahrenheit, is probably a good range to shoot for.
Another thing we want to do in our data centers is to keep a positive pressure.
And that means that inside the data center we have a slightly higher air pressure than the rooms outside.
So when someone opens the door to the data center, some air should blow out and not into the data center.
And we do this to keep contaminants out.
Contaminants in this case are just dust and dirt, and they need to be kept out because the data center needs to be cleaner than most of the rest of the building.
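To make the idea concrete, here is a minimal sketch, assuming a simple differential-pressure sensor; the 5-pascal target and the function name are my own invention, not a standard:

```python
# Hypothetical sketch: confirm the data center holds positive pressure
# relative to the rooms outside it. The target value is illustrative only.

MIN_DIFFERENTIAL_PA = 5.0  # pascals above ambient; made-up target

def is_positively_pressurized(inside_pa: float, outside_pa: float) -> bool:
    """True if air would blow out, not in, when a door is opened."""
    return (inside_pa - outside_pa) >= MIN_DIFFERENTIAL_PA

# Example: 101,330 Pa inside vs. 101,325 Pa outside is 5 Pa over ambient
print(is_positively_pressurized(101_330.0, 101_325.0))  # True
```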
The reason is our fire suppression: some fire suppression systems use the particle density in the air to determine whether they should activate.
So keeping that positive pressure and making sure that our data center is clean can help us avoid setting off the fire suppression when it shouldn’t go off.
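As a hypothetical sketch of that trigger logic (the threshold and readings are invented; real air-sampling detectors are far more sophisticated):

```python
# Hypothetical sketch: trigger logic for a particle-density-based
# fire detector. Threshold and readings are invented for illustration.

PARTICLE_THRESHOLD = 5000  # particles per unit volume; made-up value

def should_trigger_suppression(particle_count: int) -> bool:
    """A high particle count reads as possible smoke -- or, as in the
    story that follows, as dust stirred up from a dirty sub floor."""
    return particle_count > PARTICLE_THRESHOLD

for count in (1200, 1500, 9800):  # the last reading: a dust cloud, not smoke
    if should_trigger_suppression(count):
        print(f"ALERT: particle count {count} exceeds threshold")
```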
In one of the data centers I have worked in, we had the positive pressure.
We cleaned the data center, but they didn’t clean the sub floors, and under the sub floors is where they ran all the cables.
So there’s a good foot to a foot and a half, or about 50 centimeters, of space between the actual floor and the floor that all the racks were on.
The sub flooring is made out of a metal grid structure, kind of like a frame, that separates the floor into 2×2-foot tiles, or about 60 by 60 centimeters.
And of course, this needs to be built completely right.
We’re going to put thousands of pounds of servers and racks on top of it.
And then the floor tiles are either solid, with no openings in them, or perforated so air can come up into our cold aisles.
And in the next slide, I’ll show you how this actually works.
So you have a better idea.
But what happened in this example is they hadn’t cleaned below the sub floor.
And one day a technician was under the floor pulling cables, and there was so much dust down there that the particle count rose and the fire suppression kicked in.
Luckily, nothing really bad happened, although one of the people in the data center got knocked to his knees when the FM-200 kicked in.
But had they cleaned the floors, this would never have happened.
On top of that, refilling the FM-200 costs, I think, about $30,000.
After this happened, they implemented a cycle where every three months someone would come in and clean under the sub flooring.
A little bit of a sidetrack here, but you need to understand why we keep that positive pressure and why it is so important that we keep the contaminants out.
We have talked about humidity before in our data center.
We need to keep it somewhere between 40 and 60 percent relative humidity.
If we keep the humidity too low, we get static electricity, which we know can damage our equipment.
If we keep the humidity too high, then metals will start to corrode.
And since that is what our servers are made of, it’s a really bad idea.
An integral part of our HVAC systems is both a humidifier and a dehumidifier.
We need to ensure that we operate in the optimal range.
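A minimal sketch of that control decision, assuming the 40 to 60 percent band from above (the interface is hypothetical, not a real HVAC API):

```python
# Minimal sketch: decide which HVAC component to run to keep relative
# humidity in the 40-60% band. The interface is hypothetical.

RH_LOW, RH_HIGH = 40.0, 60.0  # percent relative humidity

def humidity_action(relative_humidity: float) -> str:
    if relative_humidity < RH_LOW:
        return "run humidifier"    # too dry: static electricity risk
    if relative_humidity > RH_HIGH:
        return "run dehumidifier"  # too humid: corrosion risk
    return "in range"

for rh in (25.0, 50.0, 75.0):
    print(rh, "% ->", humidity_action(rh))
```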
I have worked in data centers where it is both really, really dry and others where it’s really, really humid.
In central California, where it was 115 degrees Fahrenheit, or 45 Celsius, in the summer, we needed to use the humidifier because the relative humidity in the air we pulled in was too low.
When I worked on Oahu in Hawaii, we needed to pull humidity out of the air.
And we already touched on how important it is to make sure that we have enough HVAC even if one unit goes out.
The humidity being raised for a while is probably not horrible.
The temperature being raised is.
But for anything critical, we need to ensure we have that redundancy.
So if one thing breaks, we have another that is enough to carry the whole load.
And while this is not specifically your responsibility (it usually belongs to the data center manager or the server manager), you do, just like with anything else on the exam, need to understand exactly how it works so you can ask the right questions to ensure we have the right protection.
So moving on, this was the slide I was talking about where we have the sub flooring, the sub ceiling and then the hot and cold aisles.
The bottom solid blue line is the actual real floor.
The solid top blue line is the actual true ceiling.
And then out on the right, the solid blue line is the slab-to-slab wall.
About a foot and a half, again roughly 50 centimeters, in from the true floor and the true ceiling, we have the sub floor and the sub ceiling.
And that sub flooring is where all that dust had gathered.
Once they opened it up, the air pushed it out and the fire suppression kicked in.
When we put our servers and our equipment in the racks, they all face the same way.
All the servers are designed to pull air into the front and push it out the back.
And this is where the hot and cold aisles come in.
If you look at the graphic, you can see we have the perforated grates at the front of the rack.
The cold air is pulled in through the servers and pushed out the back.
That’s the hot aisle.
Since hot air rises, it is pulled out above the servers, circulated through the HVAC units, and pushed back in through the sub flooring.
In most racks, we would also have rack-mounted switches, and they face the other way because the networking cables on the servers need to be attached in the back.
So the front of the switch faces the hot aisle, the back, the cold aisle.
The switch pulls the air in from the back and pushes it out the front.
So in a well-designed data center, depending on where you stand, there will be a huge temperature difference.
If you’re standing on one of the grates in the cold aisle, it might be 50 degrees Fahrenheit or 10 degrees Celsius.
Now, if you’re standing on the other side, at the exhaust of a massive server, the temperature there can maybe be 105 degrees Fahrenheit or 40 degrees Celsius.
I have seen some data centers where the perforated grates were just randomly placed throughout the data center.
It doesn’t make any sense.
It is going to waste a ton of money because we’re not pushing the air up exactly where it’s needed.
So the perforated grates should only be right in front of the racks where the servers have their air intake and the rest of the floor should not be perforated.
And I’m guessing in many of those places, the grates probably were in the right spots to begin with.
But then some data center person came in, removed five tiles, and put them back not quite in front of the servers.
And then they did it again and again and again.
And over time, the perforated grates were just kind of dispersed in random places.
Throughout our sub flooring, it is very common to have drains, since in many places the HVAC units have that dehumidifier we talked about.
A ton of water is going to get pulled out of the air by those systems, and normally they have drain lines that carry it out.
But what if that breaks?
On top of that, they also build up natural condensation.
If we don’t know that there’s water pooling in the sub flooring and that is where we run our electricity, well, that just seems like a horrible scenario.
So we either have drains in the sub flooring or we have sensors that alert us whenever there’s water present under the flooring.
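A minimal sketch of that alerting, assuming simple wet/dry sensors (the sensor names and readings are made up):

```python
# Minimal sketch: poll sub-floor water sensors and alert on any detection.
# Sensor names and readings are made up for illustration.

readings = {"subfloor-NE": False, "subfloor-SW": True, "hvac-drain-pan": False}

def wet_sensors(sensor_readings: dict) -> list:
    """Return the names of sensors currently reporting water."""
    return [name for name, wet in sensor_readings.items() if wet]

alerts = wet_sensors(readings)
if alerts:
    print("WATER ALERT:", ", ".join(alerts), "- check before it reaches the cabling")
```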
And the idea of a drain might sound smart: if there’s water, it runs out.
But the drain connects to the waste line, so what if there’s an overflow and it starts pushing water back into the data center?
And there are obviously fixes to make sure that water only goes one way, but it is part of our design considerations when we build the data center: do we think drains are a good idea?
And if we do, and we can document why, then we implement them.
As part of your due diligence, you do your research, and then you implement due care.
Water in the sub flooring is a real issue, which is why many places choose not to run any cables there, or if they do, they definitely don’t run the power cables down there.
Electricity and water are a bad mix.
In the data centers where we don’t run the cables below the floors, it’s very common to have cable trays above the racks instead.
It’s like a huge grid system, with a couple of trays running over each rack, connecting the whole data center: all the major switches that need to connect to the switches in the racks or the patch panels.
But here again, we need to make sure that if we use copper Ethernet, we don’t run the power in the same tray.
If we’re limited on trays, then really we should use fiber, which we can run next to the power cables because it doesn’t use electricity, it uses light.
And because of that, it is not susceptible to EMI.
We have already covered electricity and EMI, now let’s look at static electricity.
So we obviously have the right humidity in the data center.
All our circuits are grounded.
On top of that, whenever we work with hardware, we use at least an anti-static wrist strap.
There are some places where I’ve seen anti-static shoes; you can see a picture of some here.
They are not the prettiest things, but they can save our hardware.
The wrist straps and the shoes are for anyone who’s working on our hardware.
When you open up a server and touch the motherboard, the memory cards, the hard drives, really any hardware, you need to wear that strap.
And I guess in some cases the shoes.
I have never worked anywhere where we used the shoes, but everywhere I have worked, we used the wrist strap.
It might not matter 999 out of 1,000 times, but that one time can be catastrophic.
And with that, we’re done with this lecture and I will see you in the next one.