Not Very Shocking — Why Now?
On-line Opinion Magazine…OK, it's a blog

Not Very Shocking

Marcy Wheeler notes that the Utah server farm is not doing well.

This is nothing new. All kinds of server farms are built around the world every year, so this isn't experimental. It needs a lot of power, so the utility will be dedicating major high-voltage lines to feed a small electrical substation that serves the building. Because they will be powering a lot of electronics, that substation is a good place to filter and stabilize the power before it is fed into the service entrances and circuit breaker panels.

This isn’t small, fiddly work – it’s industrial electrical construction. They are pulling a lot of wire through conduit, but it is color-coded and numbered wire, most of it 10 gauge or larger for the long runs. Having blowouts that cost $100K means bad equipment or incompetent staff.

The technology involved has been basically the same for decades. Unfortunately, the tendency of even the most straightforward government contracts to fall victim to rip-offs and cost overruns has an even longer history.

7 comments

1 Kryten42 { 10.09.13 at 5:17 am }

LOL Yeah! 🙂

There are some good & informative comments in that post. 🙂

I was reading a Forbes article about it:

The NSA’s Hugely Expensive Utah Data Center Has Major Electrical Problems And Basically Isn’t Working

Worse, it sounds from the WSJ’s reporting as if the contractors — architectural firm KlingStubbins which designed the electrical system, along with construction companies Balfour Beatty Construction, DPR Construction and Big-D Construction Corp — are still scrambling to figure out what’s causing the problems. The Army Corps of Engineers sent its “Tiger Team” to sort things out this summer but they were unable to pinpoint exactly what’s wrong.

“The problem, and we all know it, is that they put the appliances too close together,” a person familiar with the database construction told FORBES, describing the arcs as creating “kill zones.” “They used wiring that’s not adequate to the task. We all talked about the fact that it wasn’t going to work.”

I should be surprised that the USACE can’t figure out a basic problem, but I’m not. And I’m sure it isn’t lack of technical ability that’s the issue.

NPR has an overview and the *official* BS from the organisations involved:
NSA Says It Has ‘Mitigated’ Meltdowns At Utah Data Farm

However, according to the Salt Lake Tribune:
Shhh … NSA’s Utah Data Center may be open already

😉 😆

All this is unsurprising to me. I’ve dealt with US Military and also major Organisation incompetence before. The last time was in 2000, when my company did a last-minute fast security assessment of a new data center for PowerTel in Melbourne before its opening. The Manager had some concerns over the *official* security assessment report, for which they had paid a US security consultancy firm a large sum of money, and which concluded that “Everything was fine and they were good to go.” Here, I’ll post the first part of our report after 2 hours simply looking around the facility:

July 2, 2000
Summary of our brief look at PowerTel’s Security.

Exterior and Entrances

We could not ascertain the presence of any monitoring devices such as cameras on the exterior of the building or in the foyer or underground Car Park.

It appears that the building Car Park has no serious entry barrier. The only barrier device we could determine was a simple card-activated boom gate at the entrance. This would allow for easy access to the elevator from the Car Park.

No security checks upon entry. Our identities were not checked and we were not required to sign a logbook. We were not issued with any form of visible identification (e.g. visitor badges).

Primary Server Room

The primary Cache Server (large yellow box) had the standard issue computer key lock. This allows easy access to the internals of this server.

The primary Router (Cisco) did not have the main drive locked in place, whilst the slave drives were locked. This is despite a clear warning notice that stated: “Any removal of this drive will cause data loss and may cause malfunction”. The cable connecting this Router was simply lying on the floor of the narrow walk space between the racks. Simple clumsiness or a determined effort would have easily resulted in the disruption of the operation of this facility.

Whilst the backup Power Supplies for the facility are apparently securely locked, the keypads for the locks are unsecured and there is no monitoring.

We noticed an unsecured Windows Terminal running MS SQL-Server. It would have been a simple matter to shut down or reboot this machine. It was also connected directly to what we discovered was one of the facility’s monitoring/management servers, which could also be compromised.

One of PowerTel’s clients had a rack of Sun servers. The displays showed the standard (and default) Solaris logon screen. This implies that no security patches have been applied and allows for known attacks against these machines.

The customer up-links were contained in a rack with standard glass doors. This would allow for easy access and disruption of the facility.

Whilst we were being shown around the facility, two engineers from Nortel were installing Fiber-Optic links. The engineers answered all our questions and we were able to gather a great deal of information about the capabilities and use of this facility. The engineers assumed (through no fault of their own) that we were PowerTel employees. This was in the presence of a PowerTel employee, who made no attempt to stop our questioning.

There were no security barriers on the windows on the floor housing the primary data-center. Entry could easily be obtained from the roof of the building on the North side, which was separated only by a narrow laneway.

The only camera surveillance in the main data-center was focused on the main entry doorway leading from the elevators. Two emergency exits were unmonitored apart from a standard, and easily bypassed, alarm circuit.

At the time of our tour, we noted that the Security Office was unmanned apparently because it was lunch time.

Yep! Same old… yadda! 😉 😀

2 Kryten42 { 10.09.13 at 5:53 am }

BTW, Al Jazeera America have a good article up:

Report: Government data collection imperils liberty and security

“Brennan Center for Justice Report says US government collects vast amounts of data for no perceptible reason”

3 Bryan { 10.09.13 at 12:29 pm }

It sounds like they saved money or skimmed by under-sizing the wire and over-sizing the circuit breakers. Instead of the breakers tripping, the wiring is melting.

The ‘appliance’ reference may be to the air handling motors: too many on a circuit, and no sizing for the start-up load. In a normal industrial setting the start-ups are staggered to reduce the peak load requirement, which saves money as well as reducing stress on the system. There is a separate charge on the electric bill for the peak load drawn, so you do not start all of the motors at the same time.
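A rough back-of-the-envelope sketch of why staggering matters. All the numbers here are assumptions for illustration (per-motor load, number of motors, and the commonly quoted ~5–7x inrush multiple for induction motors at start-up), not figures from the facility:

```python
# Assumed figures, for illustration only.
RUNNING_KW = 15.0      # running load per air-handler motor (assumed)
INRUSH_FACTOR = 6.0    # inrush multiple during start-up (assumed)
NUM_MOTORS = 8

# Simultaneous start: every motor is drawing inrush current at once,
# so the inrush peaks stack on top of each other.
peak_simultaneous = NUM_MOTORS * RUNNING_KW * INRUSH_FACTOR

# Staggered start: only one motor is at inrush at any moment; the motors
# that have already started are down to their running load.
peak_staggered = (NUM_MOTORS - 1) * RUNNING_KW + RUNNING_KW * INRUSH_FACTOR

print(f"simultaneous peak: {peak_simultaneous:.0f} kW")  # 720 kW
print(f"staggered peak:    {peak_staggered:.0f} kW")     # 195 kW
```

Since the demand charge is billed on the highest peak drawn, the staggered schedule is the one you pay for.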

This isn’t rocket science. The requirements of the ‘appliances’ are published, including start-up loads, and the load capacity of each wire gauge is well known even to ‘handyman/DIY’ people. I oversize wire and plugs in rehabs, because I know people will always overload the 15 amp sockets in most houses. It’s cheap fire insurance.
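The basic sizing check is simple enough to write down. A minimal sketch, using common NEC-style ampacities for copper conductors (illustrative values; real sizing also depends on insulation rating, ambient temperature, and conduit fill):

```python
# Illustrative copper-conductor ampacities: AWG gauge -> amps.
AMPACITY = {14: 15, 12: 20, 10: 30, 8: 40, 6: 55}

def breaker_protects_wire(gauge_awg: int, breaker_amps: int) -> bool:
    """True if the breaker trips before the wire overheats: the breaker
    rating must not exceed the conductor's ampacity."""
    return breaker_amps <= AMPACITY[gauge_awg]

# 15 A breaker on 12 AWG (20 A) wire: oversized wire, cheap fire insurance.
print(breaker_protects_wire(12, 15))  # True

# 30 A breaker on 12 AWG wire: the wire melts before the breaker trips.
print(breaker_protects_wire(12, 30))  # False
```

The over-sized-breaker failure Bryan describes is just the second case: the protective device is rated above what the conductor can carry.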

Everyone has become so concerned with ‘cyber-security’ I’m surprised they even remember to put locks on their doors, much less take physical security seriously. The problem with burglars isn’t really what they steal, but what they destroy in the process. They aren’t going to disconnect cables to steal equipment; they are going to cut them.

4 Steve Bates { 10.09.13 at 5:06 pm }

What’s shocking about the security described in the article Kryten references is that it is vastly inferior to what I’ve observed in industrial settings in my IT contracting days.

Once I had to go to the Shell Data Center, the international home of (among other things) their archived seismic data, for a meeting about a system we were developing that was intended for company-wide use. I already had badged access to another Shell office complex where I worked… but that was not nearly enough. I was escorted in the main door by my Shell administrator. Both of us were escorted by security staff to a different office where we were identified and badged in. More security staff escorted us to the meeting room. When the meeting was over, we paged still more security officers to reverse the process to get us out of the building. And this was a “mere” oil corporation, not a government agency whose entire reason for existence is security. Somehow, I’m shocked… but not surprised.

5 Bryan { 10.10.13 at 12:39 am }

When the management is concentrating on reducing costs to ensure bigger bonuses, it is not surprising that standard precautions of yesterday are ignored. They are an expense, an overhead, without any obvious return, so they get dumped. Most of the places that still have some semblance of security defeat the purpose by outsourcing it to companies that have no incentive to really care if their employees are actually trustworthy, only that they are willing to accept the lowest wages possible.

The greed of the current corporate management style is the seed of the corporation’s destruction, but as long as the money is coming in, no one in power cares.

6 Badtux { 10.10.13 at 1:56 pm }

I’m sort of dubious about this story. It seems rather… convenient. I can imagine that there are issues with the power supply at this facility (it’s sucking up most of the output of a power plant, for cryin’ out loud), but sheesh.

7 Bryan { 10.10.13 at 2:59 pm }

Yeah, this is all standard, cookie-cutter stuff, not whole new concepts. Someone has to have made a stupid mistake. It is time to check the math on the plans, and check the specified plans against the ‘as built’ plans.