This is one of those things you wait for, like the other shoe to drop: with the movement away from the data center to the cloud, it was only a matter of time before malware followed the migration. There is no reason to think that your cloud applications are any more secure than any of the other applications you run.
Computer Associates, along with PCW, MX, and other news sites, is reporting this morning that the Zeus Trojan has been found using Amazon's EC2 (Elastic Compute Cloud) as a command and control server for a botnet operation. Being in the cloud does not absolve anyone of good system administration, nor does it absolve anyone from updates, patches, and generally checking the health and welfare of the systems they run in any cloud computing environment. The cloud environment is only as safe as the security engineers and system administrators make it. If you are running applications that can be hacked (meaning almost any application that faces the public internet), they will be, regardless of where they reside: in the cloud, in the data center, on the desktop, or on a mobile device.
When I did my first big cloud computing project back in March, there were a lot of hoops to jump through, and part of it was learning how to connect to a Linux AMI from Windows.
One of the nice things about using the Linux AMIs at Amazon is the use of PKI keys to authenticate as root. This at least helps against people wanting to sudo their way to root; without the key they won't be able to take that simple and often-used hacking step.
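To give a concrete feel for what key-based access looks like, here is a minimal sketch using the Python paramiko library. Paramiko, the hostname, and the key file name are my own illustrative choices, not necessarily what you would use; from Windows, PuTTY with a converted key works just as well.

    import paramiko

    HOST = "ec2-0-0-0-0.compute-1.amazonaws.com"  # placeholder public DNS of your instance
    KEY_FILE = "my-ec2-keypair.pem"               # placeholder private key downloaded from AWS

    # Load the private key that matches the key pair the instance was launched with.
    key = paramiko.RSAKey.from_private_key_file(KEY_FILE)

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # accept host key on first connect
    client.connect(HOST, username="root", pkey=key)  # no password; the key is the credential

    stdin, stdout, stderr = client.exec_command("uname -a")
    print(stdout.read().decode())
    client.close()

Without that .pem file there is nothing to guess or brute-force, which is the point.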
The other part was making sure that the update tools, yast/yum, worked as they needed to, and one of the core issues with some of the Linux AMIs was that they were not easily updated through yum. That is a liability, so when choosing a Linux AMI, one of the first things you should do right after logging in is confirm that you can update applications through yum. This will at least help ensure that the common core components of the Linux distribution are easily updatable and can then be folded into your normal update rotation.
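As a sketch of that check, the snippet below (same placeholder host and key names as above, and again assuming paramiko) runs yum check-update remotely and looks at the exit status: yum returns 0 when nothing needs updating and 100 when updates are available, so anything else usually means broken repositories or metadata.

    import paramiko

    def yum_is_healthy(host, key_file, user="root"):
        """Return True if 'yum check-update' runs cleanly on the remote instance."""
        key = paramiko.RSAKey.from_private_key_file(key_file)
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=user, pkey=key)
        try:
            stdin, stdout, stderr = client.exec_command("yum check-update")
            status = stdout.channel.recv_exit_status()
            # 0 = nothing to update, 100 = updates available, anything else = trouble
            return status in (0, 100)
        finally:
            client.close()

    if __name__ == "__main__":
        print(yum_is_healthy("ec2-0-0-0-0.compute-1.amazonaws.com", "my-ec2-keypair.pem"))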
The next step, and this is the really important one, is to keep up with patches and updates just as you would with any other computer system. Windows and Linux can both be updated centrally using any one of the automatic update tools that are out there. Pushing patches and making sure the applications you are running in the cloud are kept updated is step three.
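A bare-bones version of that push, again with paramiko and placeholder hostnames, might look like the sketch below; a real rotation would add logging, error handling, and scheduled reboots for kernel updates, or hand the whole job to a proper patch management tool.

    import paramiko

    HOSTS = [
        "ec2-0-0-0-1.compute-1.amazonaws.com",  # placeholder instance names
        "ec2-0-0-0-2.compute-1.amazonaws.com",
    ]
    KEY_FILE = "my-ec2-keypair.pem"

    key = paramiko.RSAKey.from_private_key_file(KEY_FILE)

    for host in HOSTS:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username="root", pkey=key)
        # Apply all available package updates non-interactively.
        stdin, stdout, stderr = client.exec_command("yum -y update")
        print(host, "exit status:", stdout.channel.recv_exit_status())
        client.close()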
What it looks like, though, is that whoever ran the compromised instance did not update an application, and that let the person pushing the Zeus Trojan into the cloud system as an opportunistic target. Hackers will always go for the easy target first and bypass the harder ones. Hackers will also use just about any system, from Twitter to RSS, to run C&C operations for their networks. Botnets are how hackers build their communications systems, and they are something you want to ensure does not happen to your cloud computing systems. All it takes is doing the normal, standard update, patch, and security operations that fit in line with the policies and procedures the company has for securing computer systems.
The cloud is just one big computer system, managed according to the standards and practices of the company running it. Companies should treat their cloud assets and cloud software like any other asset under their control and practice safe security.
(Cross-posted @ IT Toolbox)
Malware is evolving so fast that I am surprised the security companies are able to keep up as well as they have. It would not be surprising to see detection rates in the 79% area, but we get 98-99% rates and still complain.