Interview Series - Mourad Ben Lakhoua SecTechno Founder

In the next instalment of the Infosec Professional interview series, we talk to Mourad Ben Lakhoua, security researcher, SecTechno founder and contributing editor of The Cloud Security Rules.

Ed: Hi Mourad, and thanks for taking the time to answer this week's questions.
How has information security changed in the last 3 years?
Mourad:  Over the last few years there has been a big change in information security, as it has become an important part of today's innovation.  In the past we had no social networks, VoIP communication or cloud computing.  We are now detecting more new malware targeting these newer forms of technology, providing a new income stream for the cyber-criminal.

DDoS attacks are also on the rise, with new automated applications that give criminals a way to shut down a website from a single host.  But the question that we should always ask is 'are we prepared for such attacks?'

Previous incidents have shown that few organisations have been prepared, and the reaction to protect our infrastructure always comes after a system has been hacked, never as a preventative approach.

What do you think are the main threats facing organisations in 2012?
Mourad:  Last year we saw the evolution of Stuxnet into Duqu (another kind of malware targeting industrial systems), which in the past have usually been highly secured and isolated.  2012 will continue to see other targeted attacks conducted by different hacking groups, with greater focus on existing web applications, CMS systems and web servers.

Social network phishing and spamming is also going to rise.

Mobile malware such as SpyEye and ZeuS will continue to target embedded systems, as they are easily and remotely accessible.  It's likely that Android is going to be the first main target.

Are organisations ready to deal with those threats and what can they do to protect themselves?
Mourad:  If we are talking about SMBs (small to medium-sized businesses - ed.), then there is a big challenge for them to implement the right security measures.  Relying totally on technology will not protect against new threats.  If we look at the risk of getting hacked, then all organisations are at the same level of risk, but for smaller organisations the risk is often higher, as they have a limited budget for security investments.

What do you think are the main threats facing individuals in 2012?
Mourad:  There are a lot of security threats in 'cyber-space', the main risk being a lack of user awareness of things like virus spreading and global malware infestation.  As social networking increases, phishing and spam will be a concern.  A simple initial approach for users who are not IT-aware is to keep system software and anti-virus continually updated, use encryption for sensitive information and check for basic things like HTTPS and the 'padlock' symbol for secure web access.

Which service or skill will be in hot demand in 2012?
Mourad:  This year, the top demand will be for cloud computing services.  Many organisations are looking to reduce budgets by outsourcing key services that are currently hosted internally.  We are also finding many companies adopting mobile and flexible working, allowing personnel to access corporate data 24/7 by utilising on-line cloud-based portals and services.

With the proliferation of social media and mass collaboration, does security policy and governance need to have separate management for threats in this area?
Mourad:  Security policy needs to be continually updated for any new technology that appears.  Social networking is not an exception, and I think it should be managed by better internal security policy and governance.  Many companies are now using social media to promote their services, and if the site page or social media account is hacked, that can be a big loss to the company's brand and image.

Ed: Thanks Mourad for your views on the current trends for 2012.

The Password is Dead (Long Live the Password)

User and password combinations for authentication have been around for decades, arguably since the mid-1960s when MIT's time-sharing computer CTSS had a password-based authentication system.  But does simple longevity make it a good approach?

Every day, a Twitter or Google search will turn up several recent stories referring to password cracks, hacks, break-ins, losses and the like.  Password complexity policies are standard on nearly every COTS software product within the directory service, database and ERP spaces.  Password complexity simply refers to a password of at least 8 alphanumeric characters plus a special character.  So if you apply that approach, you're safe, right?  Certainly safer, yes.  But how safe?
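As a rough sketch, a policy like that can be expressed in a few lines of Python (the exact rules here are illustrative, not a recommendation):

```python
import re

def meets_complexity_policy(password: str) -> bool:
    """Illustrative policy: at least 8 characters, containing a letter,
    a digit and a special (non-alphanumeric) character."""
    return (
        len(password) >= 8
        and re.search(r"[A-Za-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(meets_complexity_policy("password"))     # False - no digit or special character
print(meets_complexity_policy("Tr0ub4dor&3"))  # True
```

Note that such a check says nothing about how guessable the password actually is, which is rather the point of the paragraphs that follow.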

A recent report on enterprise 'worst' practices still shows the most common passwords to be things like 'password', '123456', '654321' and so on.  Not exactly imaginative [1].  The report also identified that 30% of users chose a password of 6 characters or fewer, so for the systems those users are accessing, a complex password policy clearly isn't in place.  In that case, you can't blame the user for choosing something simple and easy to remember; they'll hope the guy who works with the computamabobs will fix it, right?

So first things first: a secure basic password management approach requires buy-in from both the system administrators and the end users.  Administrators are required to implement a management policy, and users to choose a complex password they'll remember, whilst avoiding things like writing the password down or disclosing it.

From an on-line perspective, there are several tools that allow the user to save their password data within a vault which can be read securely when they access the registered site.  Google has also recently announced a new password generation tool that will automatically create a 'complex' password for a newly registered account and replay the password the next time the user accesses the site, negating the need to remember the actual password.

These sorts of approaches go a long way to proving that the main issue with password management is the user.  If left to select and manage their own password, users often choose the simplest password the policy will accept, often re-using a password on several sites and performing a simple password+n+1 the next time the password expires.

So it's all the user's fault?  Not exactly.  The encryption techniques used to store the actual password data also play their part in the circle of security.  Symmetric encryption of password data is generally seen as a poor concept, but is still widely adopted.  Whilst it allows system administrators to perform password recovery for users who have forgotten their password, it also opens up the possibility of hackers performing a cracking exercise on the encrypted value.  Hashing is generally seen as a more secure way to store the password data, along with a salt to hinder pre-computed cracking attempts.  Hashing is one-way, with the application custodian unable to recover the password into plain text.
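To illustrate the salted, one-way approach, here is a minimal sketch using Python's standard library PBKDF2 implementation (the iteration count and salt length are illustrative choices, not a policy recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a one-way hash of the password. Only the salt and the
    derived hash are stored; the password itself is unrecoverable."""
    if salt is None:
        salt = os.urandom(16)  # a unique salt per user defeats pre-computed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Re-derive the hash with the stored salt and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("password123", salt, stored))                   # False
```

Note that verification works by re-deriving and comparing, never by decrypting - which is exactly why the custodian cannot hand a forgotten password back to the user.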

The increasing use of multi-factor authentication by enterprise organisations goes some way to combating the single username and password approach.  Tokens, biometrics and one-time passwords have all become market options for organisations wishing to provide applications with a lower cracking factor.

Each multi-factor approach has a cost and benefit associated with it, and biometric-based authentication will need a satisfactory crossover error rate to make it a viable approach for authentication.

It's been 10 years since Bill Gates pronounced that the password is dead, and many new innovations have been developed to improve and ultimately attempt to remove the password as a means of authentication.  None has fully taken up the mantle as the default authentication mechanism, but such innovation surely proves that passwords on their own are failing to keep information secure.

(Simon Moffatt)

Cloud - Check Your Keys to The Castle First

The 'cloud'.  It's all around us.  Many organisations are utilising cloud-based services as part of their overall IT strategy.  This could be in the form of large-scale infrastructure such as servers and storage from the likes of Amazon, right through to smaller components such as particular business processes like identity management.  Many Managed Security Service Providers (MSSPs) offer a totally outsourced security model, with many software components available on-line and on-demand via subscription pricing and the like.

Cloud security is a big concern, and quite rightly so.  There should be great emphasis on the necessary agreements that both the client and provider sign up to.  SLAs, for example, should be well understood, as should provision demarcation points for things like server and hosting platforms.  This helps to avoid the 'no, it's on the OS, it's your problem', 'well no, you told me to patch it...' issues.

Many issues have been raised regarding data storage with external providers.  Data at rest and data in transit are always well-discussed areas.  Encryption in transit over HTTPS/SSL/TLS is quite well known, with plenty of discussion around the coffee machine over the best way to encrypt data at rest: hardware-based crypto processors, backup shredding and the like are all good for a feisty talking point.

Whilst on the topic of data, multi-tenant providers are always under the microscope to avoid data bleeding, where one client can have access to another client's data.  Strict separation, from the logical access control level right through to operational and physical separation, needs to be understood and managed.

Whilst all of this is important, it brings me to some simple points that I think are often overlooked.  Security is best managed using a defence-in-depth or circles approach, with the protection of key information assets being complete when several thin layers of security are merged together.  It's pointless having the best encrypted SAN in the western world if you don't have physical access control to the data centre.  Having a complex password policy loses its muscle if you don't have well-managed access control lists on the data shares, and so on.  The same can be applied to cloud providers.

For example: to set up an account on any number of platform/server/data/hosting providers, you simply need an email address, password and valid credit card details.  In a few minutes I could launch a few servers, copy several gigs of data and host a new web platform, all with a few clicks.  The single entry point into that environment, which has just become an extension of my organisation's IT department, is an email address and password.  Any attacker now has a much smaller attack space to hit: one password is all that separates a malicious user from a host of services and data that are provided externally.

So, whilst the underlying components may well be secured, the entry point to the externally provided components is actually much smaller.

It's often worth thinking like a user (sometimes a malicious user) when managing and implementing any external internet or 'cloud'-based extension of your IT provision, as it's always the smallest things that are often the most costly.

(Simon Moffatt)

Interview Series - Barry Hodge CEO SecurLinx Corporation

For the next entry in the Infosec Professional Interview Series, we chat with Barry Hodge, CEO of SecurLinx Corporation, on biometric authentication and the current trends in information security.

Ed: Hi Barry and thanks for spending the time with Infosec Professional.
How has information security changed in the last 3 years?

Barry:  As more companies expand their core operations to include web or cloud based services, the potential for compromised information flow and financial losses has grown exponentially. Identity management is extremely difficult in the virtual world and even the most routine interactions can have severe consequences. Most business leaders feel the pressure to have a presence on social media without a clear understanding of the risk/reward ratio of doing business in the “Wild West”. In my prior experience in corporate America most losses of intellectual property or business information were inside jobs. Now the losses can occur without the management team even being aware of the breach. As unsettling as these prospects are, they pale in comparison to the threat of the liability and business killing publicity associated with having losses of customers’ and employees’ private information. Awareness of the problem is high but solutions are expensive, often ineffective and can inhibit organizational productivity.  

What do you think are the main threats facing organisations in 2012?
Barry:  Other than a growing exposure to a major incident, I don’t see anything much different from the current issues in securing access to physical places and information systems. Protection from real financial loss and increasing liability for stored data are still the two major concerns. That being said, there are other localized or niche issues that are trickling down in the commercial marketplace. From health clubs to construction sites, identity management and productivity losses can cost companies dearly in a time when profit margins are slim and the viability of the business is threatened.

Are organisations ready to deal with those threats and what can they do to protect themselves?
Barry:  Some are, some aren't. Larger enterprises have solid security plans and measures in place. In my opinion, smaller organizations are open to finding a solution but they are not getting good advice or consultative services from their vendors. Most security technology companies are small relative to their potential customers and approach the sales process from a narrow point of view involving their particular product or service. Organizations that develop a holistic security plan and engage vendors who openly collaborate for the customer's benefit will reap the greatest rewards. It starts with a strong in-house or contracted service for IT integration. Once that is in place, working on specific problems for performance improvement follows a well-known path and enables the user to select the appropriate solutions. The best protection is to develop a plan and an implementation program. Getting started is the difficult task and sometimes you just have to take that first small step out of the comfort zone. On the vendor side of the equation, there is still a lot of technology and very little supportable product. Choose wisely.

The last 3 years have seen global organisations make significant inroads in protecting data from a logical and network perspective.  Does physical access control need to play a greater part, and are organisations aware of its benefits?
Barry:  Physical security is becoming more significant for several reasons. As the economy has weakened, the workplace is less stable and the potential for damage through vandalism by disgruntled employees is on the rise. Add to that the threats of anything from terrorism to Occupy Wall Street mischief and the physical environment is highly vulnerable. Theft is always an issue but increasingly so in a down economy. Innovations in biometric modalities such as facial recognition and iris scanning can increase productivity and reduce cost of use while significantly improving security. One of the first places an organization should examine in a comprehensive security program is physical access control.

Infosec has now become its own profession, with job titles, budgets and certifications.  What challenges do infosec professionals face in 2012?
Barry:  The biggest challenge in our industry is the velocity of change. Information security is an arms race as the opposition keeps upping the ante and we play defence by applying countermeasures to threats. Speaking as one with a loud voice to increase the criminal penalties for online activities that cause damage is one opportunity. Information systems terrorists are just as lethal to our economy as those that do physical damage to infrastructure. Deterrence is our greatest challenge.

What are the key questions your clients ask when looking to select a product or services offering?  Experience, RoI, cost etc?
Barry:  Our clients seek all of the above with an emphasis on ROI. Cost will decline as acceptance and volume increase. ROI is the first barrier that must be overcome. Most companies tend to overestimate what can be done in a year and underestimate what can be done in ten. The advances in the last 10 years have made biometric solutions cost effective. The next 10 will be amazing.

With the global credit crunch affecting budgets across all areas, is security now seen as a luxury good for many projects?
Barry:  Security is looked at by most companies as a cost of doing business and if my competitor isn’t investing, I can let it go too. My personal opinion is that security can be a competitive advantage if it increases employee productivity and decreases cost. It is our job to design and implement solutions for our customers that do just that. Technology should facilitate the provisions of better security and lower the cost of ownership to the organization. I believe that is possible today.

Ed: Thanks Barry for your time today and giving us your insight.

SIEM - 1m Lines of Noise to 1 Line of Music

Security Information and Event Management (or Monitoring) has been around for a while, and was seen as the saviour for compliance initiatives regarding intrusion, abnormal usage, insider threat, Denial of Service attacks and more.

Nearly every computational device will store a record of internal transactions that can be used for monitoring, troubleshooting or forensic analysis.  I recently heard of a murder case that used the program history of a washing machine to prove the accused had in fact used the washer on the night of the murder to wash away evidence.  That is probably an extreme example, but any device, script or piece of code worth its salt will give a verbose view of what is happening, either to the console or to a file.

The format of the log file has long been under discussion, with several different 'standards', such as the Common Event Format, vying to become the standard.  Basically, the transaction history written to the file output should contain the time and date of the transaction, the transaction itself (generally the computational transaction, sometimes with a more detailed description), the ID that initiated the transaction and any other context information such as IP address, node or form.
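As a sketch, a structured log line carrying those fields might be broken out like this (the line format shown is invented for illustration and is not the actual Common Event Format):

```python
import re
from datetime import datetime

# Hypothetical log line: timestamp, initiating ID, transaction, source IP
LINE = "2012-03-14 09:26:53 uid=jsmith action=LOGIN_SUCCESS src=10.0.0.42"

PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"uid=(?P<uid>\S+) action=(?P<action>\S+) src=(?P<src>\S+)"
)

def parse_line(line):
    """Split a log line into the fields a SIEM would later correlate on."""
    match = PATTERN.match(line)
    if match is None:
        return None  # line doesn't conform to the expected format
    event = match.groupdict()
    event["ts"] = datetime.strptime(event["ts"], "%Y-%m-%d %H:%M:%S")
    return event

event = parse_line(LINE)
print(event["uid"], event["action"], event["src"])
```

Whatever the exact format, the point is that each line decomposes into the same who/what/when/where fields.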

Now, the idea behind SIEM is to provide a centralised view of all this log data, from network devices such as routers, firewalls and switches, through to directory services, core applications and even proximity door access.  Aggregating the log processing seems like a great idea: a single point of entry reduces the need for silo'd, point management of so many different log output streams.  A centralised view brings with it the opportunity to perform some intelligent analysis of all this information.  The word 'all' shouldn't be underestimated, with many verbose log streams recording basic network activity or logon/logoff activity shifting a million-plus lines of information a day when aggregated.

Trying to analyse so much data requires strict rules to help remove the noise and false positives.  One of the key functions the SIEM solution should perform is correlation: linking the different log streams based on a unique identifier.  Now, not all log data will contain the same identifier.  The directory logon may use an email address, a mainframe application may use a userid, and the firewall may only contain source and destination IP addresses.  In an ideal world, the SIEM system should be able to correlate these and provide a 360-degree view of user activity, from door access to packet delivery.  So we have a joined-up and centralised view of activity.  Now what?
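A toy illustration of that correlation step, mapping each stream's native identifier back to a single canonical identity (the lookup tables here are hypothetical; in practice they would come from the directory or an identity store):

```python
# Hypothetical lookup tables linking each stream's identifier to one identity
EMAIL_TO_ID = {"j.smith@example.com": "jsmith"}
IP_TO_ID = {"10.0.0.42": "jsmith"}

def correlate(events):
    """Tag each event with a canonical user ID so the streams can be joined."""
    for event in events:
        if "email" in event:          # e.g. a directory logon record
            event["identity"] = EMAIL_TO_ID.get(event["email"])
        elif "src_ip" in event:       # e.g. a firewall record
            event["identity"] = IP_TO_ID.get(event["src_ip"])
        else:                         # e.g. a mainframe record
            event["identity"] = event.get("userid")
    return events

events = correlate([
    {"source": "directory", "email": "j.smith@example.com"},
    {"source": "firewall", "src_ip": "10.0.0.42"},
    {"source": "mainframe", "userid": "jsmith"},
])
print({e["identity"] for e in events})  # {'jsmith'}
```

Three streams with three different native identifiers collapse into one user's activity trail, which is the 360-degree view described above.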

Like any alerting system, it's important to know what is an alert and what is a false positive.  The process for defining an alert will vary, but could include static policy definition: creating criteria that are deemed to represent a security risk and flagging whenever they are met.  For example, a user accessing a particular file or performing a certain transaction, or the number of requests from a certain source IP address reaching a certain threshold.  Creating those policies requires an analysis of previous insecure activity in order to understand what constitutes a risk in the future.
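A static threshold rule of that last kind can be sketched in a few lines (the field name and threshold are illustrative):

```python
from collections import Counter

def flag_noisy_sources(events, threshold=100):
    """Flag any source IP whose request count meets or exceeds the threshold."""
    counts = Counter(e["src"] for e in events)
    return {src for src, n in counts.items() if n >= threshold}

# 150 requests from one host, 12 from another
events = [{"src": "10.0.0.5"}] * 150 + [{"src": "10.0.0.9"}] * 12
print(flag_noisy_sources(events))  # {'10.0.0.5'}
```

The hard part, as the text notes, is not the rule itself but choosing a threshold that separates genuine risk from everyday noise.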

Another approach is to analyse existing 'normal' behaviour and then identify anything that falls outside of this baseline.  This form of behavioural analysis can provide much more depth and scope of alerting.  Baselining can be done by grouping log events together, either based on their origin or time, or by individuals and the teams or jobs they perform.  Once a baseline has been created, it becomes easier to identify deviations and in turn manage the potential risk associated with them.  A deviation could be an activity that has occurred out of hours or from a different IP address.  Access exceptions can also be tracked, by looking for individuals who have access above and beyond a typical baselined user from a particular group.
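One simple baseline check, flagging activity outside a group's normal working hours, might look like this (the baseline window is an assumption for illustration; a real system would derive it from historical data):

```python
from datetime import datetime

# Hypothetical baseline: this team normally works 08:00-17:59
BASELINE_HOURS = range(8, 18)

def out_of_hours(events):
    """Return the events whose timestamp falls outside the baselined hours."""
    return [e for e in events if e["ts"].hour not in BASELINE_HOURS]

events = [
    {"uid": "jsmith", "ts": datetime(2012, 3, 14, 10, 15)},  # within baseline
    {"uid": "jsmith", "ts": datetime(2012, 3, 15, 2, 47)},   # deviation
]
print(out_of_hours(events))
```

The same shape of check applies to any baselined attribute: swap the hour test for a source-IP or entitlement comparison and the rest stays identical.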

With any large volume of data, it's important to develop an intelligence layer to help drive where limited resources should be focussed.  SIEM solutions are a step in the right direction by creating a platform that can allow further intelligence and context to be applied.

(Simon Moffatt)

Interview Series - Javvad Malik CISSP GIAC GWAPT

For the next entry in the Infosec Professional interview series we are lucky enough to get the views of Javvad Malik, an independent consultant with a deep specialism in risk management and security transformation programmes.

Ed: Hi Javvad.  Thanks for agreeing to the interview.  How has information security changed over the last 3 years from a perception, threats and protection aspect?
Javvad: I see a lot of people talk about how much information security has changed in recent times. But the reality is that information security itself hasn’t changed. The fundamentals are all still the same. We’re still protecting the same types of data, in roughly the same environments against more or less the same threats. 

What has changed are company business models. You have companies who 3 years ago had a big high-street presence now shutting down their shops and moving totally online. This has led to their online site getting a much higher priority in terms of business value (it is the business). So the mindset within organisations towards information security has shifted. Add to this the big spotlight that has been shone on information security in the last few years, with major breaches and 'hacktivists', and you get a lot of people thinking information security is something new or radically different from a few years ago.

What do you think are the main threats facing organisations in 2012?
Javvad: Major change. Banks are integrating or separating large portions of their business. Other companies are shifting their business models to more outsourced solutions, or pushing their customers to on-line channels. All of these changes are usually run under tight timescales, and the biggest threat is that the basics of security are overlooked in favour of delivery.

Are organisations ready to deal with those threats and what can they do to protect themselves?
Javvad: As a whole, when you look at the capabilities of organisations, dealing with the threats is relatively simple. The real challenge is to get security embedded throughout the organisation in a consistent manner. The most important thing they can do though is to educate their workforce and make them aware of how they are integral to protecting their organisation.  

2011 saw a buzz around 'APT', corporate hacking and 'hacktivists'.  Which do you think will be the most important for 2012 or is there something else on the horizon...
Javvad: So far, these attacks have focussed on the confidentiality and availability aspects of attacking a company. You've had sensitive information leaked and you have had websites subjected to denial of service attacks.  What no-one has been talking about much is the integrity of data, which, in my opinion, if targeted in an attack could have far more impact on organisations. Having your customers' records leaked is one thing. Having an unreliable set of customer records totally changes the game.

Infosec has now become its own profession - with job titles, budgets and certifications.  What challenges do infosec professionals face in 2012?
Javvad: One of the biggest challenges that’s been facing infosec professionals for quite some time now, is the fact that they operate very much within an echo chamber. Within organisations, they still don’t have much influence outside their own circle. There are hundreds of security conferences that take place across the world annually. These end up being run by security professionals, for security professionals, for the benefit of security professionals.

We need to engage the wider business and customers at large to make them more informed of security. A handful of security professionals trying to secure a global organisation of 100k people will always struggle.

With the proliferation of social media and mass collaboration, does security policy and governance need to have a separate approach for the management of threats in this area?
Javvad: Not really. These aren't new threats. They are existing threats that have evolved and grown slightly. Companies seem to have a fetish for creating new policies and new teams to manage something that looks new and scary.  Existing policies should be sufficient to cover most, if not all, of these threats. What's more important is ensuring your staff are fully aware of what the dangers are and how they can best protect themselves and their organisations whilst using these tools.

Ed:  Thanks Javvad, some interesting points.  Thanks for spending time to share your views and experience for what 2012 may bring.

Identity Provisioning to Identity Intelligence

Identity provisioning has evolved significantly over the last 8-10 years, with suite and point products providing an advanced array of system connectivity, workflow, audit, compliance and role life-cycle features to help manage user identities.  Why is this important?

The user identity and its associated system accounts are a key area of information security control that many compliance initiatives, such as ISO 27001 clause 11 or SOX 404, focus on.  With the rise of insider threat, a complete and effective user life-cycle management process is key.

Provisioning normally includes the following basic use cases:

  • CRUD (create, read, update, delete) actions for multiple system accounts centrally
  • Policy based associations and approvals
  • Role Based Access Control for entitlement association
  • Certification, audit and reporting for previous access control associations
  • Integration with an authoritative source of user identities

As provisioning has matured and become a standard requirement for many large organisations, so too have the products, vendors and service offerings that help to implement this sort of landscape.  With this maturity comes the need to derive much more business reporting, to help drive RoI and TCO decisions as well as to understand the effectiveness of identity controls and processes.

Where does this business information or 'intelligence' come from?  Most provisioning solutions will have touch points with many different platforms, components and services: feeds from authoritative sources, connectors to target systems, workflow queue information, historical reporting information, failed requests, policy breaches and so on.  Normally this information will be stored in an RDBMS, or a database with pointers to where that information resides.

The intelligence layer should attempt to transform the raw component infrastructure data into something that resembles useful business information, such as:
  • Which departments are seeing the highest access request changes?
  • Which roles are the most over or under used?
  • Which Separation of Duties policies are continually being breached?  Is this a user access issue or a policy definition issue?
  • Can we identify high risk users, access or transactions?
  • What reduction in help desk calls was attributable to self-service password resets?
  • Which users have seen the most access changes in the last 12 months?
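As a sketch, the last of those questions could be answered from the provisioning system's audit trail like so (the record layout and event names are hypothetical):

```python
from collections import Counter

# Hypothetical audit records from the provisioning system's history store
audit_trail = [
    {"user": "asmith", "event": "ROLE_ADDED"},
    {"user": "asmith", "event": "ENTITLEMENT_REMOVED"},
    {"user": "bjones", "event": "ROLE_ADDED"},
    {"user": "asmith", "event": "ROLE_ADDED"},
]

def most_changed_users(records, top=5):
    """Rank users by the number of access changes over the reporting period."""
    return Counter(r["user"] for r in records).most_common(top)

print(most_changed_users(audit_trail))  # [('asmith', 3), ('bjones', 1)]
```

Grouping the same records by department, role or policy instead of user yields answers to the other questions in the list.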
This is all pertinent business information, helping not only to show value from a complex middleware infrastructure, but also to identify where security effort, and in turn underlying risk, is located.

Identity provisioning projects tend to be long, complex affairs requiring deep business and technical understanding.  Whilst adding an intelligence layer may seem like added complexity, the reward in the form of increased business and risk understanding should make it effort well spent.

(Simon Moffatt)