
The (Immediate) Future of Ransomware

April 26th, 2016 | No Comments | Posted in IT Industry, Security

In keeping with the fact that individuals and enterprises are seeing and experiencing many more occurrences of ransomware, I’m also seeing a lot of articles and comments that either discuss it and what to do about it or provide siloed indicators of where ransomware might go.

A number of these comments, in my opinion, are aimed at what ransomware has been up until now and how to combat it. Very soon, few if any of these suggestions are going to be effective in stemming the tide of ransomware. Ransomware is already exhibiting some horrifying variations that we aren’t taking into consideration fast enough.

To Really Protect, Think Like a Criminal

It’s no coincidence that some of the best minds out there when it comes to really understanding IT security, actually addressing risk and stopping these kinds of well-conceived, well-formed attacks come from those who lived on the dark side and have come to the light – former hackers like Kevin Mitnick, Robert Morris and others. And we have a lot of white hats (too many to name here) who are extremely good precisely because they (a) think the same way as the criminal black hats and (b) have the same incredibly intimate technical knowledge that black hats do.

And don’t think your servers are safe. Hackers are already looking to get inside of your data center and maliciously encrypt and hold for ransom as much of your company as they can.

To effectively handle these malevolent attacks, you can’t be standing still. The whole history of dark-side hacking, breaches and generalized wreaking of havoc paints a storied picture of never standing still. Once something is proven technically possible, the very next thought of a highly sophisticated hacker is “How can this be extended?!”

Almost all hacks start out as “let’s try something” attempts. Initially, even conceiving of a new vector often takes intimate, expert knowledge of the target (usually an operating system, but sometimes a language flaw or some other piece of the architecture). But once a potential vector is exposed as valid, it’s game on. There’s the initial hack, and then all the “mods” (modifications) that go with it come like a flood. (Reference the graphic attached to this article.)

Ransomware is no different. Just when you think you’ve got the attack scheme and the attack vector figured out, so many mods are hitting you that it makes your head swim. I’m seeing some well-meaning articles that state “if you just do this, this and this, you can stop ransomware.” If you do those things, yes, you can stop the ransomware of today or of last week. But you won’t be doing much to stop the ransomware of next week, next month or three months from now.

Hackers are always thinking fifteen steps ahead. It’s time we started doing the same. Here are some things to “look forward to” and expect when it comes to ransomware. A lot of these mods are already in the wild! And if they aren’t, you can be sure hackers are already working on them: More »


The Case for “Encrypt Everything”

April 13th, 2016 | No Comments | Posted in Uncategorized

Within the IT industry, when considering data-at-rest (DAR) encryption, you may have noticed recently that security experts seem a little divided on how to leverage and apply this technology. Many experts have stated that only sensitive data should be encrypted. Others seem to be preaching a “new” gospel that all data-at-rest should be encrypted. Why this philosophical split, and what, if anything, has taken us down this path? Does a right answer exist, and are there advantages to one strategy over another?

I will state up front, as an initial thesis, that historical obstacles around encryption have driven a lot of its present-day conception and (lack of) acceptance, adoption and use within enterprises. If we take a quick look back in time as well as sample some uses of encryption today, I think it’s pretty easy to demonstrate that without these past obstacles, the intermediate and precautionary concept of “encrypt only sensitive data” would never have come into play, and enterprises today would be encrypting all data-at-rest as standard operating procedure.

Historical Obstacles to Pervasive Encryption

Pervasive DAR encryption has always held at least conceptual appeal. As companies have faced escalating threats to their data in recent years, the idea of restricting data and information to only those parties with “need to know” status has quickly grown into an attractive option. But in the past, that attraction and appeal could never find realization due to the sizable obstacles surrounding DAR encryption. Few realize that the dynamics around enterprise DAR encryption have changed such that these ambitions can now be very readily realized.

Taking an encrypt everything approach releases organizations to better and more quickly address expansion needs and develop and adopt efficient and cost-effective operational and scaling strategies while simultaneously and immediately addressing risk…

But before jumping too far ahead, let’s quickly review some of the main obstacles that have driven most companies to apprehension around DAR encryption: More »

The Problem of Non-User IDs in Organizations Today

February 4th, 2016 | No Comments | Posted in General Idm/IAM, IdM Engagement

(The contents of this article are captured here and reflected back in response to an article posted on SailPoint’s Identity Quotient Blog entitled “Third-Party Contractors: The Target Breach’s Bulls-eye.” I recommend reading that article to establish context for this one.)

It is fairly well known and pretty much public knowledge that the Target breach took place by leveraging 3rd-party credentials phished from an associated Heating, Ventilation and Air Conditioning (HVAC) vendor. This was the initial point of entry into the Target network.

However, the HVAC credentials were leveraged primarily for initial access; credit card data was not being accessed and siphoned using that specific HVAC ID. Nevertheless, as the aforementioned SailPoint article points out, controls around time of access and other metadata that could be policy-driven within SailPoint IdentityIQ for that 3rd-party access are still cogent to the discussion.

What isn’t mentioned in the article is that SailPoint IdentityIQ and ideally any IdM product could and should have a very big part to play in the gathering of and providing governance around Non-User IDs (NUIDs) — testing IDs, training IDs, B2B FTP IDs, generic admin IDs (that should be privileged access managed anyway), application IDs (huge!), etc.

Organizations typically have thousands, tens of thousands and yes, even millions of ungoverned NUIDs proliferated, orphaned and lying dormant on end-point servers and systems…

To an attacker, an ID is an ID is an ID. Any ID will suffice to establish a beachhead on a system and then begin trying to “walk” systems, ideally through the elevation of access. This is typically how deep penetration and spanning of internal networks have taken place in a lot of recent breaches. When attacking a system and attempting to establish access, it doesn’t matter to the attacker whether the initial ID is technically a normal, established user ID (with or without governance around it) or a NUID that typically is not being properly tracked and governed. In fact, NUIDs represent an ideal target precisely because, in many organizations, they have no visibility and no normal, established governance around them.
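To make that governance gap concrete, here’s a minimal sketch in plain Java (hypothetical names, not the IdentityIQ API) of the kind of correlation check an IdM aggregation performs: any account ID found on an endpoint that matches neither a known user identity nor a registered, governed NUID inventory gets flagged as orphaned.

```java
import java.util.*;

// Hypothetical sketch: flag endpoint account IDs that correlate neither to a
// known human identity nor to a registered, governed NUID inventory entry.
public class OrphanIdScan {

    public static List<String> findOrphans(Collection<String> endpointAccounts,
                                           Set<String> knownUserIds,
                                           Set<String> governedNuids) {
        List<String> orphans = new ArrayList<>();
        for (String acct : endpointAccounts) {
            if (!knownUserIds.contains(acct) && !governedNuids.contains(acct)) {
                orphans.add(acct); // no owner, no governance record -- exactly what attackers hunt for
            }
        }
        return orphans;
    }

    public static void main(String[] args) {
        List<String> onServer = Arrays.asList("jsmith", "ftp_b2b01", "train07", "appsvc_hr");
        Set<String> users = new HashSet<>(Arrays.asList("jsmith"));
        Set<String> nuids = new HashSet<>(Arrays.asList("appsvc_hr"));
        System.out.println(findOrphans(onServer, users, nuids)); // [ftp_b2b01, train07]
    }
}
```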
More »


History Demonstrates Strong Encryption Is Here To Stay

January 15th, 2016 | No Comments | Posted in Data Security, Security

(Originally published on LinkedIn – January 13th, 2016)

I am a very firm believer that knowing the background and history of things provides a much better forward-looking perspective and present decision-making capability. Would that this view were adopted more widely. If it were, the age-old George Santayana quote that “those who don’t remember the past are condemned to repeat it” would never have come into existence. The fact that mankind never really seems to learn the lessons of history also traps the unfolding of events in a cyclical pattern.

The Encryption Debate and History’s Lesson

Encryption, the technology that for years has done its job quietly in the background of the computing world without much acclaim, is suddenly a topic that is all the rage due to recent and tragic world events. Lawmakers paint a gloomy picture that without the ability to intercept and decipher encrypted communications on the part of criminals and terrorists, national security is at serious risk. Technologists on the other hand, including myself, maintain that the implementation of so-called “backdoor encryption” in effect weakens encryption for all of us, with severe consequences for our normal, everyday security, economy and lives. Essentially, to weaken encryption would be to cut off our noses to spite our collective economic and everyday-life faces. Lawmakers, technologists and technology companies are digging their trenches, and the staunch faceoff, while mostly civil at the moment, continues.

In a recent interview with The Wall Street Journal, Max Levchin, co-founder of PayPal and a cryptography expert, questions along with other technologists (including yours truly) whether lawmakers really understand how encryption actually works. Levchin goes on to stipulate that if we’re going to continue the national debate, we should at least make sure lawmakers do in fact understand how encryption works technically. And perhaps few are more qualified to step up and provide such an education than Max and other well-known cryptographers.

Not only do I question whether lawmakers understand how encryption works, I also question whether they’ve really taken into account how the world works. It would be easy for anyone to say “how the world works today,” but history, if we’re willing to learn from it, demonstrates the world has been working a certain way for a very long time when it comes to widespread technological innovation leveraged in conjunction with outside agendas.

Let’s take a quick lesson from history that coincidentally has ties to today’s date – January 13th – and see if history has anything to teach us concerning how the weakening of encryption would very likely play out were lawmakers to insist on their position through mandatory legislation.
More »


Considerations Around Application Encryption

December 22nd, 2015 | No Comments | Posted in Data Security, IT Industry, Security, Tools

For years, the use of encryption to protect data-at-rest on computers within the enterprise was solely the responsibility of developers who coded the applications that used or generated the data. Early on, developers had little choice but to “roll their own” encryption implementations. Some of these early implementations were mathematically sound and somewhat secure. Other implementations, while perhaps not mathematically sound, were adequate for the risk developers were attempting to mitigate.

As technology progressed, choices for encryption matured and solidified across development stacks. Callable libraries were born. Algorithms were perfected, significantly strengthened and pushed into the public domain. And beyond application encryption, encryption itself began to offer benefits to the enterprise at an operational level – within turnkey, off-the-shelf solutions that could be aimed at specific enterprise use cases such as end-point data loss prevention (DLP), encrypted backups, and full-disk encryption (FDE) among others.
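To illustrate how far those callable libraries have come, here’s a minimal application-layer encryption sketch using the JDK’s standard javax.crypto API (AES in GCM mode). This is illustrative only; the genuinely hard part, key management, is hand-waved away, and a real deployment would pull the key from a key manager rather than generate it inline.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// Minimal application-layer encryption sketch using the JDK's standard
// javax.crypto API (AES-256 in GCM mode). Illustrative only: a real
// deployment pulls the key from a key manager, never generates it inline.
public class AppLayerCrypto {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();

        byte[] iv = new byte[12];              // 96-bit nonce, unique per message
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("4111-1111-1111-1111".getBytes(StandardCharsets.UTF_8));

        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        System.out.println(new String(cipher.doFinal(ciphertext), StandardCharsets.UTF_8));
    }
}
```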

Today, however, when CISOs and senior security, software and enterprise architects think of protecting data-at-rest, their conceptions can sometimes hearken back to days of old, and they will stipulate that encryption solutions must necessarily be implemented at the application layer.

And while it turns out there is actually no extreme fallacy in this thinking, and some benefits at this layer remain, there are considerations and tradeoffs surrounding application encryption that aren’t overtly obvious. These considerations and tradeoffs can get lost when not weighed against more recent turnkey, transparent solutions that are implemented at a different architectural layer with nearly the same benefit yet much less associated risk and cost.

Let’s look at and consider some of the ins and outs of application encryption. Hopefully the following thoughts and considerations will help those who are deep in the throes of needing to make a decision around encryption of data-at-rest.
More »


Two Years Later: Reflections from “The Breach”

November 6th, 2015 | No Comments | Posted in Data Security, IT Industry, Security

Alan Kessler, President and CEO of Vormetric, blogged earlier this week concerning the far-reaching impacts of the Target breach – reflections from almost two years later. Alan observed in his article that the Target breach was the most visible mile marker in 2014, a year full of breaches that continued into 2015, and he went on to discuss and reflect on some of the other specific breaches.

In this article, I would like to reflect on some of the industry-wide changes that have taken place since the Target breach.

“The Breach”

The Target breach was so significant that for at least the first year afterward, it was referred to, especially in security circles and even on the news, as simply “The Breach.” And as Alan has already detailed, that breach was merely a harbinger of things to come, with major breach after major breach following it.

But what has been the impact of all these breaches? As one would expect, reactions and responses to “The Breach” by organizations have been all over the map.  Some have, as the saying goes, not “let a good crisis go to waste” and have become better companies as a result. Others have not fared or reacted as well.

While “The Breach” and the major breaches afterward have led most major retailers to reevaluate their data security approach, the retail edition of the Vormetric 2015 Insider Threat Report shows that retailers still have a long way to go. Over 51% of retail respondents reported being very or even extremely vulnerable to insider threats – the highest rates measured in the study. Many of these organizations continue to invest in security using traditional approaches that have proven insufficient over the last two years.

While the threat obviously remains high and a number of organizations admit they still have a long way to go, positive changes have taken place since “The Breach” (hereafter referred to simply as the breach) that are moving the retail industry, and other industries, in the right direction.
More »


History Foretells the Raising of the Ante: Securing Your Data Now All but Mandatory

August 31st, 2015 | No Comments | Posted in Data Security, IT Industry, Security

It’s been said that those who don’t learn from history are doomed to repeat it. In my last article, I wrote metaphorically about the medieval arms race to protect the pot of gold inside the castle from outside intruders. This time I want to draw upon history as the telescopic lens through which we forecast the journey into the future in a world full of advanced technology. Through this lens, we will see that the future is already here and history is beginning to write the same story again.

We’ll aim our history telescope backward in time to the technological breakthrough of the automobile. As with any new technology, the automobile was initially embraced by only a few. While the first automobile may have been designed and custom-built as early as the late 1600s, automobiles were not mass-produced and available to the general public until the turn of the 20th century. Widespread, generalized use of the automobile came about right after World War I, thanks to the genius of Henry Ford.

Even in the early days of the automobile, there existed enough power in these “new” devices to wreak havoc upon lives whenever there was an automobile accident. Victims of such accidents were often left holding the bag in terms of the costs and consequences, as were the drivers themselves, regardless of who was at fault. At some point the repeating scenario of “cause and victim” attracted the attention of governments, and the auto insurance industry was born through mandatory legislation. The ones wielding the wheel of this new technology were made accountable, and the ante was raised.

Shift ahead to the 21st century and we behold the power of a world full of automation, driven by the wonders of computer technology. And while computer technology is no longer new either, the global use of computer technology as the business engine fueled by its gasoline of endless data tied to the consumer is starting to have the same effect whenever the “accidents” that we call breaches take place. Governments are beginning to wake up and take notice, and questions concerning liability are starting to be asked. In effect, the future is happening now, history is in the process of repeating itself, and the ante is being raised once again.

More »


Data Is The New Gold: Getting Data Security Right in Retail

August 28th, 2015 | No Comments | Posted in Data Security, Security


Traditional security has always been metaphorically tied to the medieval castle building of old: building thicker walls and drawbridges, creating multiple perimeters, raising larger armies, you know – the whole nine yards. This paradigm extends into the modern world, which maintains its fascination with sophisticated perimeters. For Exhibit A, witness the recent Mission: Impossible – Rogue Nation Hollywood blockbuster, where sophisticated perimeter security was the primary obstacle to overcome.

But imagine changing that mindset from traditional perimeter-based security to data-centric. A data-centric approach, cast against the metaphorical medieval art of castle building, would result in thieves penetrating outer defenses, only to find the pot of gold actually filled with worthless tokens or paper notes.
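Here’s a minimal sketch of that “worthless tokens” idea, commonly called tokenization, in plain Java. The class names are hypothetical, and production tokenization adds vault hardening, format preservation and auditing; but the core swap is this simple:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Illustrative tokenization sketch: sensitive values are swapped for random
// surrogates, so a thief who breaches the outer perimeter steals only tokens.
public class TokenVault {
    private final Map<String, String> vault = new HashMap<>(); // token -> real value

    public String tokenize(String sensitive) {
        String token = "tok_" + UUID.randomUUID();
        vault.put(token, sensitive);
        return token; // safe to store and pass around outside the vault
    }

    public String detokenize(String token) {
        return vault.get(token); // only callable inside the protected boundary
    }

    public static void main(String[] args) {
        TokenVault vault = new TokenVault();
        String token = vault.tokenize("4111-1111-1111-1111");
        System.out.println("Stored in retail systems: " + token);
        System.out.println("Recovered inside the vault: " + vault.detokenize(token));
    }
}
```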

Throughout the movie, traditional approaches didn’t stop Ethan Hunt (the protagonist, manipulated by the antagonist into doing his dirty work) and they won’t stop Ethan Hunt-like hackers from infiltrating retailers’ networks.

As the world progresses from a mere “information age” into an age of “big data,” it’s simple – the volume, granularity and sensitivity of individual data are growing exponentially. With this growth come severe risks and consequences of losing valuable data.
More »


Great (SailPoint) Work Is Out There!

Today was it. Today was the day I finally broke down and went beyond lamenting that I can’t clone myself. Today was the day I looked in the mirror and called myself a little bit stupid and a little bit selfish.

The Problem I Wish Everyone Had

They always say start by defining the problem.

There are problems, and then there are problems. Real problems are bad. Other problems are actually good to have. I’m happy to say I confront the latter almost every day, and I’d really like to share these problems with you. More on that later; if you want, you can be part of the solution to a lot of open problems I know about.

But let’s face it… we all know it. Security is hot right now. And if you’ve done a good job in security and are somewhat known, it’s nuclear. My problem is that lots of fantastic opportunities come my way every day. I think about a lot of you out there. I get some really, really nice opportunities, and I lament that I can’t respond to them all.

Me At Vormetric

I’m doing well at Vormetric, and Vormetric is doing extremely well in the marketplace. Vormetric is poised on the edge of what I believe is a radical change in how enterprises go about Data Security and Encryption.

Vormetric does what it does extremely well, better than anyone else in the marketplace. So I’m set. I love what I do and, more importantly, what I can do for other people. Vormetric fills an important void. (And believe it or not, Data Security and Encryption has a direct tie-in to how enterprises should approach Identity Management that I had never considered before and a lot of companies still aren’t considering — it’s the “bottom third” that Identity Management can’t touch. More on that in another post.)

Those are the things that really drive me at the core… what I can do to legitimately help other people in the mission-critical security space. That dovetails right in line with the theme of this post. If you are interested, keep reading.
More »


SailPoint IIQ: Rule Modeling in Real Java :-)

I’ve been sitting on this article and concept for months and have had others ask me about it via email — whether I’ve ever done something like this before — and well… here it is.

Tired of No BeanShell Coding Validation!

It turns out I was sitting around in my hotel room in Bangalore on India Independence Day last year, whacking away on some client code, doing some data modeling using CSV. I had a somewhat involved BuildMap rule I was working on, and I was getting a null pointer exception I simply could not find. A few hours and one simple coding mistake later, I was finally on my way. But it was really discouraging to know that if I had been coding in Eclipse, the mistake would have been spotted immediately.

The next thought I had was actually two-fold. While I have at times written test straps in real Java using the SailPoint IIQ Java libraries (i.e., jars) and dropped my BeanShell code into procedures to instantly validate the syntax (a sketch of what I mean follows below), I have also wanted at some point to be able to simulate, or partially simulate, rule modeling and data modeling outside of SailPoint IIQ using Java I had complete control over writing and executing.
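To give a feel for the test strap idea, here’s a hypothetical sketch: mirror the rule’s implicit arguments as a typed Java method (shown here with the cols/record lists a delimited-file BuildMap rule receives; verify the exact argument set against your IIQ version), paste the BeanShell body inside, and let Eclipse catch mistakes at compile time.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical test strap: mirror the implicit arguments of a delimited-file
// BuildMap rule as a typed method, paste the BeanShell body inside, and let
// Eclipse/javac catch the mistakes BeanShell would only surface at runtime.
public class BuildMapRuleStrap {
    public static Map<String, Object> buildMap(List<String> cols, List<String> record) {
        // --- paste the BeanShell rule body below this line ---
        Map<String, Object> map = new HashMap<>();
        for (int i = 0; i < cols.size(); i++) {
            map.put(cols.get(i), i < record.size() ? record.get(i) : null);
        }
        return map;
        // --- end of rule body ---
    }
}
```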

So on this particular day, being particularly irked, I decided to combine those two wishes and see what I could do about having a place where I could not only drop, for instance, BuildMap rule code into Eclipse and instantly validate it, but also execute the code I intended for SailPoint IIQ against connector sources I had connected to SailPoint IIQ (in development, of course!) and see and manipulate the results.

Once I was done iterating my development over a real dataset, I could take my validated Java code, drop it back into SailPoint IIQ as BeanShell, and have not only validated but also working code in SailPoint IIQ with very little or no modification.

Establishing SailPoint Context

One thing you will need if you want to run your Java code in an actual SailPoint IIQ context outside of SailPoint IIQ proper is to establish SailPointContext in your code. This, I will tell you, while not impossible, is not easy to do. You need to implement the Spring Framework and a lot of other stuff. If you are interested in doing this and have access to SailPoint Compass, you can read about establishing SailPointContext there.

Since I didn’t have time for that much work, almost immediately I decided to implement a partial simulation that would allow me to (1) model and validate my rule and (2) model my data very simply and easily without establishing SailPointContext. I could still achieve my goal of iterating the solution to produce validated, working code to drop back into SailPoint IIQ this way.

The Code

Amazingly, the code for simulating a BuildMap rule, pointing it at the actual CSV I intend for SailPoint IIQ, and simulating an account aggregation task is not that complex. Once you have the code, if you understand how SailPoint IIQ works in general, you could conceivably re-engineer and simulate other segments of SailPoint IIQ processing, or model other rule types and/or data, outside of SailPoint IIQ.
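For illustration, a stripped-down simulation along these lines might look something like the following, reusing the hypothetical BuildMapRuleStrap from above and assuming a simple accounts.csv with a header row (real connector behavior adds quoting, type handling and more):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Illustrative aggregation simulator: stream the same CSV the delimited file
// connector would read, push each record through the buildMap logic, and
// inspect the resulting account maps -- all outside IIQ, no SailPointContext.
public class AggregationSimulator {
    public static void main(String[] args) throws Exception {
        try (BufferedReader in = new BufferedReader(new FileReader("accounts.csv"))) {
            List<String> cols = Arrays.asList(in.readLine().split(","));   // header row
            String line;
            while ((line = in.readLine()) != null) {
                List<String> record = Arrays.asList(line.split(",", -1));
                Map<String, Object> account = BuildMapRuleStrap.buildMap(cols, record);
                System.out.println(account);   // iterate here until the rule behaves
            }
        }
    }
}
```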
More »
