
Two Years Later: Reflections from “The Breach”

November 6th, 2015 | No Comments | Posted in Data Security, IT Industry, Security

President and CEO of Vormetric, Alan Kessler, blogged earlier this week about the far-reaching impact of the Target breach – reflections from almost two years later. Alan observed in his article that the Target breach was the most visible mile marker in 2014, a year full of breaches that continued into 2015, and he went on to discuss and reflect on some of the other specific breaches.

In this article, I would like to reflect on some of the industry-wide changes that have taken place since the Target breach.

“The Breach”

The Target breach was so significant that for at least the first year afterward, it was referred to, especially in security circles and even on the news, as simply “The Breach.” And as Alan has already detailed, that breach was merely a harbinger of things to come, with major breach after major breach following in its wake.

But what has been the impact of all these breaches? As one would expect, reactions and responses to “The Breach” by organizations have been all over the map.  Some have, as the saying goes, not “let a good crisis go to waste” and have become better companies as a result. Others have not fared or reacted as well.

While “The Breach” and the major breaches that followed have led most major retailers to reevaluate their data security approach, the retail edition of the Vormetric 2015 Insider Threat Report shows that retailers still have a long way to go. Over 51% of retail respondents reported being very or even extremely vulnerable to insider threats – the highest rates measured in the study. Many of these organizations continue to invest in security, but in traditional approaches that have proven over the last two years to be insufficient.

While the threat obviously remains high and a number of organizations admit they still have a long way to go, positive changes have taken place since “The Breach” (hereafter referred to simply as the breach) that are moving retail and other industries in the right direction.
More »


History Foretells the Raising of the Ante: Securing Your Data Now All but Mandatory

August 31st, 2015 | No Comments | Posted in Data Security, IT Industry, Security

It’s been said that those who don’t learn from history are doomed to repeat it. In my last article, I wrote metaphorically about the medieval arms race to protect the pot of gold inside the castle from outside intruders. This time I want to draw upon history as the telescopic lens through which we forecast the journey into the future in a world full of advanced technology. Through this lens, we will see that the future is already here and history is beginning to write the same story again.

We’ll aim our history telescope backwards in time to the technological breakthrough of the automobile. As with any new technology, the automobile was at first embraced by only a few. While the first automobile may have been designed and custom-built as early as the late 1600s, automobiles were not mass produced and available to the general public until the turn of the 20th century. Widespread, generalized use of the automobile came about right after World War I, thanks to the genius of Henry Ford.

Even in the early days of the automobile, these “new” devices carried enough power to wreak havoc upon lives whenever there was an accident. Victims of such accidents were often left holding the bag in terms of costs and consequences, as were the drivers themselves, regardless of who was at fault. At some point the repeating scenario of “cause and victim” attracted the attention of governments, and the auto insurance industry was born through mandatory legislation. The ones wielding the wheel of this new technology were made accountable, and the ante was raised.

Shift ahead to the 21st century and we behold the power of a world full of automation, driven by the wonders of computer technology. And while computer technology is no longer new either, its global use as the business engine, fueled by the gasoline of endless data tied to the consumer, is starting to have the same effect whenever the “accidents” we call breaches take place. Governments are beginning to wake up and take notice, and questions concerning liability are starting to be asked. In effect, the future is happening now, history is in the process of repeating itself, and the ante is being raised once again.

More »


Data Is The New Gold: Getting Data Security Right in Retail

August 28th, 2015 | No Comments | Posted in Data Security, Security


Traditional security has always been metaphorically tied to the medieval castle building of old: thicker walls and drawbridges, multiple perimeters, larger armies, you know – the whole nine yards. This paradigm extends into the modern world, which maintains its fascination with sophisticated perimeters. For Exhibit A, witness the recent Hollywood blockbuster Mission: Impossible – Rogue Nation, where sophisticated perimeter security was the primary obstacle to overcome.

But imagine changing that mindset from traditional perimeter-based security to data-centric. A data-centric approach, cast against the metaphorical medieval art of castle building, would result in thieves penetrating outer defenses, only to find the pot of gold actually filled with worthless tokens or paper notes.

Throughout the movie, traditional approaches didn’t stop Ethan Hunt (the protagonist, manipulated by the antagonist into doing his dirty work) and they won’t stop Ethan Hunt-like hackers from infiltrating retailers’ networks.

As the world progresses from a mere “information age” into an age of “big data,” it’s simple – the volume, granularity and sensitivity of individual data are growing exponentially. With this growth come severe risks and consequences of losing valuable data.
More »


Great (SailPoint) Work Is Out There!

Today was it. Today was the day I finally broke down and went beyond lamenting that I can’t clone myself. Today was the day I looked in the mirror and called myself a little bit of stupid and a little bit of selfish.

The Problem I Wish Everyone Had

They always say start by defining the problem.

There are problems and then there are problems. Real problems are bad. Other problems are actually good to have. I’m happy to say I confront the latter almost every day, and I’d really like to share these problems with you. More on that later – if you want, you can be part of the solution to a lot of the open problems I know about.

But let’s face it… we all know it. Security is hot right now. And if you’ve done a good job in security and are somewhat known, it’s nuclear. My problem is that lots of fantastic opportunities come my way every day. I think about a lot of you out there. I get some really, really nice opportunities. And I lament that I can’t respond to them all.

Me At Vormetric

I’m doing well at Vormetric and Vormetric is doing extremely well in the marketplace. Vormetric is poised on the edge of what I believe is a radical change in how enterprises go about Data Security and Encryption.

Vormetric does what it does extremely well – better than anyone else in the marketplace. So I’m set. I love what I do and, more importantly, what I can do for other people. Vormetric fills an important void. (And believe it or not, Data Security and Encryption has a direct tie-in to how enterprises should approach Identity Management, one I had never considered before and a lot of companies still aren’t considering — it’s the “bottom third” that Identity Management can’t touch. More on that in another post.)

Those are the things that really drive me at my core… what I can do to legitimately help other people in the mission-critical security space. That dovetails right in line with the theme of this posting. If you are interested, keep reading.
More »


SailPoint IIQ: Rule Modeling in Real Java :-)

I’ve been sitting on this article and concept for months and have had others ask me about it via email — whether I’ve ever done something like this before — and well… here it is.

Tired of No BeanShell Coding Validation!

It turns out I was sitting around in my hotel room in Bangalore on India Independence Day last year, whacking away on some client code, doing some data modeling using CSV. I had a somewhat involved BuildMap rule I was working on, and I was getting a null pointer exception I simply could not find. A few hours and one simple coding mistake later, I was finally on my way. But it was really discouraging to know that if I had been coding in Eclipse, the coding mistake would have been spotted immediately.

The next thought I had was actually two-fold. While I have at times written test harnesses in real Java using the Sailpoint IIQ Java libraries (i.e., JARs) and dropped my BeanShell code into methods to instantly validate the syntax, I have also wanted, at some point, to be able to simulate or partially simulate rule modeling and data modeling outside of Sailpoint IIQ, using Java I had complete control over writing and executing.

So on this particular day, being particularly irked, I decided to combine those two wishes. I wanted a place where I could not only drop BuildMap rule code, for instance, into Eclipse and instantly validate it, but also execute the code I intended for Sailpoint IIQ against the same connector sources connected to Sailpoint IIQ (in development, of course!) and see and manipulate the results.

Once I was done iterating my development over a real dataset, I could take my validated Java code, drop it back into Sailpoint IIQ as BeanShell, and have not only validated but also working code in Sailpoint IIQ with very little or no modification.

Establishing SailPoint Context

One thing you will need if you want to run your Java code in an actual Sailpoint IIQ context outside of Sailpoint IIQ proper is to establish a SailPointContext in your code. This, I will tell you, while not impossible, is not easy to do. You need to pull in the Spring Framework and a fair amount of other plumbing. If you are interested in doing this and have access to SailPoint Compass, you can read about establishing a SailPointContext there.

Since I didn’t have the time for that much work, I almost immediately decided to implement a partial simulation that would (1) let me model and validate my rule and (2) let me model my data very simply and easily, without establishing a SailPointContext. I could still achieve my goal of iterating the solution to produce validated, working code to drop back into Sailpoint IIQ this way.

The Code

Amazingly, the code for simulating a BuildMap rule, pointing it at the actual CSV I intend for Sailpoint IIQ, and simulating an account aggregation task is not that complex. Once you have the code, if you understand how Sailpoint IIQ works in general, you could conceivably re-engineer and simulate other segments of Sailpoint IIQ processing, or model other rule types and/or data, outside of Sailpoint IIQ.
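
To give you an idea of what I mean, here is a minimal sketch of such a harness. The file name and the email attribute are just placeholders, and the cols/record arguments simply mirror what a DelimitedFile BuildMap rule is typically handed inside Sailpoint IIQ. The point is that the body of buildMap() is plain Java that Eclipse validates instantly and that drops back into BeanShell largely unchanged:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stand-alone harness for iterating BuildMap rule logic outside of Sailpoint IIQ.
public class BuildMapSimulator {

    // Written the way the rule body would read inside Sailpoint IIQ:
    // cols holds the CSV header names, record holds one row's values.
    public static Map buildMap(List cols, List record) {
        Map map = new HashMap();
        for (int i = 0; i < cols.size() && i < record.size(); i++) {
            map.put(cols.get(i), record.get(i));
        }
        // The kind of per-row massaging a real rule might do (placeholder attribute).
        Object mail = map.get("email");
        if (mail != null) {
            map.put("email", mail.toString().toLowerCase());
        }
        return map;
    }

    // Simulated "aggregation": walk the same CSV the IIQ application points at.
    public static void main(String[] args) throws Exception {
        String csvFile = (args.length > 0) ? args[0] : "users.csv";
        BufferedReader in = new BufferedReader(new FileReader(csvFile));
        List cols = Arrays.asList(in.readLine().split(","));
        String line;
        while ((line = in.readLine()) != null) {
            List record = Arrays.asList(line.split(",", -1));
            System.out.println(buildMap(cols, record));
        }
        in.close();
    }
}

The raw (non-generic) collections are deliberate – BeanShell doesn’t understand Java generics, so keeping the rule body generics-free makes the round trip back into Sailpoint IIQ painless.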
More »


Stupid SailPoint Developer Tricks

Hello, mates — as they say Down Under, where I happen to be at the moment on a rather large Sailpoint engagement. It’s been a while, and I’m sorry for that. I keep promising more, new and better content and haven’t delivered.

The last couple of months however have been absolutely crazy and there have been some changes on my end, as you perhaps can see. Now that things have shaped up a bit, maybe I can get back to the business at hand here on the blog, again as I have time.

Stupid Pet Tricks

When I was growing up and in college, a certain comedian became famous (partially) by having a segment on his show called “Stupid Pet Tricks.” Some were hilarious and some… belonged on the 1980s “Gong Show.” (If you’ve never heard of “The Gong Show,” trust me, you aren’t missing anything.)

Since that time, I’ve always thought of various developer tricks in the same light. Some are quite slick and useful and some… really just need to be buried. I’ll leave it to you to decide on this one.

Out of sheer laziness, while onboarding Sailpoint applications that feature a BuildMap rule (e.g. BuildMap, JDBCBuildMap, and SAPBuildMap), I sometimes utilize a method for “printing debug statements” that I can see directly and immediately in connectorDebug, without having to jump into or tail the Sailpoint IIQ log or application server logs.

It’s also just a bit less verbose as the Sailpoint IIQ logs typically have a large class identification prefix in front of them, which can get rather cumbersome and make it more difficult to pick out one’s intended debug output.

Plus I hate changing logging levels in log4j.properties even though the Sailpoint IIQ debug page allows me to load a new logging configuration dynamically. In short, I’m just a lazy, complaining type when it comes to Sailpoint IIQ debug statements.

Someone mentioned this would be worth blogging about, so here goes. (At the very least, this is an easy article to write and perhaps will get me back into the blogging swing?!)

__DEBUG__ Schema

Now, I would definitely recommend doing this only on a local or designated sandbox and then making sure you clean up before checking in your code. (You are using some form of source code control for your Sailpoint IIQ development, aren’t you?!)
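
Roughly, the idea looks like this inside the BuildMap rule body. This is a minimal sketch only – the attribute name and the status column are just placeholders of my own – and the full walkthrough follows:

// __DEBUG__ is a throwaway attribute added to the application schema on the
// sandbox; anything stuffed into it shows up per-row in connectorDebug output.
Map map = new HashMap();
for (int i = 0; i < cols.size() && i < record.size(); i++) {
    map.put(cols.get(i), record.get(i));
}

StringBuilder dbg = new StringBuilder();
dbg.append("raw record=").append(record);
dbg.append(" | derived status=").append(map.get("status"));

// The debug text rides along with the account data. Pull the schema attribute
// and this line back out before the code goes anywhere near a real environment.
map.put("__DEBUG__", dbg.toString());

return map;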
More »


SailPoint IIQ: Move Over, Rover

I’m getting ready to do some customer training on Sailpoint IIQ v6.0. Getting ready for the trip has been a good impetus to get my rear end in gear and get up to date. I’ve been running Sailpoint IIQ v5.5 “bare metal” on my MacBook Pro pretty much since Sailpoint IIQ v5.5 was released. I have procrastinated getting Sailpoint IIQ v6.0 installed on my laptop. (Mainly because I have Sailpoint IIQ v6.0p5 running in the mad scientist lab on ESXi accessible via VPN.)

Side By Side Approach

So, it was time to install Sailpoint IIQ v6.0, but… I don’t and didn’t want to obliterate my Sailpoint IIQ v5.5p6 installation; I have too many customizations, test applications and rules I don’t want to lose and still want to be able to run live. I’ve been running Sailpoint IIQ with a context root of /identityiq and with a MySQL database user of identityiq.

When I run multiple versions of Sailpoint IIQ side by side on the same machine, I’ve adopted the practice of running each installation as /iiqXY where XY is the version number. So I wanted to run /iiq55 and /iiq60 side by side from the same application server. (I could also take the approach of running multiple instances of application server and run one installation from one port, say 8080, and another from another port, say 8081.)

So how to “lift and load” the existing installation at /identityiq to /iiq55 without reinstalling everything and re-aggregating all my sources? Here’s what I did.

DISCLAIMER: I’m neither advocating nor de-advocating this. Do this at your own risk, especially if your environment differs from mine. I make no claims or warranty of any kind. This worked for me. If it helps you… great.

The Environment

Here was my environment:

Operating System: Mac OS X Mountain Lion, v10.8.3
Application Server: Apache Tomcat v6.0.35
JRE: Java SE JRE (build 1.6.0_43-b01-447-11M4203) (64-bit)
SailPoint IIQ: SailPoint IIQ v5.5p6
IIQ Database: MySQL 5.5.15

Shut Everything Down

First, I shut everything down. This basically meant just spinning down the entire Tomcat application server. The command you might use and the location of your application server scripts may differ:

$ cd /Library/Apache/Tomcat6/bin
$ ./shutdown.sh
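
Before the jump, here’s the rough shape of what the “lift and load” amounts to in my environment – a sketch only, assuming a stock Tomcat layout and the standard location of the JDBC settings in WEB-INF/classes/iiq.properties; your paths will differ:

$ cd /Library/Apache/Tomcat6/webapps
$ # Tomcat derives the context root from the directory name, so this copy
$ # becomes /iiq55 when the server comes back up.
$ cp -Rp identityiq iiq55
$ # The database settings travel with the copy, so /iiq55 keeps pointing at
$ # the same identityiq MySQL schema -- no re-aggregation required.
$ grep dataSource.url iiq55/WEB-INF/classes/iiq.properties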

More »


Oh Ye MacBook Pro Of Little Memory :-(

March 25th, 2013 | No Comments | Posted in General, IdM Infrastructure, Tools

I’ve been a Mac user since 1993 and have always been extremely pleased with the platform in so many ways. Recently, Apple seems to have finally been recognized in the consumer market as superior — I see Macs everywhere I go. And in the developer/power user arena, Macintosh and Mac OS X are the absolute “cat’s meow,” especially if one is a JEE developer. I couldn’t do what I do in Identity Management for Qubera without my 15″ MacBook Pro. It just does what I want it to do — no PC fuss or muss.

Apple’s Poor Memory Roadmap (IMO)

I’ve been disappointed however recently with one piece of the architecture: Apple’s maximum memory limits and their roadmap as it relates to upper memory limits on their non-Retina line of MacBook Pros. I feel it’s short sighted. (Even the new Retina MacBook Pros should max out at 32gb, not 16gb. Their memory footprints are just running behind the PCs at this point.) When I bought my MacBook Pro in early 2011, I laid out a lot of cash for this thing, and I instantly max’d the memory out at a {sarcasm}whopping{/sarcasm} 8gb, knowing I needed to run a lot of VMs, which Qubera uses for testing and support of customers.

Even more recently, after upgrading to Mountain Lion, I’ve pretty much bumped into the limit. I run a lot of stuff to do what I do in Identity Management, and I need it all open at once: Microsoft Word, Microsoft PowerPoint, Microsoft Excel, Google Chrome, Eclipse, emacs, Evernote, VMware Fusion and a Windows 7 VM (mainly for Visio, but also PC testing), Tomcat 6, MySQL, terminal windows galore, RDP sessions galore, calendaring, you name it. In recent weeks, I was beginning to despair a little bit. According to Apple, I had already max’d out my memory. 8gb just isn’t/wasn’t enough. What to do?!

Where Has All My Memory Gone?

I began trying to manage my memory better. I used Activity Monitor to monitor my memory, and I learned a lot about what was eating it up. I didn’t realize I needed to treat just about every browser tab as its own application — there’s so much going on behind the scenes of every tab. I usually have a million tabs open too. But I need all this stuff open. I can’t be closing it down, losing context in my work.

I really needed a better solution. I began doing some research and in the end, I reached out to my good friends at The Chip Merchant for help. What I discovered was incredibly good news. Good enough news to document this in a blog entry.

8gb For i7-Based MacBook Pros Is NOT “The Max”!!

I’ve been using the guys at The Chip Merchant (in San Diego, CA) for over a decade. When it comes to memory, I know of no one better. These guys really know their stuff. I had a hunch that someone, somewhere HAD to be making an 8gb SODIMM that would fit the MacBook Pro. It turns out, after turning to The Chip Merchant, I was right.

If you go on Amazon and look for these memory SODIMMs, you’ll see they are available, but per the reviews people are having mixed results with them. I found out from The Chip Merchant that those are probably people running the i5-based MacBook Pro rather than the i7-based MacBook Pro, which is what I have. Crucial Memory makes an 8gb SODIMM that is stable and doesn’t overheat in the i7-based MacBook Pros. At less than $150 to max my memory out at 16gb, it was a no-brainer.

(The Chip Merchant really gave Crucial Memory the props as well — they said if Crucial Memory says it, you can book it. Something to remember when it comes to memory in the future.)

Ordering Information

So, there you have it. Despite what Apple indicates or recommends or states as the max for your i7-based MacBook Pro, Crucial Memory makes an 8gb SODIMM that fits and works — so 2x equals 16gb max. My life has been saved.

If you’re looking to upgrade your i7-based MacBook Pro to 16gb, give my friends over at The Chip Merchant a call. These 8gb SODIMMs are NOT in their online store at present, but they do have them and can get their hands on them. Worth every penny. Here is the item number from The Chip Merchant:


Account rep. Devin Charters helped me with this. What a life-saver. :-) This probably extended the life of my MacBook Pro for another 3 years at least. Thanks The Chip Merchant!! Hope this helps someone else out there who is despairing as I was.


SailPoint IIQ: Aggregating XML

From an answer to a client this morning on aggregating XML in Sailpoint IIQ. I hope this helps others out there:

Regarding your question this morning on aggregating XML… I have seen XML aggregated through the OOTB RuleBasedFileParser connector. That connector requires a rule to be written to drive the parser, and through that rule you could parse and aggregate XML. I mentioned this to one of our Solution Architects after our meeting; he was aware of the RuleBasedFileParser type, but personally felt it was enough work that you may as well write a custom connector using the XML libraries Java already provides.

Between him and me, I would say the following:

(1) From an overall perspective, it’s technically possible using the RuleBasedFileParser connector to aggregate XML.

(2) There may need to be a discussion about the XML itself to determine its level of complexity, in which case:
(a) The RuleBasedFileParser may be an adequate choice, or
(b) A custom connector for the XML may be in order.

One other approach could be:

(i) Use a DelimitedFile connector.
(ii) Write a pre-iterate rule leveraging the Java XML classes available to (a) read the XML and (b) create a CSV from the XML for the DelimitedFile connector to consume (see the sketch after this list).
(iii) Use the post-iterate rule to clean up.
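
To make option (ii) concrete, the core of such a pre-iterate rule is just standard JDK XML parsing. Here is a minimal, self-contained sketch of the XML-to-CSV step – the element and attribute names are hypothetical, and a real rule would write to whatever file the DelimitedFile application is configured to read:

import java.io.File;
import java.io.PrintWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Flattens a hypothetical <users><user id="..."><email>...</email></user></users>
// feed into the CSV the DelimitedFile application would then iterate.
public class XmlToCsv {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("users.xml"));

        PrintWriter out = new PrintWriter("users.csv");
        out.println("id,email"); // header row matches the application schema
        NodeList users = doc.getElementsByTagName("user");
        for (int i = 0; i < users.getLength(); i++) {
            Element u = (Element) users.item(i);
            String id = u.getAttribute("id");
            String email = u.getElementsByTagName("email").item(0).getTextContent();
            out.println(id + "," + email);
        }
        out.close();
    }
}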

As you can see, there is more than one way to skin the XML cat here. As is the case with most things in Sailpoint IIQ – as I’ve demonstrated in at least one blog post – it can be “tricked” in various places into doing what it is you ultimately want it to do.

As with any of this, it’s very common to have to sit down on an engagement and triage a number of options to decide on the best implementation approach. I hope this information helps you with that process.

From the Twin Cities, where we shrug off the second day of Spring with a second helping of Winter, Amigos…


Ian Glazer: Killing IdM to Save It

February 22nd, 2013 | No Comments | Posted in General Idm/IAM, IdM Infrastructure

I recently watched Gartner analyst Ian Glazer’s presentation on Killing IAM In Order To Save It and wholeheartedly agree with a lot of what he advocates in this quick presentation. Enough to feature it here. You can view it embedded below, but I also encourage you to visit the original posting on his site to view the valuable comments and dialogue others have left there as well.

If you’ve been in Identity Management for very long, you should be able to relate to a lot of what Ian is presenting here. Great job.
