Keil Hubert: Consider the Bigger Picture

It ought to be simple to declare that a file contains inappropriate content, in violation of a company’s Acceptable Use Policy. Business Technology’s resident U.S. blogger Keil Hubert suggests that it’s short-sighted to make snap decisions on isolated pieces of evidence without investigating the accused employee’s intent and associated actions.

‘Pornography,’ the gentleman once said to me, ‘is a touchy subject.’

The gentleman in question wasn’t trying to titillate; the subject we were discussing was inappropriate user behaviour on company networks. The discussion itself, in a sense, illustrated the core problem that we cyber security leads had to contend with back at our own companies: a user’s intent often has far more bearing on the IT department’s handling of an incident than the content that triggered the initial complaint or investigation.

For disclosure’s sake, let’s be clear from the outset that I am not a lawyer. [1] I approach this problem as an IT department head and cyber security researcher. I’ve spent a great deal of time studying the problem, both in theory and in practical application. I’ve supported a great many employee misconduct investigations, and this column references some of the things that I’ve learned in the trenches. It isn’t – and shouldn’t be taken as – legal advice.

Back to my point: a large element of the employee misconduct problem is the thorny challenge of defining what constitutes acceptable versus unacceptable use on company equipment or over company networks. A while back, I did some work for a large company where my contract required the company to issue me a company-owned and company-paid mobile phone. When I signed for my handset, I asked the issuing tech what the company’s Acceptable Use Policy was, since the phone had better Internet connectivity than my personal mobile. The tech shrugged and said, ‘Just don’t put porn on it.’ I laughed along with him, as one technologist to another, but went away noodling over how he might choose to define ‘porn’ during a potential future investigation. Odds are, his catch-all definition would fall woefully short.

That’s a harder decision to make than you’d expect. As a practical example, married US Representative Anthony Weiner lost his job in 2011 after he sent some sexually-explicit photographs from his mobile to a university student. His disastrous attempt at a political comeback in 2013 was scuttled when he engaged in further inappropriate sexting with three additional women. In the end, Rep. Weiner’s digital snaps were widely viewed as unacceptable … and might be considered pornography because of how they were used, not because of what they showed.

Personally, I find the smarmy arrogance of the bankers who plunged the world into the Great Recession more obscene and offensive than a happy-snap of a mostly-unclothed git. To each their own.

When it comes to the term ‘pornography,’ Wikipedia starts its entry on the subject with this general definition: ‘Pornography is the portrayal of sexual subject matter for the purpose of sexual arousal’ [emphasis added]. The context of a questionable action is one of the critical elements that investigators have to factor in when assessing a conduct issue. Whether it involves a user consuming content (e.g., watching online videos) or sharing content with others (e.g., forwarding photos), the user’s intent and the surrounding details of the event have to be understood.

The actual photograph that cost Rep. Weiner his august position wasn’t a nude shot and could easily have been shown on the prime-time news. His damning selfies showed the gentleman in a state of undress that wouldn’t be out of place in advertising or over-the-air programming. He wasn’t wearing enough to meet most company dress codes, but he wasn’t actually nude either; his photos (while ridiculous) would have been considered fully acceptable in other contexts (vacationing at the beach, say). What made them wrong enough to cost him his job was the fact that he was sending them to people other than his spouse, with the presumed intent of inspiring sexual desire.

How, then, should the head of IT security navigate the wicked little grey area that arises in a questionable-content case? If you limit your investigation to merely validating the existence of a single piece of media (e.g., an e-mail, photo, etc.), you run the risk of misrepresenting the totality of what happened to HR and upper management. For the sake of the accused, the accuser, the company, and all of the investigators, I advocate capturing as much of the supporting content as possible along with the actual item(s) of concern, so that management can make a fully-informed judgment as to what really happened.
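By way of illustration only, here’s a minimal sketch of that ‘capture the context, not just the item’ idea, assuming the collected artefacts (the item of concern plus the surrounding e-mails, cached pages and log extracts) have been copied into a working folder. The folder name is a placeholder I’ve invented, and a real case would follow your organisation’s own evidence-handling and chain-of-custody procedures:

```python
# Hypothetical sketch: build a simple manifest of collected evidence so that
# what management reviews later can be shown to match what was collected.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def build_manifest(evidence_dir: str) -> dict:
    """Walk a folder of collected artefacts and record a SHA-256 hash per file."""
    manifest = {
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "items": [],
    }
    for path in sorted(Path(evidence_dir).rglob("*")):
        if path.is_file():
            manifest["items"].append({
                "file": str(path),
                "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
                "size_bytes": path.stat().st_size,
            })
    return manifest

if __name__ == "__main__":
    # 'case_0423_evidence' is a placeholder folder name; it would hold copies
    # of the questionable item plus its surrounding context.
    print(json.dumps(build_manifest("case_0423_evidence"), indent=2))
```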

I worked on two such cases several years ago that help to illustrate this problem. In the first one, we received a call from a Tier 1 tech support agent reporting that he had ‘discovered’ child pornography on a co-worker’s PC – if substantiated, this act would have cost the accused worker his job.

Our forensic analysis of the PC in question revealed that the ‘child pornography’ was a single black-and-white GIF that had been ‘discovered’ in a directory in the user’s local profile. The photo itself was of an adult female bathing – so the ‘child’ part of the accusation was incorrect. Still, we had to determine whether the photo counted as traditional pornography for purposes of misconduct under our company’s AUP.

First, we couldn’t find any other inappropriate content in the user’s directories, browser cache, browsing history, e-mail attachments, or media file collections. Second, a review of the PC’s log files revealed that the accuser had logged into the PC with his personal network account ten minutes before the ‘discovery’ had taken place. Further, the time/date stamp on the questionable photo indicated that the accuser had written it to the directory while he was logged in as the active user. The logs also showed that the accuser had logged out as a regular user, then logged back in as an administrative user (supposedly to ‘perform maintenance’), conducted a search for all files on the PC with the .GIF extension, and thereby ‘found’ the objectionable content.
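That sort of timeline cross-check is simple to sketch. The snippet below is purely hypothetical – the account names, session times and file timestamp are invented, and a real investigation would work from exported event logs and a forensic image rather than a live machine – but it shows the core idea: line the logon sessions up against the questionable file’s write timestamp and see whose session it falls inside.

```python
# Hypothetical sketch: which account was the active user when the
# questionable file was written to disk? All of the data here is invented.
from datetime import datetime

# Logon/logoff sessions reconstructed from the PC's event logs (hypothetical).
sessions = [
    # (account, logon time, logoff time)
    ("accuser",       datetime(2013, 4, 2, 9, 12), datetime(2013, 4, 2, 9, 25)),
    ("accuser-admin", datetime(2013, 4, 2, 9, 26), datetime(2013, 4, 2, 9, 40)),
    ("accused",       datetime(2013, 4, 2, 10, 5), datetime(2013, 4, 2, 17, 30)),
]

# Last-write time of the suspect GIF. In practice this would come from the
# forensic image (e.g., via os.path.getmtime() on an exported copy); it's
# hard-coded here to keep the sketch self-contained.
file_written_at = datetime(2013, 4, 2, 9, 18)

def active_accounts(sessions, timestamp):
    """Return every account whose logon session spans the given timestamp."""
    return [user for user, start, end in sessions if start <= timestamp <= end]

print(f"File written at {file_written_at}; "
      f"active account(s): {active_accounts(sessions, file_written_at)}")
```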

‘Hello, police? I’ve found a dead body. Where is it? Exactly where I left it after I strangled … er …’

The investigation further revealed that the accused worker was a deeply unpopular supervisor who had been clashing with his accuser in the weeks preceding the event. The totality of the evidence suggested that a disgruntled employee had engineered a good old-fashioned frame-up – an employee who knew a little something about computers (but not nearly enough), and who creatively tried to get his boss both disgraced and fired. [2]

In the second example, upper management suspected a junior supervisor of viewing sexually explicit content on his work PC and over the company network during work hours. During the forensic analysis of the accused’s PC, we found evidence that the user had logged in to his company PC during work hours (as suspected) using his network authentication token. No one else had the user’s security token to authenticate into the network with his identity, and no one else was in his private office at the time he browsed the sites.

Records on the PC’s hard drive indicated that the accused had browsed to a ‘swingers’ meet-up website, had browsed users’ profiles, and had then gone to a different site to view some pornographic movie reviews. The evidence was incontrovertible; the user had performed these actions. That alone was probably grounds for administrative action in and of itself, but it behoved the investigators to check everything in case there was some evidence that might exonerate the accused or mitigate his actions.

The investigators looked at all the different log files that might exonerate the accused or suggest some intent other than inappropriate browsing. What if the URL had been an accidental mis-spelling? Or what if he’d thought it was a completely different type of site (say, a wiki for people who build playground swing sets)? The time/date stamps on the cached files made it clear that this likely wasn’t an accident, though; the accused had browsed hundreds of user profiles on the site.
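That ‘accident or deliberate?’ question can be answered with a very simple summary of the recovered browsing records. The sketch below is hypothetical – the URLs and timestamps are invented stand-ins, and a real case would parse the actual browser history and cache databases – but it shows the shape of the analysis: a single mistyped URL shows up as one hit over a few seconds, while hundreds of profile pages spread across hours does not.

```python
# Hypothetical sketch: summarise recovered browsing records per site by visit
# count and time span, to distinguish a one-off accident from sustained use.
from collections import defaultdict
from datetime import datetime
from urllib.parse import urlparse

# Invented (URL, timestamp) records standing in for browser history entries.
visits = [
    ("http://swingers-example.test/profile/1001", datetime(2013, 5, 6, 10, 2)),
    ("http://swingers-example.test/profile/1002", datetime(2013, 5, 6, 10, 4)),
    ("http://swingers-example.test/profile/1417", datetime(2013, 5, 6, 11, 58)),
    ("http://movie-reviews-example.test/review/42", datetime(2013, 5, 6, 12, 10)),
]

# Group the visits by site, then report how many hits each site received and
# how long the browsing there went on.
per_site = defaultdict(list)
for url, when in visits:
    per_site[urlparse(url).netloc].append(when)

for site, times in sorted(per_site.items()):
    times.sort()
    print(f"{site}: {len(times)} visit(s) spanning {times[-1] - times[0]}")
```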

The company’s AUP allowed users to use their phones and PCs for minor non-work activities during work hours in order to clear up conflicts that might interfere with work. Although it was unlikely, the investigators looked for other records to see if the actions could somehow be interpreted as acceptable use. For example, the user could have logged into the swingers’ meet-up site to message a person he was scheduled to meet for a date after work, in order to let her know that he’d be working late. That might have netted him a mild scolding for having browsed a questionable site, but the intent behind the visit would have been less damning. It’s worth noting that some of the user photos were a bit risqué, but the pages that the user browsed to didn’t display any actual fully nude or prurient content.

The search proved fruitless; the investigators couldn’t find any exonerating data. The user hadn’t made any phone calls on his company phone and hadn’t sent any e-mails from his office PC that might (by association) change the context of his browsing. His further extensive browsing of pornographic movie reviews clinched it for management: this was deliberate misconduct under the AUP. Further investigation revealed that the accused had already received a stinging reprimand from the company for viewing adult content at work once before. [3]

The digital pictures that investigators unearthed in both of these cases were effectively identical – both were of topless, adult women. Had we presented the image files alone – devoid of context – to an HR rep, it’s nearly certain that both accused employees would have been terminated. As the larger stories show, that reaction would have been wholly inappropriate.

Firing a person for specious reasons in a boom economy, when he or she can instantly get a new job, is an injustice. Firing a person for specious reasons when jobs are terribly scarce and most people are one lost paycheck away from insolvency and homelessness is a vile travesty.

US Supreme Court Justice Potter Stewart once quipped, of the task of categorizing pornography, ‘I know it when I see it.’ With respect to the justice, I find his subjective approach to be an abdication of management’s duty to treat employees fairly and justly. Deciding the fate of an employee solely on one person’s arbitrary interpretation of a piece of content, devoid of any context, is potentially detrimental to the organization (i.e., if you fire someone over a misunderstanding, they’ll probably sue your company – and may win). Further, nearly any content can be considered ‘unacceptable’ by one manager and ‘acceptable’ by another when taken out of context.

Consider the beach photo problem: Bobbie complains to management that Bob sent her a sexually-explicit photograph, which constitutes sexual harassment. The photo in question is of Bob standing on a sunny, crowded beach with his wife and children, all of whom are cheerfully waving to the photographer. Bob sent the photo to everyone in his department while on vacation. The photo’s caption reads: ‘Having a great time at the beach. I’ll see you all next week.’ An assessment of the photo in context suggests that sexual arousal was probably not a factor in the taking or sending of the photo. Even though Bob was wearing very little in the photo, his clothing was appropriate to the setting in which the photo was taken.

That exact same photo can also be evidence of inappropriate user conduct: for example, Bob complains to management that his co-worker Bobbie has been stalking him. A search of Bobbie’s PC uncovers a large collection of digital photos of Bob (including the aforementioned family photo at the beach). Some of the photos of Bob appear to have been taken surreptitiously, perhaps with a camera phone, as Bob does not appear to be aware that he’s being photographed. Bobbie’s browser history shows that she frequently logs into a social media site and looks up Bob’s public profile – and then downloads every photo that Bob appears in from that site. Even though the beach photo itself isn’t inappropriate, the inclusion of it – among others, when placed into context – may constitute inappropriate (and creepy) conduct on Bobbie’s part.

These are, admittedly, simplistic examples. I offer them only to make the point that context matters more than content alone when it comes to ferreting out inappropriate behaviour. A responsible security leader has an obligation to the organization and to the accused both to find and present all of the evidence that has a bearing on a misconduct allegation. A responsible leader has a moral (and sometimes also a legal) obligation to weigh all of the factors involved in a potential misconduct case before finalizing anything resembling punishment.

Further, a responsible leader must not allow his or her personal biases about content to colour their interpretation of any specific piece of evidence; just because you find something to be distasteful doesn’t mean that others share your interpretation. Before taking action against an employee, review the evidence with a diverse group of people whose personal standards differ significantly from your own. Make sure that your personal standards (conservative or libertine) aren’t skewing your interpretation. Never rely on your personal perspective alone, even when you’re confident that you’re right. There may well be a nuance to the situation that you’re blind to.

Diversity isn’t just an HR buzzword; when evaluating evidence of suspected wrongdoing, it’s critical to bring as broad a range of perspectives to bear on the evidence analysis process as possible.

Lastly, a responsible leader must consider the questionable content separately from the accused employee’s intent. Was the accessing or transmission of the content meant to titillate? To traumatize? To intimidate? To offend? Or was it something meant inoffensively that was misperceived when it was taken out of context?

While it’s extraordinarily difficult to know what’s going on in another human being’s private thoughts, there sometimes are external indicators that help to suggest what a suspect was thinking or feeling at the time of an alleged incident. As corporate IT security experts, we’re usually not held to the same standards as prosecutors and police officers. We’re also not as restrained – in many companies, a manager can deprive an employee of his or her economic viability on a whim. Therefore, because our power to degrade and destroy the lives of others is sometimes greater than that of the instruments of the state, it strikes me that we should expend greater effort than the state does in trying to divine a suspect’s intent before we take an irreversible negative action against them.

If the evidence corroborates the accusation of misconduct, so be it. The time wasn’t wasted, and whatever necessary administrative action HR decides to take can be executed with a clean conscience. If, however, the evidence suggests a possible alternate – and less harmful – intent behind the incident, then a miscarriage of company justice can be averted. If a valuable employee can be rehabilitated and spared the trauma of an overzealous punishment, then we’ve done some much-needed good in the world.


[1] If you ever see the prurient-sounding initialism ‘IANAL’ in an online discussion, that’s what it stands for. Very easy to mistake for something provocative, especially if you assume that there’s a comma between the I and the first A.

[2] Oddly enough, no charges were ever filed by management. The user who engineered the frame-up got away with it. The framed supervisor was later terminated for cause for sexual harassment, but only after his disgruntled co-workers tried several more times to frame him for inappropriate behaviour – no charges were ever filed against the miscreants in those events, either.

[3] The accused lost his job and appealed. The forensic investigators’ testimony about their investigation methodology and findings convinced the judge that the termination was warranted.


POC is Keil Hubert, keil.hubert@gmail.com
Follow him on twitter at @keilhubert.
You can buy his books on IT leadership and IT interviewing at the Amazon Kindle Store.

Keil Hubert is a retired U.S. Air Force ‘Cyberspace Operations’ officer, with over ten years of military command experience. He currently consults on business, security and technology issues in Texas. He’s built dot-com start-ups for KPMG Consulting, created an in-house consulting practice for Yahoo!, and helped to launch four small businesses (including his own).

Keil’s experience creating and leading IT teams in the defense, healthcare, media, government and non-profit sectors has afforded him an eclectic perspective on the integration of business needs, technical services and creative employee development… This serves him well as Business Technology’s resident U.S. blogger.

