Early in a forensic investigator’s career, many are cautioned not to rely solely on the tools used to study forensic artifacts. Knowing where the information displayed by a tool comes from is important, and it is often raised in deposition or at trial with the inevitable question, “What is the basis for this finding?” I recently worked on cases that highlighted the importance of that early caution and caused me to pause and reflect on how investigations are conducted generally.
The first had a familiar fact pattern: Mr. X leaves a company and moves to a competitor. He may have documents related to the former employer and may have accessed these files while working for the competitor. We were to review an opposing expert’s report, perform our own analysis, and submit a report detailing our findings.
The opposing expert found what appeared to be evidence of file access and use, backing it up with exhibits from the output of his tool of choice. His report indicated that the activity was pervasive and occurred frequently, right up to the date Mr. X’s computer was returned to the former employer.
In our review of his exhibits, we realized that while the reporting from the tool was accurate, it was misleading. The expert had doubled the frequency of connections in his report by misattributing artifacts that actually related to a registry backup. This, in addition to the expert’s incorrect findings related to the machine’s time zone settings, ended up casting doubt on his conclusions.
In the second case, two opposing experts had already opined on the editing history of a single Microsoft Word document. Our task was to uncover facts that might support a positive outcome for our client after the case was remanded for trial on an unrelated technicality.
Both experts relied on a third-party tool to report internal metadata. Each expert made essentially the same findings, which were generally accepted at the original trial.
Our testing showed that the tool reported metadata that did not exist in the native files. This happened because the tool leveraged the native application to open the files prior to reporting, which itself generated the metadata in question. We documented this effect and relied on other proven, defensible methods to demonstrate that the original analysis was incorrect.
These cases highlight the importance of knowing the source of the data reported by a forensic tool and, more importantly, understanding that information in context. This is not a criticism of any tool, but a reminder to be diligent when investigating and to always have your findings reviewed by a trusted peer.
iDS integrates a peer review process into our report writing and testimony workflows. Together with a healthy skepticism of the various tools we use, this process has yielded compelling results and positive case outcomes for our clients.
The opinions reflected in this post are solely those of the author and do not necessarily represent the views of iDiscovery Solutions (iDS) or any current or former employee of iDS. Moreover, any references to specific litigation or investigation work or findings are fact-specific.