EDRM Diagram: increased focus on Information Governance

EDRM.net recently published the third version of its Electronic Discovery Reference Model (EDRM) diagram. This version renames what was previously called “information management” to “Information Governance” and highlights the importance of sound information governance throughout the EDRM process.

EDRM.net is also the creator of the Information Governance Reference Model. The link between the two models is now clearer than ever.

Posted in E-Discovery, EDRM, Governance, Information Management

Action required: Heartbleed vulnerability

What is Heartbleed?

Heartbleed is a vulnerability in OpenSSL, a popular software library used by many websites and network devices to provide secure connections. The vulnerability exists due to a logic error in the OpenSSL code. This flaw allows criminals to access parts of a web server’s memory that may contain sensitive information.

How serious is this problem?

Very serious. The Heartbleed defect could expose information such as usernames and passwords, credit card details and other sensitive data sent by users to websites, network devices or mail servers. Web technologies are also present in devices that are not web servers, meaning you may have more at-risk technology than is immediately obvious. There is some indication that certain web browsers may be affected, although specifics are not yet known.
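
As a rough first-pass check on individual hosts (only one small part of a full assessment), you can compare the OpenSSL version a machine reports against the known-affected range: OpenSSL 1.0.1 through 1.0.1f are vulnerable, while 1.0.1g and later, and the older 0.9.8 and 1.0.0 branches, are not. The sketch below is a hypothetical illustration of that comparison and assumes the openssl command-line tool is installed on the host; note that some Linux distributions backport the fix without changing the reported version string, so a clean result here is not conclusive.

```python
# Minimal sketch: flag a host whose local OpenSSL build falls in the
# Heartbleed-affected range (1.0.1 through 1.0.1f). Illustrative only;
# a real assessment would also probe remote services and appliances.
import re
import subprocess

VULNERABLE = re.compile(r"^1\.0\.1[a-f]?$")  # 1.0.1 to 1.0.1f; 1.0.1g is fixed

def local_openssl_version() -> str:
    # Typical output: "OpenSSL 1.0.1e 11 Feb 2013"
    try:
        out = subprocess.run(["openssl", "version"], capture_output=True, text=True)
    except FileNotFoundError:
        return "unknown"
    return out.stdout.split()[1] if out.stdout else "unknown"

def is_heartbleed_affected(version: str) -> bool:
    return bool(VULNERABLE.match(version))

if __name__ == "__main__":
    version = local_openssl_version()
    if is_heartbleed_affected(version):
        print(f"OpenSSL {version}: likely affected; patch, then reissue keys and certificates")
    else:
        print(f"OpenSSL {version}: not in the known-affected version range")
```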

How KPMG can help

KPMG’s member firms have designed and implemented Cyber Security capabilities in some of the world’s largest corporations and assisted clients in handling complex security breaches. This insight provides our teams with a unique viewpoint on the building blocks for detecting and defending against cyber criminals. KPMG can assist in addressing the Heartbleed issue by:

  • Assessing systems and networks for the presence of the vulnerability
  • Performing forensic analysis of affected systems and supporting networks to identify indicators of abuse
  • Analyzing the risk associated with compromised systems

For more information on Heartbleed (including a quick decision tree), on our related services and on whom to contact across Canada, please consult our Heartbleed Slipsheet.

Posted in Cyber Security, Information Security

Roadmap for the Long Term Preservation of Digital Assets

One year after the release of the UNESCO/UBC Vancouver Declaration (PDF) on the preservation of digital information, UNESCO has taken steps to achieve the objectives set out in the declaration.

It all started in September 2012 in Vancouver at the UNESCO Memory of the World conference, which we had the opportunity to attend. The declaration was released a few months later, in January 2013.

Following the declaration, an international conference was held in The Hague in December 2013 with the aim of bringing together stakeholders from ICT, government and heritage organizations. “The discussion between such diverging stakeholders showed that there was insufficient awareness between industry and heritage institutions about the relative concerns of the other and that there needed to be a platform to discuss digital preservation”.

In response, UNESCO has set up PERSIST (Platform to Enhance the Sustainability of the Information Society Transglobally), with the aim of encouraging collaboration and discussion among industry, heritage institutions and government on a variety of long-term preservation issues and of building a proper roadmap within a year.

For more details, we suggest reading the executive summary (PDF) of the December 5-6 conference.

Posted in Governance, Information Management, Records Management, Technology

Organizational culture check

Information management initiatives focus in part on explicit, recorded information; in other words, systems are implemented that support the management of tangible assets embodied in such documents as reports, articles, meeting minutes, and training materials. In contrast, tacit knowledge, or the expertise and know-how residing in the heads of individuals, is an indispensable intellectual asset of organizations that can also be harnessed and utilized in the implementation of new systems and initiatives. Moreover, technology alone will likely be unsuccessful without a supportive organizational culture.

Research on organizational culture and values sheds light on effective strategies, favourable cultural characteristics, and best practices to enhance knowledge sharing and ultimately increase overall organizational effectiveness and competitiveness.

Proven practices:

  • Cooperation: Build cooperative relationships among employees by providing opportunities for collaboration and learning
  • Trust: Use language and behaviours that cultivate trust and demonstrate a high level of corporate commitment to employees to increase knowledge sharing
  • Organizational memory: Encourage mindfulness and respect for the past to develop a culture that improves the preservation of knowledge within the organization
  • Autonomy: Give employees freedom in their work (e.g. scheduling, approaches) to improve their willingness to collaborate
  • Avoid hierarchy and competition: A focus on rules, standardized procedures, and productivity hinders knowledge sharing

Knowledge management principles that focus on transferring tacit knowledge between individuals, and to the organization itself through its corporate memory, represent an invaluable process with significant potential for positive outcomes, and should therefore be a consideration in the planning and implementation of information management initiatives.

Posted in Information Management, Knowledge Management, Uncategorized

Who said records and information management (RIM) was dull?

Did you know U2 has written songs about RIM? Or is it the other way around?

Jeffrey Lewis, a RIM expert, developed his own Information Governance playlist from U2 songs in his latest post for AIIM’s Expert Blogs.

An article mixing U2 and RIM? Nothing could be better a few days before Christmas!

What about you? What would your favourite RIM playlist be?

Posted in Governance, Information Management, Information Security, Records Management

KPMG Canada Helps Develop Technology that Uses Text Analytics for Priv Review

Technology-assisted review (“TAR”) is being more widely adopted as a way of saving time and money in large-scale document reviews. But virtually everyone assumes that the advanced text-analysis algorithms underlying TAR and embedded in tools like Relativity, Recommind, Clearwell and Equivio can really only be trusted to perform basic relevance coding, so-called first pass review or maybe issue-based classification.

Can you think of anyone who has suggested that these tools can perform the much more challenging and nuanced task of identifying privileged content? Most lawyers would scoff at the idea. And for very important reasons, their scepticism must be taken seriously. Entrusting priv review to a machine is a high-stakes proposition. Some work in this area as part of TREC 2010 (Interactive Task 304, see pp. 33-35) yielded poor results.

But what if a tool could be trusted to find the kind of text that lawyers should look at, to assess both privilege and waiver? And what if that tool did a better job of finding that kind of text than the average reviewer? Even a well-trained reviewer?

Working with Porfiau, a well-credentialed software development company run by a team of text-classification experts with connections to the University of Waterloo, KPMG Canada has both helped to enhance the computational tool built by Porfiau and developed a set of workflow protocols whereby a new generation of text-classification technology appears to have achieved this goal. The technology uses finite state machines, and the workflow is much like those already adopted in TAR situations (start with a seed set, build the algorithm, get a first set of results, have an SME code a sample, retrain the algorithm, and so on). Initial results, based on several iterations and retrainings against a target population of over 300,000 documents, strongly suggest that Porfiau’s technology, together with carefully designed processes, can identify potentially privileged content with a level of recall of 0.90 or higher.
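
To make that workflow concrete, here is a minimal, hypothetical sketch of the general iterative retraining loop described above. It uses a plain TF-IDF plus logistic-regression classifier from scikit-learn as a stand-in; it is not Porfiau’s finite-state-machine technology, and the function names, batch size and threshold are illustrative assumptions only.

```python
# Minimal sketch of a TAR-style retraining loop applied to privilege review.
# Stand-in classifier only (TF-IDF + logistic regression); Porfiau's
# finite-state-machine technology and KPMG's protocols are not shown here.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def tar_privilege_loop(documents, seed_idx, seed_labels, sme_review, rounds=5, batch=200):
    """documents: list of document texts.
    seed_idx / seed_labels: indices and 1/0 privilege labels for the seed set.
    sme_review: callable that takes a list of document indices and returns SME labels.
    """
    vectorizer = TfidfVectorizer(min_df=2, stop_words="english")
    X = vectorizer.fit_transform(documents)
    labeled = dict(zip(seed_idx, seed_labels))
    clf, scores = None, None

    for _ in range(rounds):
        idx = list(labeled)
        clf = LogisticRegression(max_iter=1000, class_weight="balanced")
        clf.fit(X[idx], [labeled[i] for i in idx])

        # Score the whole population and send the highest-scoring unreviewed
        # documents (most likely privileged) to the SME for coding.
        scores = clf.predict_proba(X)[:, 1]
        ranked = [i for i in np.argsort(-scores) if i not in labeled]
        to_review = ranked[:batch]
        labeled.update(zip(to_review, sme_review(to_review)))  # feeds the next retraining

    return clf, scores

def recall(true_labels, scores, threshold=0.5):
    """Fraction of truly privileged documents flagged at the given threshold."""
    flagged = np.asarray(scores) >= threshold
    positives = np.asarray(true_labels) == 1
    return flagged[positives].mean() if positives.any() else float("nan")
```

Recall, as used here and in the paper, is the fraction of truly privileged documents that the process actually flags; the 0.90 figure above refers to the results reported for Porfiau’s tool and the accompanying protocols, not to this toy stand-in.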

And the tool has consistently found important potentially privileged material that the lawyers missed.

This work is described in the final section of a paper that I co-wrote with Chris Paskach and Manfred Gabriel of KPMG US, The Challenge and Promise of Predictive Coding for Privilege [PDF]. The paper is one of four selected by peer review for presentation at the DESI V Workshop in Rome this coming Friday.

Posted in Conference, Document Review, E-Discovery, Legal Technology, Predictive Coding, Technology

Canadian Judicial Council develops eDiscovery Cost Benchmarks

In a discussion paper sponsored by the Canadian Judicial Council entitled “Guidelines on Benchmarking of Costs,” Sandra Potter of Indicium Legal has presented for discussion an approach to determining appropriate costs for standard ediscovery services. The paper should serve as a useful source of guidance, even as a sanity check, for anyone trying to figure out what is a reasonable price for ediscovery services, whether they are offering those services or paying for them.

As stated in the Foreword: “These guidelines have been prepared by the Canadian Judicial Council to assist courts. Judges retain discretion in making any cost orders in a particular matter. Use of these guidelines should assist both lawyers and judges in considering e-discovery as an affordable tool, particularly when pursued in proportion to the scope of the litigation.”

Noting the recent Practice Directions issued by BC and Alberta, as well as the CJC’s own National Standards, the paper argues for the value of benchmarks for the pricing of ediscovery services. Benchmarks, it argues,

1. … provide a guide for the Courts to assess costs in a matter where technology has been used to assist with the litigation
2. … ensure that Law Firms which choose to do this type of work in house are still able to get money back for their client if they win a matter; [and]
3. … provide a guideline to firms and end clients alike as to how the Court might rule on such costs and provide a predictive costing model.

The report provides line-item detail for services in all of the following categories:

1. Document Preparation
2. Database Creation
3. Numbering (Electronic Bates)
4. Scanning
5. Objective Coding
6. Processing Electronic Files
7. Database Management
8. Determination of Production Set (Legal Analysis)
9. Project Management
10. Examination for Discovery
11. Common Trial Book Preparation
12. Hearing Preparation
13. Hearing
14. Appeal Preparation

The paper does more than present sample costs (which are derived mostly from consultations with jurists and lawyers in B.C.); it also offers a method for developing benchmarks in other jurisdictions. For this broader, forward-looking initiative, it draws on definitions and approaches already in place in B.C., Ontario, N.S. and elsewhere. And the framework it presents allows interested parties to continue this important work on a local basis.

Anyone working in this area will have put together his or her own cost breakdowns and lists of vendor pricing. The CJC’s approach offers a way of standardizing these fragmented, apples-and-oranges efforts so that we can all begin to speak the same language. Not to get too excited (we are Canadian, after all), but this has the makings of an EDRM-style analytical framework for understanding ediscovery costs.
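
As a toy illustration of how standardized line items could be put to work once local benchmarks exist, the sketch below totals a matter’s estimates and flags any item that falls outside its benchmark range. The category names echo the paper’s headings, but every rate and range in the example is an invented placeholder, not a figure from the CJC paper or from any consultation.

```python
# Toy illustration: compare a matter's estimated line items against local
# benchmark ranges. All dollar figures are invented placeholders, not
# values drawn from the CJC discussion paper.
from dataclasses import dataclass

@dataclass
class LineItem:
    category: str       # e.g. one of the paper's service categories
    estimate: float     # estimated cost for this matter
    bench_low: float    # low end of the local benchmark range
    bench_high: float   # high end of the local benchmark range

def review_estimates(items):
    total = 0.0
    for item in items:
        total += item.estimate
        if not (item.bench_low <= item.estimate <= item.bench_high):
            print(f"{item.category}: ${item.estimate:,.2f} is outside the benchmark "
                  f"range ${item.bench_low:,.2f} to ${item.bench_high:,.2f}")
    print(f"Total estimated ediscovery cost: ${total:,.2f}")

# Hypothetical example with placeholder numbers only
review_estimates([
    LineItem("Processing Electronic Files", 12000, 8000, 15000),
    LineItem("Objective Coding", 9500, 4000, 7500),
    LineItem("Project Management", 6000, 3000, 9000),
])
```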

The CJC’s Discussion Paper is an important contribution to the ongoing work of developing meaningful, intelligible metrics that can guide how ediscovery costs are estimated, compared, calculated and awarded.

We will be revisiting the thorny issue of pricing and pricing models — and what might be a better approach to RFPs and vendor proposals — in future posts.

Posted in E-Discovery, Pricing

The Vancouver Declaration on Digitization and Preservation finally released!

On January 17th, the Vancouver Declaration on Digitization and Preservation [PDF] was finally released.

The Declaration was written and adopted during the Vancouver conference held in September. Its purpose is to set goals for protecting the world’s digital assets and ensuring long-term access to them.

The Declaration brings together many recommendations and asks political and economic institutions to collaborate to safeguard our digital archives.

For now, we hope that the Declaration will have an impact and that UNESCO will have the strength and influence to promote its recommendations and put them into practice.

Posted in Conference, Information Management, Technology, Uncategorized

Predictive Coding in Unpredictable Order from US Chancery Judge

Over the last year, there has been some important but still fairly cautious movement in the US courts on two fronts: (1) increased willingness to accept computer-assisted review (“CAR”) as a valid and defensible means of reducing the cost of discovery, and (2) greater encouragement of cooperation between parties during the discovery phase. The most salient decisions have been da Silva Moore (finding that CAR can be an appropriate technology in large and costly cases) and Kleen Products (strongly encouraging the parties to cooperate).

Now we have a Vice Chancellor of the Delaware Chancery Court going far beyond his colleagues and actually ordering the parties to adopt CAR and even to use the same vendor to host their data.

Many of us who have been active in the eDiscovery field for some time have been arguing for the adoption of advanced technologies and for greater cooperation between parties in the choice and deployment of particular technologies, but this order will likely take even the most ardent advocates by surprise.

There is already significant chatter about this ruling, much of it going so far as to suggest that Vice Chancellor Laster went beyond his proper role when he forced a particular approach to discovery on the litigants. Fair enough: Laster is essentially declaring here that no reasonable party can refuse both to adopt CAR and to agree to have its data hosted by the same vendor that is hosting the opposing side’s data. Still, it is more than likely that, once feathers have settled back into place, this ruling will be seen as part of a trend in which judges will move litigants – however reluctant they may be – in the direction of sensible, defensible, responsible and cost-effective means of streamlining the eDiscovery process. There is clearly a role for CAR technology in appropriate cases – particularly in early, first-pass binary responsiveness reviews involving vast amounts of data. There are also significant logistical and cost benefits to parties’ agreeing to share technical resources and developing standardized protocols for processing, search, analysis, review and production. Choosing an appropriate vendor to host all case data (with appropriate permissions, partitions and safeguards) is an avenue that many litigants will want to explore.

Clearly there is much here to debate. Watch for more discussion of this important ruling.

This case: EORHB, Inc., et al v. HOA Holdings, LLC, C.A. No. 7409-VCL (Del. Ch. Oct. 15, 2012)

Da Silva Moore v. MSL Group (Case 1:11-cv-01279-ALC-AJP) (S.D.N.Y.)

Kleen Products, LLC, et al. v. Packaging Corp. of Amer., et al., Case: 1:10-cv-05711, Document #412 (N.D. Ill., Sept. 28, 2012)

HT: Ralph Losey.

Posted in Case Law, Document Review, E-Discovery, Legal Technology, Predictive Coding, Technology, Uncategorized, USA