Audit crackdown as FSA taps FRC for info exchange

FSA's HQ at 25 The North Colonnade in Canary Wharf

The Financial Services Authority (FSA) and the Financial Reporting Council (FRC) have agreed a new memorandum of understanding (MOU) to enable a greater degree of cooperation and information exchange.

According to the FRC, the MOU underpins the increased dialogue between the FSA and the FRC on accounting and disclosure issues. That collaboration has been in place since 2005, the FRC said, and follows the joint discussion paper on the audit of financial institutions published in June 2010.

The new agreement will deliver a closer working relationship between the FSA and the FRC’s Audit Inspection Unit (AIU) and will improve oversight of the audits of authorised firms, the regulator said.

The AIU operates a risk-based system of audit inspections and its scope has been extended to include all banks incorporated in the UK to better support markets and the prudential regulator.

The FSA and FRC will assist each other in the performance of their respective functions by providing timely information, subject to any legal constraints. Where information shared is subject to confidentiality undertakings, both regulators will handle the information in accordance with those requirements.

Richard Thorpe, the FSA’s accounting and auditing sector leader, said: “Our recent discussion paper highlighted our concerns that some auditors may not be exercising sufficient professional scepticism in their approach to the audit of key areas of management judgement. The FSA relies on audited financial information to meet its regulatory objectives and it is imperative that we have confidence in the information provided to us – this MOU is a public statement of the way we will share information with the Audit Inspection Unit at the FRC. Sharing information with the FRC will go some way to mitigating our concerns.”

Paul George, Director of Auditing at the FRC added: “It is vital that audit serves the interests of the UK’s capital market by providing relevant and high quality information to prudential regulators as well as the market. This agreement ensures that the dialogue established during the financial crisis is both durable and meaningful. By working more closely with the prudential regulator we will enhance our collective ability to identify and correct weaknesses in the quality of audited information being provided to regulators and the market.”

Filed under Accountability, Audit, Corporate governance, Oversight, Regulatory compliance, Reporting, Risk management

FRC to consult on ‘priorities and funding’ for 2011

GRC Analytics – The Financial Reporting Council has published its priorities and funding proposals for 2011-12 in a paper entitled ‘Draft Plan and Levy Proposals 2011-12’.

The consultation document can be found on the FRC website at www.frc.org.uk/about/plans.cfm.

The FRC said that responses to the paper will be accepted until 9 March 2011.

According to the FRC, its work in 2011/12 will focus on four ‘key outcomes’:

The first is ‘stronger and better-informed engagement between institutional investors and company boards’.

The second is ‘corporate reporting and auditing that deliver greater value to investors and better serve the public interest’.

The third is ‘a strong UK voice in the EU and international debate on the future regulation of corporate governance, reporting and auditing’.

The fourth is ‘a more effective UK regulatory framework that adds value for investors and other stakeholders at low incremental cost’.

The FRC said it will finalise its 2011/12 Plan and Levy Proposals in May 2011, taking account of responses to the consultation paper and ‘any outcome of the work by the Government on the reform of the FRC’.

FRC Chairman, Baroness Hogg, commented: “The FRC’s work in the coming year will focus on the key ingredients of corporate reporting, governance and audit, all of which support the integrity, efficiency and competitiveness of the capital market.”

Hogg continued: “We will encourage a more productive dialogue between investors and company boards and will strengthen corporate reporting and auditing frameworks to ensure they better serve the needs of investors and the public interest.”

“We must also continue to influence the international policy and regulatory agenda,” Hogg stated. “In 2011, the European Commission will develop its proposals on the future of audit and corporate governance and it is vital that the FRC is involved in those debates to ensure that the UK voice is heard.”

Hogg added: “Reform of the FRC by Government strengthens our independence and effectiveness.”

Filed under Accountability, Audit, Corporate governance, Oversight

FRC speaks out on audit committees and actuarial info

Brace of reports aims to reassure on reporting and financial data

GRC Analytics – The Financial Reporting Council has just published two new documents highlighting the latest challenges faced by audit committees and users of actuarial information.

According to the FRC, this year’s “Update for Audit Committees” focuses on risk identification and reporting, and ‘seeks to stimulate an appropriate environment for key estimates, assumptions and models produced by management to be challenged in a constructive way and for providing support for auditors carrying out their work with an appropriate degree of professional scepticism.’

The FRC said its second document for the current year, “Update for users of actuarial information”, is particularly relevant to the governing bodies of insurers and pension schemes, but may also be useful for scheme sponsors, auditors and audit committees.

The focus of the actuarial document is on quality controls for actuarial work, understanding the business model and the way cash flows are projected and reported, and the way risks are assessed and managed.

Stephen Haddrill, Chief Executive of the FRC, commented: “Many companies and pension schemes did sterling work last year to make sure that all material issues were captured properly and reported in an appropriate way in their financial reports. Whilst insolvencies have passed a peak and some asset prices have recovered, the nature of the crisis has changed significantly. Those exposed to government contracts will need to devote more resources to their forecasts and assessments this year.”

Louise Pryor, Director of the Board for Actuarial Standards (BAS), added: “The BAS has nearly completed its new outcome-focused standards which aim to ensure that users can place greater reliance on actuarial information. However, the main users of actuarial information may not yet be making full use of these new standards in monitoring the quality of actuarial work. They may find these questions useful as they exercise their responsibility for taking decisions and reporting on the basis of actuarial information and advice.”

Filed under Audit, Corporate governance, Regulatory compliance, Reporting, Risk management

Brinqa hits the big time in GRC platforms

Brinqa GRC Platform 3.0 simplifies governance, risk and compliance for large organisations

GRC Analytics – Brinqa has just launched version 3.0 of its GRC Platform, which offers capabilities for policy and compliance management, process governance, incident management, and threat and vulnerability management. In contrast to the traditionally manual, inefficient and costly processes that characterise risk and compliance, Brinqa establishes a centralised, integrated and re-usable platform for GRC programs, enabling customers to improve risk posture, minimise compliance costs, and address current risk issues while enabling automation for future GRC initiatives.

Brinqa GRC Platform 3.0 manages the complete lifecycle of policies, processes, and controls from a centralised repository within the enterprise. This ensures consistent mapping to regulations, industry mandates, frameworks, standards and best practices, as well as efficient communication, audit, and enforcement of policies.

Further simplifying GRC initiatives, Brinqa said, the platform intelligently maps business policies to the processes and controls that implement those policies. Low-level measurements are gathered in near real time with Brinqa’s agent-less connectors and translated into relevant business terms that executive management can use in making strategic business decisions. The Brinqa GRC Platform also manages policy approval processes, revisions, audit history, and global updates.

New features in Brinqa’s GRC Platform 3.0 include event-based assessments with multi-point distribution to support multiple respondents; hierarchical policies with the ability to override parent policy sections; policy review in natural language; cloud security and data protection standards, controls and compliance reports; updated regulations, industry mandates, frameworks and standards; and simplified administration of access controls through pre-configured access roles that limit access to what the user needs to do their job.
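
By way of illustration only, the sketch below shows one way hierarchical policies with parent-section overrides could be modelled. The class and field names are invented for this example and are not Brinqa’s API or implementation.

```java
// Hypothetical sketch of hierarchical policy resolution: a child policy
// inherits its parent's sections and may override individual sections.
// All names here are illustrative, not taken from Brinqa's product.
import java.util.LinkedHashMap;
import java.util.Map;

public class PolicyHierarchyDemo {

    static class Policy {
        final String name;
        final Policy parent;                         // null for a top-level policy
        final Map<String, String> sections = new LinkedHashMap<>();

        Policy(String name, Policy parent) {
            this.name = name;
            this.parent = parent;
        }

        // Effective sections: start from the root's sections, then apply each
        // descendant's overrides on the way down.
        Map<String, String> effectiveSections() {
            Map<String, String> result = (parent == null)
                    ? new LinkedHashMap<>()
                    : parent.effectiveSections();
            result.putAll(sections);                 // child sections override the parent's
            return result;
        }
    }

    public static void main(String[] args) {
        Policy corporate = new Policy("Corporate password policy", null);
        corporate.sections.put("min-length", "8 characters");
        corporate.sections.put("rotation", "90 days");

        Policy trading = new Policy("Trading desk password policy", corporate);
        trading.sections.put("rotation", "30 days"); // overrides the parent section

        trading.effectiveSections()
               .forEach((section, text) -> System.out.println(section + ": " + text));
        // min-length: 8 characters
        // rotation: 30 days
    }
}
```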

“Enterprises must address their GRC requirements as a whole versus implementing disparate products that still mandate manual processes” commented Amad Fida, President of Brinqa. “Current GRC processes are not only time-consuming and error-prone, but they must also be repeated from scratch every time an update is required. Brinqa 3.0 establishes the reusable services that underlie all GRC programs, which increases data and control quality while reducing costs. With Brinqa, customers can design the infrastructure once and know they are leveraging existing technology investments through supported integrations. Most importantly, any future business requirements for managing risk, privacy and business continuity will be supported since we enable reusability across GRC programs.”

Brinqa said the GRC Platform is a Java-based application that runs in all standard Java web containers on most major platforms. The high-performance back-end repository leverages an RDBMS server for storing policies, processes, controls, assessments, incidents, and a complete audit trail of all GRC-related activities.

Filed under Corporate governance, Regulatory compliance, Reporting, Risk management

M&A integration woes as metadata management moves centrestage

Metadata Management: An Essential Component Of Integration Projects

Tim Cianchi, Business Unit Leader at Zuhlke, looks at the impact metadata management is having on business IT systems, and considers how organisations can approach systems and information in a way that addresses both general business efficiency and integration requirements.

For as long as organisations have been recording business information there have existed a number of issues surrounding data integrity, accuracy and management that, if addressed correctly, have resulted in competitive advantage for the companies involved.

This is no secret and comes as no surprise. 

In fact it is common sense that a business whose data is well-structured will demonstrate general efficiency, agility and capacity gains over less organised counterparts.

Auditing and reporting are obvious day-to-day business functions which will benefit from well structured data and optimised management systems. However, the level of control over an organisation’s business information becomes of exceptional importance at a number of critical junctures, such as system integration, when the information and systems belonging to previously discrete companies or business units are required to integrate.

The most common integration scenarios arise from mergers and acquisitions, replacement of existing systems, and implementation of interfaces to a new business partner or department.

In all these cases, data flows from one system (or collection of systems) into other systems. For data to be correctly processed (understood as information) in the receiving system, its elements must be correctly transformed and mapped into the expected formats.

This mapping is rarely a trivial exercise since in most cases the systems exchanging information have been designed and built by different teams at different times and with reference to different standards (if any). Leaving aside technical aspects relating to message transport protocols and packaging formats, the primary problem for an integrator is to ensure that the business semantics (meaning) of the data is retained as it travels from one system to another.

We have seen many examples, both within and between organisations, where this is much harder than it might at first seem. Because IT systems have typically evolved in isolation from each other, the vocabularies used to describe data elements usually differ significantly. A common problem is that of synonyms and homonyms: synonyms, where the same data entity is described with different names in different systems, and homonyms, where fields having the same name in different systems describe different data entities. A more complex problem is a mismatch between the sets of elements needed to describe a unit of business information.

For example, one system might require three fields to describe an address, and the other system four fields. Further complex problems arise with references to “master data”, where each system has copies of information relating to third-party or common data, but stored using different codes, identifiers and structures. Even worse, systems often have highly conditional datasets, for example “if field A has value X, then fields B and C must be present and mean this, otherwise field D is used and means that”. These rules need to be captured and understood in order to preserve meaning. The information describing the data entities, their semantics and the rules for validating, enriching and transforming them are collectively known as “metadata”.
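
To make the mapping problem concrete, here is a minimal sketch in Java; the field names, the address split and the trade-status rule are invented for illustration, not taken from the article. It shows a synonym rename, a single address field split across several target fields, and a conditional rule of the "if field A has value X" kind described above.

```java
// Illustrative field mapping between two systems with different vocabularies.
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldMappingDemo {

    static Map<String, String> mapToTargetSystem(Map<String, String> source) {
        Map<String, String> target = new LinkedHashMap<>();

        // Synonym: the source calls it "custName", the target calls it "clientName".
        target.put("clientName", source.get("custName"));

        // Structural mismatch: one "address" field becomes street / town / postcode / country.
        String[] parts = source.get("address").split(",\\s*");
        target.put("street",   parts.length > 0 ? parts[0] : "");
        target.put("town",     parts.length > 1 ? parts[1] : "");
        target.put("postcode", parts.length > 2 ? parts[2] : "");
        target.put("country",  parts.length > 3 ? parts[3] : "GB");

        // Conditional rule: settlement fields only apply to executed trades.
        if ("EXECUTED".equals(source.get("tradeStatus"))) {
            target.put("settlementDate", source.get("valueDate"));
            target.put("settlementCcy",  source.get("currency"));
        } else {
            target.put("cancellationReason", source.get("reasonCode"));
        }
        return target;
    }

    public static void main(String[] args) {
        Map<String, String> source = new LinkedHashMap<>();
        source.put("custName", "Acme Holdings");
        source.put("address", "1 Poultry, London, EC2R 8EJ, GB");
        source.put("tradeStatus", "EXECUTED");
        source.put("valueDate", "2011-02-14");
        source.put("currency", "GBP");

        mapToTargetSystem(source).forEach((k, v) -> System.out.println(k + " = " + v));
    }
}
```

Every rule of this kind – the rename, the split, the condition – is metadata, and it is exactly this knowledge that tends to get lost inside application code.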

Historically, metadata has not been treated with the respect it deserves.

With luck it is present in a relatively complete form when a system is first proposed as a specification. During implementation it ends up embedded in a variety of forms, some of which (for example database schema) can be easily interrogated, and many of which (such as application programs) cannot. It is not that some forms intentionally hide their metadata, but rather that there was no requirement at the time to expose it.

Following initial deployment, the system is modified through a series of change requests and bug fixes and the original specification rapidly becomes outdated and no longer corresponds to the actual implementation.

In this fairly standard application lifecycle, the metadata becomes scattered across a number of artefacts, and at any point where it is necessary to reconstruct it for a given purpose, such as implementing a new interface, a costly manual exercise involving business analysts and/or developers is required. This typically results in a metadata document – perhaps a spreadsheet – which may be accurate at the time of production, but is never maintained, and again rapidly becomes out of date. The whole exercise has to be repeated when the next interface request comes along or a client of the system demands to know how a particular set of information is derived.

This typical approach, where there is no explicit metadata management, has been acceptable in the past, and may remain so in the future where systems and their interfaces are simple, stable, and where the costs and risks of unmanaged metadata can be reasonably borne. However many organisations today are coming to realise that it is critical to have an explicit and accurate representation of their data stores and flows. This is particularly true of organisations whose core business is conducted through large and complex IT systems which have grown over many years through combinations of departmental and organisation mergers using both home grown and packaged solutions.

Active metadata management is currently gaining traction within these kinds of organisation, for two main reasons: firstly, the cost and time involved in repeatedly recovering metadata by hand to meet new business requirements has been recognised and found to be unacceptable, especially in the current economic climate; and secondly, awareness of metadata management, and tool support for it, has improved significantly over the past few years.

So let’s look at what it means to manage metadata.

Firstly, metadata is just data about other data and its relationships. One characteristic of metadata is that the data volumes are typically small in comparison to the data described – a few megabytes of metadata might be more than enough to describe terabytes of transaction data. On the other hand, the complexity of metadata can be high, since it has to deal with complex and conditional relationships.

A vital aspect of metadata management is governance – since the point of collecting and maintaining the metadata in the first place is to provide a “source of truth” about the systems described, it is critical to ensure the quality and currency of the metadata, and to have a governance process that describes how metadata changes are authorised. A metadata repository therefore needs both a good version control system, with the ability to highlight changes between versions, and workflow support to implement governance.

To ensure accuracy and currency of the metadata, the ideal solution is to refresh metadata from actual deployed systems. A good metadata tool will be able to automatically import data (scheduled or on demand) from a number of sources, such as database and XML schemas, mapping tools, report definitions, and other application artefacts, as well as being able to cope with static sources such as spreadsheets and comma separated files.
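
As a simple illustration of automated harvesting, the sketch below reads table and column definitions from a live database through the standard JDBC DatabaseMetaData interface. The connection URL, schema name and credentials are placeholders, and a suitable JDBC driver is assumed to be on the classpath.

```java
// Minimal sketch: harvest column-level metadata from a deployed database.
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class SchemaHarvester {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; adjust for the system being harvested.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/trading", "metadata_reader", "secret")) {

            DatabaseMetaData dbMeta = conn.getMetaData();
            try (ResultSet columns = dbMeta.getColumns(null, "public", "%", "%")) {
                while (columns.next()) {
                    // Each row describes one column: enough to seed a repository entry.
                    System.out.printf("%s.%s : %s (nullable=%s)%n",
                            columns.getString("TABLE_NAME"),
                            columns.getString("COLUMN_NAME"),
                            columns.getString("TYPE_NAME"),
                            columns.getString("IS_NULLABLE"));
                }
            }
        }
    }
}
```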

A metadata repository must allow for flexible metadata representations, and support features such as glossary building. It must support lineage capture and analysis, which allows for tracing data flows and derivations between systems. It must also have good support for querying and outputting the metadata in various forms. Lastly it must be accessible to business analysts – it is primarily a tool to make their jobs more productive.
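
The toy example below suggests the kind of structure such a repository holds: named data elements with glossary definitions and "derived from" links, which together make a lineage query a simple graph traversal. The systems, element names and definitions are invented and not drawn from any particular product.

```java
// Toy metadata repository: elements, glossary definitions and lineage links.
import java.util.ArrayList;
import java.util.List;

public class LineageDemo {

    static class Element {
        final String system;
        final String name;
        final String glossaryDefinition;
        final List<Element> derivedFrom = new ArrayList<>();

        Element(String system, String name, String glossaryDefinition) {
            this.system = system;
            this.name = name;
            this.glossaryDefinition = glossaryDefinition;
        }

        // Walk the "derived from" links to print the full upstream lineage.
        void printLineage(String prefix) {
            System.out.println(prefix + system + "." + name + " (" + glossaryDefinition + ")");
            for (Element source : derivedFrom) {
                source.printLineage(prefix + "    <- ");
            }
        }
    }

    public static void main(String[] args) {
        Element grossPremium = new Element("PolicyAdmin", "GROSS_PREMIUM", "premium before commission");
        Element commission   = new Element("PolicyAdmin", "COMMISSION", "broker commission");
        Element netPremium   = new Element("Ledger", "NET_PREMIUM", "gross premium less commission");
        netPremium.derivedFrom.add(grossPremium);
        netPremium.derivedFrom.add(commission);

        Element regulatoryReturn = new Element("Reporting", "PREMIUM_INCOME", "figure reported to the regulator");
        regulatoryReturn.derivedFrom.add(netPremium);

        regulatoryReturn.printLineage("");
    }
}
```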

Given that an organisation understands the benefits of managing metadata, how should it go about the process of implementing a solution? The first thing to realise is the strategic nature of any solution. The main use of the metadata is to understand how information is managed between systems, rather than in any one individual application. At the very least a strategic vision needs to be in place from the outset. This must encompass the key sources and uses of the metadata, and the expected business benefits.

This vision will inform the next step, which is the selection of a suitable toolset and metadata repository. The wrong choice here can cripple an initiative, as we have seen in the past. It is vital to establish sound selection criteria, including scalability, automated import features and good usability, which will meet the organisation’s needs. It may be tempting to take a product on offer from an incumbent supplier, just because it is cheap or compatible with its software stack, but this is not a sound basis for a successful return on investment.

Following tool evaluation and selection, the next steps are to design a suitable structure for the metadata which supports the needs of the organisation, and to agree a lightweight governance process. Both will need tweaking as experience is gained; however, adjusting the metadata structure can become very expensive as the volume of captured metadata increases, so it is worth leveraging people with experience in this task to get reasonably close on the first attempt.

Actual implementation should follow a more tactical path, picking low-hanging fruit where it is reasonably easy both to harvest a sufficient quantity of metadata and to use it for the creation of immediate business value. If metadata is available in a form that can be readily imported into the selected tool this is a huge advantage. In other cases, it may be possible to build a specialised importer. An example of this might be an importer that can parse SQL statements in stored procedures and perform a static analysis to extract the data lineage.
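
A deliberately naive sketch of such an importer is shown below: it scans a stored-procedure body for INSERT ... SELECT statements and records each source-to-target table pair as a lineage edge. The table names are invented, and a production importer would use a proper SQL parser rather than a regular expression; this is only to illustrate the idea.

```java
// Naive static analysis: derive lineage edges from INSERT ... SELECT statements.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SqlLineageSketch {

    private static final Pattern INSERT_SELECT = Pattern.compile(
            "INSERT\\s+INTO\\s+(\\w+).*?FROM\\s+(\\w+)",
            Pattern.CASE_INSENSITIVE | Pattern.DOTALL);

    public static void main(String[] args) {
        String procedureBody =
                "INSERT INTO monthly_positions (acct, qty) " +
                "SELECT account_id, SUM(quantity) FROM trades GROUP BY account_id; " +
                "INSERT INTO risk_summary (acct, exposure) " +
                "SELECT acct, qty * price FROM monthly_positions;";

        Matcher m = INSERT_SELECT.matcher(procedureBody);
        while (m.find()) {
            // Each match is one edge in the data-lineage graph: source --> target.
            System.out.println(m.group(2) + "  -->  " + m.group(1));
        }
        // trades  -->  monthly_positions
        // monthly_positions  -->  risk_summary
    }
}
```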

Ideally the tool will provide sufficient “out of the box” reporting that it can be used to deliver business value to analysts, clients who need interface definitions, compliance officers and so on. Again, it may well be worth writing specialised reporting tools that deliver metadata in a form more usable by other teams or applications. For example, it is possible to generate partial report definitions or system specifications from metadata, and to include relevant glossary definitions, so that an outsourced development team can leverage these to deliver valuable applications faster.

As soon as sufficient experience has been gained on some typical systems, the metadata repository structure should be reviewed and updated if necessary, based on the strategic vision. Unfortunately, although all metadata shares the same general format – descriptions of entities and the relationships between them – we are not close to a “universal solution”. Each organisation needs to implement a metadata management solution that meets their strategic needs, and delivers an appropriate cost/benefit ratio.

There is no easy answer to how organisations should approach the integration of IT systems. However, it is becoming clear that for many organisations with complex IT systems a proactive investment in metadata management can generate immediate tangible returns, in addition to very substantial benefits over the longer term.

Filed under Corporate governance, IT Governance

Research reveals poor corporate policies on data management

Data wiping survey by Kroll Ontrack finds fewer than half of businesses erase sensitive data

By Paul Quigley

According to a new survey on data wiping practices by Kroll Ontrack, less than half of businesses regularly erase sensitive data from old computers and hard drives.

Kroll’s survey found that of the 49 percent of businesses that systematically deploy a data eraser method, 75 percent still do not delete data securely, leaving most organisations ‘highly susceptible to data breaches’, which plague businesses at least once a year.

According to the 2010 Kroll Ontrack Annual ESI Trends Survey, such breaches are costing organisations an average of £4.2 million per breach.

Furthermore, the 2009 Ponemon Cost of Data Breach Study, which surveyed over 1,500 participants from twelve nations across Europe, North America and Asia about their data wiping practices, revealed that some 40 percent of businesses gave away their used hard drives to another individual, and 22 percent admitted they did not even know what happened to their old computers.

In total, over 60 percent of old business computers reach the second-hand market with proprietary business data fully intact, the survey found.

Kroll’s survey also found that only 19 percent of businesses deploy data eraser software and even fewer (6 percent) use a degausser to erase media. When asked if and how businesses verify their data has been deleted, very few (16 percent) reported relying on a product or service report to confirm all their data had been wiped. Aside from businesses that “do not know” (34 percent) how they ensure their data has been erased from an old device, the next most popular response, reported by 22 percent of businesses, was “reboot the drive” to see if the data is still there.

Filed under IT Governance, Regulatory compliance, Risk management, Uncategorized

BMC automates IT governance, risk and compliance

BMC ITGRC goes enterprise-wide for GRC

By Paul Quigley – BMC Software has just expanded its Business Service Management (BSM) platform with a new set of automated capabilities enabling an IT-centric approach to governance, risk and compliance.

According to BMC, its IT Governance, Risk and Compliance (ITGRC) software offers automation to orchestrate the ITGRC lifecycle from policy creation to assessment reporting across the enterprise.

BMC said its ITGRC solution includes the ability to define and manage policies, manage and automate controls and audits, and automate and enforce compliance across infrastructure, suppliers and end-users, as well as assuring appropriate user access permissions across the entire IT environment, including mainframe, distributed, virtual and cloud.

“By facilitating the mapping of controls to specific IT resources, and by automating the collection and reporting of information on the degree to which those controls are being performed, IT GRCM can strengthen an organization’s position with respect to external audits, and can reduce compliance reporting costs and improve an organization’s capability to address IT risks” write Gartner analysts Mark Nicolett and Paul E. Proctor in their report, Critical Capabilities for IT Governance, Risk and Compliance Management, April 30, 2010.

The company added that ITGRC is fully integrated into its BSM platform, which provides customers with the crucial link between compliance management and compliance execution. This offering constitutes one of the first steps in BMC’s strategy to further develop functionality and products that address the growing customer and industry need to automate these processes.


Filed under Accountability, Corporate governance, IT Governance, Regulatory compliance, Risk management