Online Event: Rules vs. Principles-based Regulation: What can we learn from different professions?

Bristol Centre for Economics and Finance is hosting an online event on 28th May 2020: Rules vs. Principles-based Regulation: What can we learn from different professions?

There is an active debate in many disciplines about the most appropriate approach to regulation and enforcement. The workshop brings together participants from different disciplines to provide an overview of the predominant approaches, along with their respective debates, experiences, and challenges, and to identify common experiences and core issues.

The workshop aims to spark debate about regulation and to ask whether we, across disciplines, could respond differently to the challenges we face and find novel routes to more efficient regulation.

Obtaining insight into other disciplines’ experiences should enable us to rethink the predominant approaches. By learning from each other we can ask: can we do better, both in our own disciplines and in the common regulatory landscape? Might there be a better way?

The event will be of interest to both public and private sector participants: policy-makers, government enforcement agencies, academics, and industry professionals working in, or affected by, regulation across a range of disciplines.

Sign up for this free event here

Workshop programme

13:00–13:05 | Welcome | Professor Felix Ritchie

Cluster 1 presentations: Data regulation in the public and private sector
13:05–13:15 | Data in the public/private sector | Design of incentive systems / evidence base | Professor Felix Ritchie / Elizabeth Green
13:15–13:25 | Data in the public sector | Organisational trust | Andrew Engeli, Office for National Statistics
13:25–13:35 | Data in the private sector (I) | Data protection & privacy | Martin Hickley, Director, Martin Hickley Data Solutions Limited
13:35–13:45 | Data in the private sector (II) | Data analytics & privacy | Luk Arbuckle, Chief Methodologist, Privacy Analytics
13:45–14:15 | Cluster 1 discussion
14:15–14:25 | Break

Cluster 2 presentations: Financial markets and accounting
14:25–14:35 | Rules vs principles in financial markets | Financial regulation & compliance | Paul Keenan, expert witness and Visiting Practitioner Professor in Financial Regulation, Business and Law Faculty, University of the West of England (UWE)
14:35–14:45 | Rules vs principles in accounting (I) | Practical accounting & regulator perspective | Bryan Foss, Digital Non-Executive Director, Risk & Audit Chair, Visiting Professor and Board Readiness Coach
14:45–14:55 | Rules vs principles in accounting (II) | Auditing & corporate governance | Ismail Adelopo / Florian Meier
14:55–15:25 | Cluster 2 discussion
15:25–15:35 | Break

Cluster 3 presentations: Legal perspective and non-financial regulation
15:35–15:45 | Legal perspective | Financial crime | Nicholas Ryder, Professor in Financial Crime
15:45–15:55 | Non-financial regulation | Modern slavery and other required reporting | Jaya Chakrabarti, CEO, Semantrica Ltd (tiscreport)
15:55–16:25 | Cluster 3 discussion
16:25–16:55 | Summary and closing remarks | Nicholas Ryder, Professor in Financial Crime

Checking research outputs for confidentiality risks

By Professor Felix Ritchie and Anthea Springbett

UWE Bristol has recently been commissioned by the Office for National Statistics (ONS) to develop a course in ‘output checking’ for research data centres. This is where researchers working on confidential data have their statistical outputs checked before publication, to ensure that they don’t break the law by inadvertently releasing information about individuals; for example, without proper checks a table of earnings in a small village could reveal the income of the highest earner. This checking process is called ‘statistical disclosure control’, or SDC.
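The small-village example can be made concrete with a minimal sketch. Two simple checks are common in this area: a threshold rule (too few contributors to a cell) and a dominance rule (one contributor accounts for most of the total). The function name and the cut-off values below are purely illustrative, not the rules ONS applies.

```python
# Hypothetical illustration of why small-area tables need checking:
# with few contributors, a published total can effectively reveal
# the value belonging to the highest earner.

def cell_is_disclosive(earnings, min_count=10, dominance_share=0.5):
    """Flag a table cell as risky under two illustrative SDC rules:
    a threshold rule (too few contributors) and a dominance rule
    (one contributor accounts for most of the cell total)."""
    total = sum(earnings)
    too_few = len(earnings) < min_count
    dominant = total > 0 and max(earnings) / total > dominance_share
    return too_few or dominant

# A 'small village': three earners, one of them very highly paid.
village = [18_000, 22_000, 250_000]
print(cell_is_disclosive(village))  # flagged: few contributors, one dominates
```

In practice real disclosure rules are more nuanced, but this is the shape of the yes/no checks that the post goes on to discuss.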

Output checking is a well-established field, and there are experienced trainers and automatic tools to help those producing statistics. Why then does ONS need a new course? The reason is that new forms of data, new ways of working, and new types of users have all created a need for a different kind of output checking.

SDC training is largely focused on the tables produced by national statistical institutes (NSIs) such as ONS. NSI outputs have particular demands: similar tables are produced year after year, multiple tables are produced from the same data so consistency across tables is important, and NSIs publish a lot of information about their tables, including sampling methods.

Research outputs are quite different. Researchers aim to find new and interesting ways to extract meaning from data. Researchers choose data based on the hypotheses they want to explore and sub-samples of the population they are interested in, including or excluding data according to their own criteria. Finally, and most importantly, researchers don’t tend to produce detailed tables of the type NSIs generate; they are interested in multivariate analysis, non-linear models, heat maps, survival functions… For researchers, tables are often just used to describe the data before they get on to the interesting stuff. As a result, the forty-odd years of SDC designed for NSIs is of limited practical use in this environment.

For fifteen years, we have been developing an approach designed specifically for the research environment; we call it ‘output SDC’ (OSDC) to emphasise that this is a general approach to outputs, not just tables and not just for NSIs. There are two strands to this approach, one statistical and one operational.

The statistical strand comes from the ‘evidence-based, default-open, risk-managed, user-centred’ approach that we apply across our work in confidential data management. The ways that researchers use data, and the confidentiality risks they generate, require the output checker to be familiar with a wide range of statistics. We address this by classifying outputs into types with a higher or lower inherent risk, which allows the checker to spend more time on the riskier outputs. For these riskier outputs, context is everything. The traditional approach has been to apply simple yes/no rules (are there enough observations? are there any outliers?), but these can be very blunt instruments. Our approach emphasises the use of evidence in decision-making, which places more of a burden on the output checker but increases the range of allowable outputs.
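The triage idea described above can be sketched as follows. The type categories and thresholds here are invented for illustration, not the actual OSDC classification: low-risk output types pass with a light touch, high-risk types get the blunt yes/no rule, and anything unfamiliar goes to a human checker.

```python
# Sketch of type-based triage for output checking (illustrative
# categories and thresholds, not the real OSDC classification).

LOW_RISK_TYPES = {"regression coefficients", "test statistics"}   # assumption
HIGH_RISK_TYPES = {"frequency table", "maximum", "minimum"}       # assumption

def triage(output_type, n_observations, min_count=10):
    """Route an output: release it, or refer it for detailed checking."""
    if output_type in LOW_RISK_TYPES:
        return "release"                      # inherently low disclosure risk
    if output_type in HIGH_RISK_TYPES:
        # Blunt yes/no rule: enough observations behind the output?
        return "release" if n_observations >= min_count else "refer to checker"
    return "refer to checker"                 # unknown type: human judgement

print(triage("regression coefficients", 5))   # release
print(triage("frequency table", 3))           # refer to checker
```

The point of the evidence-based approach is precisely that the final branch, human judgement informed by context, handles the cases where rules like `min_count` would be too blunt.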

The operational strand reflects the fact that researchers are initially, on the whole, resistant to what they see as restrictions on output. A key part of the training will be helping output checkers build relationships with researchers; for example, emphasising that this is not about restricting output, but about keeping the researcher out of jail…

This is, we believe, the first formal course targeted specifically at (a) research outputs, and (b) those checking the outputs of researchers, rather than the producers of statistics themselves. ONS is sponsoring the development of this training for all interested UK organisations. Many overseas organisations also run facilities that vet researcher outputs. We hope, therefore, that this will be of interest to a wide range of organisations, and may prompt a sea change in the adoption of more general OSDC principles.

Improving the pay of UK apprentices

By Professor Felix Ritchie and Dr Hilary Drew

Apprentices are amongst the lowest-paid workers in the UK: their statutory minimum wage is lower than for any other worker aged 16 or over. Despite this, our research with Michail Veliziotis at Southampton University showed that up to a quarter of all apprentices still appear to be paid below their legal minimum hourly rate. By comparison, the rate of non-compliance across the workforce as a whole is less than 5%.

We argued that this was partly due to the minimum wage for apprentices being more complicated than for other workers. Moreover, in this segment of the economy there are weak mechanisms for checking that the correct wage is being paid. Apprentices are often unaware of, or incurious about, pay rates, while employers showed a confidence in their ability to work out pay rates that was not strongly supported by the statistics. Apprentices placed a high degree of trust in employers to pay the right wage, which meant that mistakes were unlikely to be uncovered. Finally, all this was set in a low-pay culture, where it was accepted that “rubbish pay” (to quote one apprentice) at the early stage of a career was a rite of passage.

We advised that more targeted information could help to resolve this problem; in particular, we proposed an ‘app app’ (a wage calculator designed for young people fresh out of school or college), and working with trainers at FE colleges who were best placed to help apprentices check their pay.
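The core of such a wage calculator is a small piece of eligibility logic. The sketch below follows the general UK structure (the apprentice rate applies to those under 19 or in the first year of their apprenticeship; otherwise the age-based minimum applies), but the rates and age bands are illustrative placeholders, not current statutory figures.

```python
# Minimal sketch of an apprentice wage checker (the 'app app' idea).
# Rates are placeholder figures, not current statutory rates.

RATES = {"apprentice": 3.90, "18_20": 6.15,
         "21_24": 7.70, "25_plus": 8.21}      # illustrative only

def minimum_hourly_rate(age, first_year):
    """Return the applicable minimum wage for an apprentice."""
    if first_year or age < 19:
        return RATES["apprentice"]            # apprentice rate applies
    if age <= 20:
        return RATES["18_20"]
    if age <= 24:
        return RATES["21_24"]
    return RATES["25_plus"]

def is_underpaid(hourly_pay, age, first_year):
    return hourly_pay < minimum_hourly_rate(age, first_year)

print(is_underpaid(3.50, 17, True))    # True: below the apprentice rate
print(is_underpaid(6.50, 20, False))   # False: at or above the 18-20 band
```

Even this toy version shows why apprentices and employers get confused: the right answer depends on two interacting facts (age and year of apprenticeship), not just one.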

Our recommendations have now been taken forward in the South West. Earlier this month we presented our findings at the “Great Apprenticeships – Treated Well – Treated Right” event organised by the South West TUC at the City of Bristol College. Union representatives, training providers, local government and apprentices attended the event.

One of the aims was to showcase the South West TUC’s new wage calculator, developed specifically for apprentices. The meeting also presented an opportunity for interested parties to examine a regional approach.

The event indicated a clear common interest in taking action to improve pay awareness amongst apprentices and to better promote the TUC’s wage calculator. We are excited to remain involved with the work taking this forward in 2019, and plan to support the City of Bristol College in targeting their apprentices as part of the South West TUC’s campaign. We hope that, as a result, we can demonstrate how a simple intervention, allied to a targeted information programme, can make a material difference to some of the lowest-paid employees.