With frauds, trust was broken between regulators and drug firms

This week, Speak Pharma interviews Dr. Bernard Plau, CEO of Akovia Consulting, a pharmaceutical quality consulting company. Bernard is a quality and process expert with over 35 years of experience working in drug companies such as Rhone-Poulenc Rorer, Aventis and Sanofi-Aventis. In the first of a two-part interview, Bernard speaks on the evolution of the regulatory framework governing the global pharmaceutical industry. Excerpts:

Your work experience has ranged from being the process development manager at Rhone-Poulenc to being one of the quality directors at Sanofi. How has the world of pharmaceuticals changed over the last few decades in Europe?

 

It has changed drastically. From the 1980s onward, governments reacted to the rising cost of public health. In the US, the landscape changed with the Hatch-Waxman Act (1984). This legislation removed the need for an applicant to demonstrate the safety and efficacy of a new generic drug; instead, the applicant could submit an ANDA (Abbreviated New Drug Application). A bioequivalence study was sufficient to market a generic drug, provided there was no patent infringement.

Northern and Eastern European countries adopted similar policies. The southern part of Europe – including France – followed suit, albeit at a slower pace. As a result, the market share of generics increased rapidly. Today, the average price of a generic drug is about 10 percent of that of the brand-name product.

Prior to 1990, part of the pharmaceutical industry was integrated within big chemical companies that produced commodities such as phenol, nitric acid, acetic acid and hydrofluoric acid. These companies diversified by producing intermediates, and then the final API (active pharmaceutical ingredient) and DP (drug product, or finished formulation).

The knowledge was concentrated in big centres of expertise. Over time, some of the smaller companies disappeared due to rising (development) costs and the pressure from generics. Between 1990 and today, the number of pharmaceutical companies in France has decreased by about 25 percent.

Factors like pressure from generic firms and the need to rationalise businesses compelled giant companies to split up in order to focus on their ‘core domain of activities’ (a buzz-phrase that has become rather trendy these days). Chemical commodities were therefore separated from fine chemical production, which in turn was split into agrochemicals and healthcare production.

In each of these new business units, targets were set for cost reduction. This resulted in significant process improvements and, at first, cheaper sourcing of commodities, followed by cheaper sourcing of intermediates and, later, even of APIs.
 

How were the cost savings realized in the newly created business centres? What did this mean for the industry in Europe?

Getting products manufactured in Asia, where labour costs were much lower than in Europe and where skilled scientists could be found, became the inevitable means of lowering costs. That is how much of the manufacturing moved to China, India, Indonesia, Taiwan and Thailand. This also led to a transfer of knowledge to Asia.

With time, Europe lost its know-how. Today, only a few industrial chemists in Europe have worked on large production units. The knowledge has flowed away with the products. It is not that Europe lacks skilled scientists – there are excellent universities in Europe – but the know-how is no longer there. The challenge is: how do you put textbook knowledge into practice?

Today, a young scientist needs to be employed by different companies in different positions before being able to select the most viable pathway to a complex heterocycle – even more so now that those big temples of expertise, the development research centres of big companies, have disappeared.

 

How did regulations emerge and what role did the US Food and Drug Administration (FDA) play in the beginning?

In the 1990s, the pharma industry was still on a learning curve. Science came first and regulations followed. Regulations were the necessary administrative paperwork required to put a product on the market. For example, in those days the raw material (RM) specifications could simply be a copy of the catalogue specifications for laboratory reagents.

All the main actors, including the regulators, were learning. The players were trying to find out what to expect from the FDA, the leading agency at the time. The FDA was the first agency to formulate rules for quality. More importantly, it regulated a big unified market.
   

How did a global approach towards (pharmaceutical) quality develop?

The way drug prices were set in the US made that market particularly attractive. Therefore, all the European pharma players tried to grab a bigger share of the US market. This was possible only if the US market became accessible to them.

An easy way to achieve this was by merging with US-based companies. The newly formed consolidated company would benefit from the US commercial network, and the American quality and regulatory teams would mentor their European counterparts. This is how the American GMP (good manufacturing practices) approach spread through the European pharmaceutical companies.

Globalisation in the pharmaceutical industry was accelerated by the need for common regulatory rules based on solid science. Country-specific rules were (initially) harmonised through common guidelines agreed between the three major pharmaceutical zones – Europe, the US and Japan.

The International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), spurred by the World Health Organisation (WHO) and by the European Community, was born in Brussels in April 1990. The ICH guidelines responded to an urgent need to rationalise regulations in the face of rising costs and in order to meet public expectations. Such an agreement could only be obtained from disparate stakeholders because the guidelines were (i) based on good science; and (ii) set at the point of equilibrium between regulators and industry.

This is why the guidelines formulated 20 years ago are still valid. Over the last 30 years, one or two major rules have been published every year. The regulatory armada is now mostly complete!

 

Did regulations create standardization in the industry? Or did they create complications?

After the 1990s, one major regulatory rule was published every year. The pharmaceutical industry had to become compliant with a complex set of rules. Fortunately, that set of rules became less country-dependent.

While the rules are relatively clear for finished products, a grey zone still exists for intermediates. Development teams had to take decisions on various ‘flexible’ rules: for instance, to what extent an analytical method should be validated for a raw material, and how in-process controls (IPCs) should be managed in terms of validation and out-of-specification (OOS) results. As always, there were several views on how to apply these flexible rules.

  

What role did information technology play in pharmaceutical quality systems?

Information technology brought about a silent revolution. Initially, computer systems were ignored by both the regulators and the industry: the regulators had other areas of concern, while the industry had more urgent work to carry out.

Prior to 2000, a signed or dated document was regarded as undisputed raw data. With the advent of information technology, fakes could be easily generated. Today, paper-based documents can only be regarded as raw data if the system generating them has been validated.

Today, security is required in all spheres, and products are not sold without metadata. For example, the pictures from our digital cameras carry embedded data such as the date, time, aperture, shutter speed and ISO setting. A similar shift is visible in medicines – a drug is sold with full traceability, an expiration date, leaflets about possible side-effects, recommendations and so on.
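Purely as an illustration of the kind of embedded metadata described above, here is a minimal Python sketch (using the Pillow imaging library; the file name is hypothetical) that lists the EXIF tags stored inside a digital photograph:

# Minimal sketch: list the EXIF metadata embedded in a photograph.
# Assumes the Pillow library is installed; "sample_photo.jpg" is a hypothetical file.
from PIL import Image, ExifTags

img = Image.open("sample_photo.jpg")
exif = img.getexif()  # mapping of numeric EXIF tag IDs to stored values

for tag_id, value in exif.items():
    tag_name = ExifTags.TAGS.get(tag_id, tag_id)  # translate the tag ID to a readable name
    print(f"{tag_name}: {value}")  # e.g. DateTime, Make, Model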

In the case of medicines, this metadata reaches the patient. During drug manufacturing, the DP plant receives all the information about the quality of the various raw materials and their adherence to the regulatory files. Such a huge data flow is only possible through big, integrated computerized systems.

 

Over the years, how has the relationship between the industry and regulatory agencies evolved?

The heparin adulteration case in China (2008) placed regulatory agencies on the hot seat of public opinion. Similarly, the Mediator scandal in France (2009) resulted in a complete change of dialogue between the French health agency (ANSM) and the pharmaceutical industry.

Confidence was broken between the agencies and the drug manufacturers. After all, in all these scandals, the regulatory agencies appeared not to have had adequate control to protect the public. And the scandals harmed the pharmaceutical companies most of all.

The agencies could no longer trust the industry. The inspection mode changed from ‘innocent until proven guilty’ to ‘guilty until proven innocent’. In dossier reviews, the expertise on some products used to lie with experts within the pharmaceutical industry; this practice was stopped, as possible collusion could arise.

In the coming weeks, Bernard will share his views on more recent aspects of the industry’s evolution. Interested in talking further with Bernard Plau? Email him at bernard.plau@akovia.com

 

 

