Big Data Meets Measurement in Manufacturing



Big Data headlines not only tech news but also popular news—as in, what’s the government doing with all the information it’s storing about us? Yet Big Data is just a twig compared with the full-grown oak of Big Analog Data. National Instruments Fellow Tom Bradicich mentioned twice, in separate interviews during NIWeek last month, that all of the analog data acquired from manufacturing and products—a.k.a. the Internet of Things (IoT)—dwarfs what is currently known as Big Data.

Keep Your Document Control System in Control

Document control is always an interesting topic for discussion.  It seems like a simple topic and area for compliance, but I often run into companies with document control systems that are overly complicated and difficult to manage.  Many companies separate document control for quality system procedures/processes from other types of change control.  In reality, document control is one element of the overall change control and records management requirements.  There are many procedures, formats, tools, and/or styles for managing quality documents and records.   Let’s take a look at how these key activities are related and why we should focus on implementation of strong change control processes within the quality systems — rather than just a document control process.

As you look at the various regulations and/or standards, such as 21 CFR (210, 211, 820, 600, etc.), ISO (9001, 13485, etc.), EU, and JPAL, they all require document and change control/management.  Additionally, 21 CFR Part 11 describes the FDA requirements for the use and management of electronic records.

We are all familiar with the standard document control pyramid. The diagram below reminds us of the relationship between documents and records within the Quality Management System.



You can see from this diagram that there are many types of Quality System documents you could generate.  I strongly recommend that you “right size” your procedures and records to assure compliance and simplicity for the organization. 

  • Determine the relationship between the various documents and which ones meet the needs of your specific business. 
  • The regulations/standards identify the minimal procedures that are actually required to meet the defined requirements. 
The following list of required procedures was extracted from the ISO 13485 standard:
  • 4.2.3 Control of documents
  • 4.2.4 Control of records
  • 6.4 Work environment
  • 7.3.1 Design and development planning
  • 7.4.1 Purchasing process
  • 7.5.1.2.3 Servicing activities
  • 7.5.2.1 Validation of processes for production and service provision — General requirements
  • 7.5.2.2 Particular requirements for sterile medical devices
  • 7.5.3.1 Identification and traceability — Identification
  • 7.5.3.2.1 Identification and traceability — Traceability
  • 7.5.5 Preservation of product
  • 7.6 Control of monitoring and measuring devices
  • 8.2.1 Monitoring and measurement — Feedback
  • 8.2.2 Internal audit
  • 8.3 Control of nonconforming product
  • 8.4 Analysis of data
  • 8.5.1 Improvement — General
  • 8.5.2 Corrective action
  • 8.5.3 Preventive action

  • You do not need a work instruction/procedure for every activity in your operation.  Create these additional documents only if they help employees perform the operations/tasks or assure a higher level of quality.
  • Keep it simple.

There is specific information that should be addressed as you implement the document/record/change control system:

  • Document identification/numbering system – develop a simple but intelligent numbering system (a simple sketch follows this list).  I have seen everything from basic sequential numbering (1, 2, 3, etc.) to extremely long alphanumeric identifiers with built-in intelligence.  I recommend something meaningful and simple enough that employees can understand it and find documents easily.

  • Revision control – identify how the employee knows they are using the correct version of the SOP.  There are many approaches, including alpha characters, numbers, and date codes.
  • Purpose – why are you generating the procedure? If you can’t explain why you are writing the document in a couple of sentences, maybe you should think about whether or not it is really needed. 
  • Scope – what/who is impacted by this procedure.  There are times a procedure is not applicable to all functions/sites.  This is a great time to identify what is or isn’t included in the scope of the process.
  • Definitions – this is really a tricky area.  I prefer a stand-alone glossary of terms, rather than building them into the procedure.  Putting all definitions/terms into a glossary document assures consistency in how terms are used.
  • Roles and responsibilities – You may choose to use the RACI (Responsible, Accountable, Consulted, Informed) model to identify what roles different functions have in the process. Another option is to call out cross-functional and/or interrelationship responsibilities.
  • Records – it is always good practice to identify the records generated as a result of the procedure. 
  • Procedure – this is the actual process being documented.  There are many forms/formats for writing procedures.  A process flow diagram may work and be more effective than a 14-page document.  Select the format that best fits the process and makes compliance easy to demonstrate.
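As a minimal sketch of the identification and revision-control ideas above (the document types, prefixes, and ID format shown are hypothetical examples, not a recommendation of any particular scheme), in Python:

# Minimal sketch: a simple, meaningful document ID and revision scheme.
# Document types, prefixes, and the ID format are hypothetical examples.

DOC_TYPES = {"SOP": "standard operating procedure",
             "WI": "work instruction",
             "FRM": "form"}

def document_id(doc_type: str, area: str, number: int, revision: str = "A") -> str:
    """Build an ID such as 'SOP-QA-0012 Rev A' that employees can read at a glance."""
    if doc_type not in DOC_TYPES:
        raise ValueError(f"unknown document type: {doc_type}")
    return f"{doc_type}-{area.upper()}-{number:04d} Rev {revision}"

def next_revision(revision: str) -> str:
    """Advance an alpha revision code (A -> B -> C ...)."""
    return chr(ord(revision) + 1)

print(document_id("SOP", "qa", 12))   # SOP-QA-0012 Rev A
print(next_revision("A"))             # B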

Procedures and records can usually be distributed into three major buckets:

  • Product – those documents that support the design and development of products.  These records make up the Design History File (DHF) and serve as the master record for how the product was designed.
  • Process – those documents that support the manufacturing, procurement, and/or operations of the organization.  These records make up the Device Master Record (DMR) and serve as the master record/recipe for how a particular product is manufactured.
  • Quality System – these documents define the activities and processes necessary to support the quality system requirements. These procedures do not normally impact product design or manufacturing operations.

Implementing an effective document/record control system requires a simple change control process. The FDA is taking a much closer look at planned changes in relation to product and process design.  Unplanned changes are normally documented as deviations and should be included in the evaluation of nonconformances, as a feeder to the CAPA systems.    

There is no requirement to have a separate change control system for each type of document.  One simple process can be established to support all types of changes.  Evaluation of changes should use a risk-based approach built around major/minor types of change.  The risk-based definitions must be included in your procedures and should be based on your specific business needs.  Administrative changes (typos, wording clarifications, page numbers, etc.) must be addressed but do not require the same level of scrutiny or evaluation.  A good rule of thumb to use when evaluating a change is its impact on the form, fit, or function of the product/activity.

I have seen change processes implemented in many different ways.  I recommend one basic process to address all types of change, rather than individual change processes.  You can develop one process flow and form to support all areas of the quality system.  The following table gives an example of how you might establish a risk-based approach to change control with one process:

Change Control            Product   Process   Quality System   Administrative
Requirements                 X         X            NA               NA
Risk Assessment              X         X            NA               NA
Product Verification         X         X            NA               NA
Product Validation           X         X            NA               NA
Training                     X         X            X                NA
Process Validation           X         X            X                NA
Process Risk Assessment      X         X            X                NA

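Whatever tool you use, the matrix above can be captured directly as data. Here is a minimal sketch in Python (the change types and activities simply mirror the example table; adapt them to your own quality system):

# Minimal sketch: encode the example change-control matrix as a lookup table.
CHANGE_MATRIX = {
    "Requirements":            {"Product", "Process"},
    "Risk Assessment":         {"Product", "Process"},
    "Product Verification":    {"Product", "Process"},
    "Product Validation":      {"Product", "Process"},
    "Training":                {"Product", "Process", "Quality System"},
    "Process Validation":      {"Product", "Process", "Quality System"},
    "Process Risk Assessment": {"Product", "Process", "Quality System"},
}

def required_activities(change_type: str) -> list[str]:
    """Return the activities triggered by a change of the given type
    (Product, Process, Quality System, or Administrative)."""
    return [activity for activity, types in CHANGE_MATRIX.items()
            if change_type in types]

print(required_activities("Quality System"))
# ['Training', 'Process Validation', 'Process Risk Assessment']
print(required_activities("Administrative"))
# [] : administrative changes trigger no formal activities in this example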

There are several very effective automated tools on the market that could be considered to support the document/record/change control processes.  The key is to have thought through how you want the processes to work, map out a plan for implementation, validate that the tool meets your business needs and requirements, and educate the organization on how the system works and why key decisions were made.

I strongly recommend defining user needs and requirements for the document/record/change control electronic tool prior to purchasing.  Having a defined procedure and process will facilitate the selection activity.  Identify those functions/features the system MUST have versus those that would be NICE to have.  Build a table of these requirements and ask the various suppliers to address your needs.  Each system/tool will have key features and benefits.  Knowing what your business needs and wants before making the selection will make the implementation much more effective.

In summary,
  • Document/record control is one of the main cornerstones for a successful quality management system.
  • It is critical that you establish the document/record/change process and controls as simply and clearly as possible.  Make it easy for the employees to understand and use.
  • Change control should be simple and risk based.
  • Automation is great – as long as you are clear about what you are automating and how it needs to work for your business.

Validation Strategies For Nonsterile Solid Dosage Forms

Process qualification applies to pharmaceutical drug products at a stage prior to commercialization, or prior to submitting a New Drug Application (NDA) or Abbreviated New Drug Application (ANDA).  In this stage, it must be demonstrated that the process for manufacture of a drug product is consistent and can produce drug products that are compliant with the Food and Drug Administration’s requirements for filing.

This stage includes two elements.  The first element focuses on facility design and equipment installation and maintenance, while the second includes process performance qualification (PPQ).  Prerequisites to facility design and equipment installation and maintenance include validation and qualification of analytical methods, approved standard operating procedures (SOPs) for process validation, implementation of a preventive maintenance (PM) program, cleaning validation of equipment, and process-specific GMP training.  In structuring a PPQ, critical process parameters (CPPs) and critical quality attributes (CQAs) must be defined, justified, and documented.  Process performance qualification activities must be controlled by an approved protocol that includes the scope, strategy, testing, sampling plan, and acceptance criteria.  This study is conducted at a manufacturing site, according to a site validation master plan.  Additional strategies are explored below.

Process Qualification Study Strategies:

A strategy needs to be developed for every qualification study, and it must be based on deep process understanding gained from manufacturing experience. Some elements to consider include:

1. Number of Batches: The protocol should include three consecutive batches, with the results summarized in the final process qualification report.

2. Material Selection:  More than one lot of API and critical raw materials should be used during process qualification (especially if the API or critical raw materials are not dissolved or distributed in solution). Based on the selection of these materials, the qualification should be designed to demonstrate the robustness of the process.

3. Equipment Selection: The process must be qualified on all equipment intended to be used in the manufacture of the product. For equipment determined as being equivalent, qualification on one piece of equipment is sufficient.

4. Design Space/Parameters Ranges:  For products that have an established design space, the process qualification should be executed at specified conditions within the design space. For conditions that are high risk, high and low parameter ranges should be considered for the process qualification.

5. Process Re-qualification:  Process re-qualification may be required if, for example, a significant deviation from desired process performance is uncovered through stage 3 continued process verification.  The process re-qualification studies should bridge back to the original or pivotal clinical biobatch.

Process Qualification Testing:

Process qualification testing should be based on the established CPPs and CQAs and the control strategy and risk assessments, which characterize the product quality and process consistency.

CPP Monitoring:  Defined critical process parameters should be monitored and reported in the final qualification report.

  • For processes that have a processing fluid (e.g., granulation), microbial testing and hold times will need to be established. For processes where critical ingredients are added as a part of the processing fluid, testing related to the critical ingredient (e.g., assay) at make-up and at the end of hold time should be considered.  
  • For processes involving lubricated granulation, blend uniformity testing must verify that the active ingredient has been distributed uniformly throughout the blended bulk. To determine unit-dose-equivalent (1-3x) blend uniformity of the active ingredient in the final blend, samples should be taken from the blender. For combination products, all APIs need to be tested for blend uniformity, and for bilayer tablets the blend in each respective blender needs to be tested.
  • For the manufacture of compressed tablets or capsules, a sampling plan should be adopted during the compression operation with around ten evenly spaced intervals throughout the batch processing (a simple sketch of such a plan follows this list). After the compression machine is set up, location one should include the first salable dosage units and location ten should contain the last salable dosage units, while locations two through nine should be evenly spaced across the lot.  For a process using a double-sided machine, both sides of the machine should be sampled at each location. For the manufacture of combination or bilayer tablets, ample samples should be obtained to properly evaluate all API-specific tests.  Specific to bilayer tablets, samples should be taken to evaluate both layers.
  • For the manufacture of film-coated tablets, representative samples should be taken from random locations. For application of functional membranes, testing should be performed on samples taken from each film coating pan load. For nonfunctional coats, samples may be divided among the film coating pan loads.
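The evenly spaced compression sampling plan described above can be laid out with simple arithmetic. Here is a minimal sketch in Python (the batch size, number of locations, and units per pull are hypothetical):

# Minimal sketch: ten evenly spaced sampling locations across a compression run.
# Batch size and units per pull are hypothetical.

def sampling_plan(total_units: int, locations: int = 10, per_location: int = 20) -> list[int]:
    """Return the starting unit index of each pull: location 1 covers the first
    salable units, the last location covers the last salable units, and the
    intermediate locations are evenly spaced across the lot."""
    step = (total_units - per_location) / (locations - 1)
    return [int(round(i * step)) + 1 for i in range(locations)]

print(sampling_plan(total_units=1_000_000))
# first pull starts at unit 1; the last starts at unit 999981, covering the final 20 units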

Acceptance Criteria:

The manufacturing process must be validated and reported within the regulatory filing, batch records, and final validation report, including all the appropriate specifications and procedures (e.g., selected CPPs and release requirements). Any deviation must be investigated and addressed in the validation report.

Each of the processing steps can be analyzed by evaluating the process control charts (based on three sigma limits) and the historical process capability charts.  These charts should be evaluated for nonrandom systematic behavior.  Ultimately, for all initial process qualification, a comparison must be made to the dissolution profile performed on the biobatch or pivotal reference batch.  A batch may be excluded if a nonprocess related assignable cause, like mechanical failure, has been identified.

Statistical Analysis:

Intra-batch and inter-batch variability should be examined from data collected during process qualification through the analysis of process control charts and process capability charts.
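As an illustration of the kind of analysis described above (not taken from any cited guidance), a minimal Python sketch that computes three-sigma control limits and an overall capability index from hypothetical assay results might look like this:

# Minimal sketch: three-sigma control limits and an overall capability index
# for a CQA measured across PPQ batches. Values and limits are hypothetical.
import statistics

assay_results = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2, 99.7, 100.6, 99.9]  # % label claim
lsl, usl = 95.0, 105.0   # hypothetical specification limits

mean = statistics.mean(assay_results)
sd = statistics.stdev(assay_results)

ucl = mean + 3 * sd      # upper control limit (three-sigma)
lcl = mean - 3 * sd      # lower control limit (three-sigma)
ppk = min(usl - mean, mean - lsl) / (3 * sd)   # overall capability index

print(f"mean={mean:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  Ppk={ppk:.2f}")

# Points outside the limits would need investigation; detecting nonrandom
# systematic behavior also requires run rules (trends, shifts), not shown here.
out_of_control = [x for x in assay_results if not (lcl <= x <= ucl)]
print("out-of-control points:", out_of_control)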

Once a product has completed performance qualification, process validation continues through implementation of continued process verification.  This verification includes monitoring operating procedures, preventive maintenance and calibration programs, deviation investigations, annual review, and change control procedures. Any changes to the process must be evaluated through the process change request system and procedures for process change control to determine the impact on ongoing process validation.  A list of intermediate tests for nonsterile solid dosage forms (tablets and capsules) is compiled in Table 1.

Stage 2 process qualification is conducted at the manufacturing site, which is usually far away from the product/process development facilities. Therefore, another important piece of process validation is technology transfer to the manufacturing facilities. Technology transfer includes detailed process fit, manufacturing readiness, and an execution phase. Process qualification falls under the execution phase.

Continued Process Verification (Stage 3)

The goal of continued process verification (CPV) is “continual assurance that the process remains in a state of control (validated state) during commercial manufacture.”[5] Once a process has gone through process qualification, an ongoing program to collect and analyze product and process data that relate to product quality is necessary. The objective of the on-going process verification program is to understand the sources of variation, its impact on the process and product attributes, and finally to devise a way to control the variation. The knowledge gained through stage 3 of continued process verification provides ongoing assurance that a product remains in a state of control.

In summary, QbD is not a mandated requirement; however, any pharmaceutical company that builds the QbD approach (process design, process qualification, and continued process verification) into the DNA of its product development will come out ahead on its value curve, since we live in an increasingly science-driven regulatory environment in the 21st century, where compliance and quality are essential elements of competitiveness and of quality drug products. Hopefully these tips and strategies will support you in designing and validating a quality manufacturing process.




A Step-by-Step Guide to Implement Track and Trace

With every recurrence of confirmed counterfeit drug product in the pharmaceutical supply chain, the pressure increases to establish a national standard for pharmaceutical track-and-trace solutions. Although a universal standard has yet to emerge, there is one set of requirements that is currently driving action: the California Board of Pharmacy’s ePedigree.

Moreover, with considerable attention from legislators, regulators, and standards organizations to track and trace a pharmaceutical product throughout its lifecycle, drug manufacturers and packagers are under increased pressure to implement serialization to provide supply-chain integrity to the public.

However, with the deadlines for the California Board of Pharmacy’s ePedigree law quickly approaching, it will be vital for the industry to accelerate its efforts to start putting compliance measures in place in 2013.

With the accelerated pace of regulatory involvement worldwide in serialization and track and trace, a strong working foundation of imminent requirements is necessary.

Several of the U.S. states are in the process of creating legislation for ePedigree; however, California is, by far, drawing the most attention for two reasons:

  1. Although it has pushed the initial enforcement date out, there seems to be a common understanding that the current initial enforcement date of January 1, 2015, will stand.
  2. Of those states actively pursuing similar legislation, California’s requirements are the most demanding. For example, Florida’s law requires the tracking of product to the lot level, whereas California’s goes to the smallest saleable unit level.


California’s legislation is written such that it will effectively be superseded by federal legislation, but it is not known when the federal government may introduce similar or overriding legislation. The bottom line? The deadline is fast approaching, and action needs to be taken now in order to comply with the impending law. As you contemplate your track-and-trace efforts or are putting systems in place, you should first keep the following considerations in mind.

Compliance Matters:

Any and all systems added or maintained to resolve the ePedigree initiative must be compliant with the regulations. Although the need to attack the problem of counterfeit medicines in the supply chain is obvious, many of the system design considerations may be derived from an analysis of the regulations the system(s) will address. The following are regulations to keep in mind to help guide your process:

  1. ISO Requirements: Depending on the extent to which a company implements serialization for a packaging operation, there will be significant regulatory impact to a serialization project. At a minimum, there are ISO requirements for a formalized computer-system life-cycle management process. In any case, the use of best practices in today’s world for well-tested and documented business-critical systems is a safe assumption.
  2. CFR Part 11: Other regulations may also have an impact. In the case of pharmaceuticals, 21 CFR Part 11, Electronic Records; Electronic Signatures, imposes many requirements that must be met. These include specific actions to ensure authenticity, integrity, and confidentiality of e-records where appropriate.
  3. GAMP 5: The pharmaceutical industry adheres to industry standards published by the International Society for Pharmaceutical Engineering (ISPE) in its most recent guide, GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. Immediately upon the formalization of a systematic serialization approach within an organization, the resulting project becomes subject to standard Good Manufacturing Practices (GMP) Quality Management System requirements. GAMP 5 will drive the project planning and implementation of a structured track-and-trace program in the typical pharmaceutical company.


Within GAMP 5, topics include how:

  • the system will be designed, built, and tested
  • the system will be documented
  • the system will be handed over to the users
  • incidents will be documented, and corrective and preventive measures will be captured and coordinated
  • system changes will be managed
  • system audits will be conducted
  • electronic records will be maintained, retrieved, and archived
  • ... and more

Ask The Right Questions To Form Your Strategy:

Apply a structured analysis of the integrated automation system to assure adequate understanding of the processes and the scope of the effort (i.e., determine what business processes, system interfaces, and human control activities need to be assessed) prior to the formal project scope statement.

Questions to be addressed for this phase should include the following:

  • Is this a global project? If so, what communication and collaboration tools will be used to support the project?
  • What is planned for a proof-of-concept?
  • Which lines will be upgraded?
  • Are multiple facilities involved?
  • Is there a serialization system already in place? If so, has a gap analysis been performed to identify the differences between the “as-is” and the “to-be” processes?
  • Have all important stakeholders been included?
  • How will we be able to ensure a good level of communication with stakeholders?
  • Is there an opportunity to leverage standardization in order to reduce or eliminate redundancy and duplication?
  • Is there potential for leveraging corporate standards or guidelines and supplier standardization?

The project plan should consider task ownership so that there are clear expectations of who will deliver what and when. Although this consideration is universally applicable to all elements of the project, the bottom line is that the delivery of the system, including all software, hardware, documentation, and on-going support and maintenance, must be well understood and agreed upon in writing.


Test, And Test Again:

As with any complex project, there will be a lot of trial and error, potentially resulting in rejects and rework. This is primarily because we are now aggregating serial numbers into containers holding smaller serialized units. When the serial-number “chain” is disrupted for whatever reason, the ability of both the human business processes and the automated systems to manage the issue is key.

Although there are a number of potential scenarios that will inevitably occur, the following is an example based on a standard high-speed packaging operation of between 100 and 200 bottles per minute:

Assumptions:

All labeling within the batch must be uniquely serialized, verified, authenticated, and aggregated during the packaging operation.
At a minimum, the following automated systems, including mechanized ejection capability, are part of the packaging train:
(1) controlled creation and issuance of a pool of uniquely serialized values for the specific packaging operation; 
(2) confirmation and verification following application that the bar code is machine readable and is an issued value from the approved pool; 
(3) aggregation of the serialized product during the final boxout into shipping containers to create the parent–child relationship between the uniquely serialized bar code applied to the shipper case and each uniquely serialized container within the shipper; 
(4) reconciliation of used, destroyed, and remaining values against the issuance for the original pool.
Product, labeling, and packaging materials (e.g. container, closures, corrugated shippers) are issued to a packaging suite.

Inevitably, equipment fails, whether using inkjet, laser, thermal transfer, or some other labeling technology.
Upon failure, defective units result, causing rejects that are automatically detected and separated from the rest of the batch.

Depending on product value, rejects are placed in a secured location and destroyed at the conclusion of the batch, or rejects are fully reworked using an approved rework process that specifically addresses the final disposition of the serialized labeled primary container.

As the rejects contain controlled serialized values created for the specific batching operation, these units must be closely controlled and reconciled during the batch-approval process. Discrepancies will result in delayed batch release, potential quarantines, and investigations.
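To make the reconciliation step concrete, here is a minimal sketch (hypothetical Python, not any vendor's actual system) that checks a batch's issued serial-number pool against what was commissioned, rejected and destroyed, or returned unused at close-out:

# Minimal sketch: reconcile a batch's serial-number pool at close-out.
# Identifiers and counts are hypothetical; a real line-management system
# would do this with full audit trails and electronic signatures.

def reconcile(issued: set[str], commissioned: set[str],
              rejected_destroyed: set[str], returned_unused: set[str]) -> dict:
    """Every issued serial number must be accounted for exactly once."""
    accounted = commissioned | rejected_destroyed | returned_unused
    return {
        "unaccounted": issued - accounted,                # triggers an investigation
        "unknown": accounted - issued,                    # values not from the issued pool
        "double_counted": (commissioned & rejected_destroyed)
                          | (commissioned & returned_unused),
    }

issued = {f"SN{i:06d}" for i in range(1, 1001)}           # 1,000 values issued to the line
commissioned = {f"SN{i:06d}" for i in range(1, 995)}      # applied, verified, aggregated
rejected_destroyed = {"SN000995", "SN000996"}             # label faults ejected and destroyed
returned_unused = {"SN000997", "SN000998", "SN000999", "SN001000"}

print(reconcile(issued, commissioned, rejected_destroyed, returned_unused))
# empty sets mean the reconciliation passes; anything else becomes a deviation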

It is evident that any fault in the data linkage will result in significant lost production time. Moreover, considering throughput on today’s high-speed packaging lines, a failure scenario such as the one described above could have significant cost and/or compliance impact.

Internal Corporate Requirements:

Of course, not all of the system requirements will be gleaned from a review of the regulations. So, in addition to the user requirements focused specifically on meeting the regulations, additional requirements will need to be gathered and documented, clearly describing our internal users’ expectations for the delivered system. Examples of requirements for this group may read similar to these:

“The system shall be able to monitor availability of serial numbers and notify an operator when a ‘low level’ limit is reached.”

“The system must be able to ‘read’ all serial numbers applied by scanning with appropriate equipment and immediately reject defective units and notify operator(s) when illegible serial number(s) is/are encountered.”
Although the expectations from legislators, regulators, and standards regarding track and trace of pharmaceutical product are being communicated to the industry, it is quickly becoming evident that there are resource limitations throughout the industry to address these new expectations. Attention must now be focused on the extended project piloting and implementation time requirements to get these systems up and running in an accelerated fashion.

At this point, you must implement an aggressive time line to meet the minimum compliance requirements of the California ePedigree Law. I have included one project plan approach, focused on fulfilling the basic requirements. Adherence to the plan does not guarantee a successful implementation, as there are a myriad of variables; however, failure to establish a plan at this late stage certainly guarantees the inability to effectively comply.

ISO 11011: Standardizing Energy Audits

ISO 11011:2013 aims to standardize the energy audit process by establishing guidelines for assessing compressed air leaks. It also addresses the competency of the assessor and the methodologies employed.
 For more than a decade, money wasted through compressed air leaks has often been cited as the number one quick fix manufacturers can take to begin getting a hold on their energy costs. Going back to 1998, a Department of Energy “Compressed Air Challenge” fact sheet notes that “leaks can be a significant source of wasted energy in an industrial compressed air system, sometimes wasting 20-30 percent of a compressor’s output. A typical plant that has not been well maintained will likely have a leak rate equal to 20 percent of total compressed air production capacity.”

Adjusting data from that 1998 Department of Energy fact sheet to 2013 dollars, a 1/4-in. leak that cost $8,382/year in 1998 would now cost a manufacturer $12,026/year. And that’s not even adjusting for the average kWh rate, which was 5 cents/kWh in 1998 and now averages about 12 cents/kWh.
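The arithmetic behind those adjustments is easy to reproduce. A rough sketch (the 1998 cost comes from the DOE fact sheet; the inflation factor and electricity rates are simply the approximations quoted above):

# Rough sketch of the leak-cost adjustments described above.
# Figures are the approximations quoted in the text, not a new analysis.

cost_1998 = 8382.0                     # $/year for a 1/4-in. leak (1998 DOE fact sheet)
inflation_factor = 12026.0 / 8382.0    # ~1.43, 1998 -> 2013 dollars

rate_1998 = 0.05                       # $/kWh, 1998 average
rate_2013 = 0.12                       # $/kWh, approximate 2013 average

print(f"Inflation-adjusted:      ${cost_1998 * inflation_factor:,.0f}/year")
print(f"Scaled by kWh rate only: ${cost_1998 * (rate_2013 / rate_1998):,.0f}/year")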

The bottom line today is the same as it was in 1998: By simply fixing compressed air leaks in your facility, the impact on your bottom line is significant. The real question is: Why is this still an issue today?

One possible reason is the lack of standardization around the energy audit process in general, and around dealing with compressed air leaks specifically.

In October 2013, ECOskills, an environmental training group based in the U.K., held an event to highlight the new ISO 11011:2013 standard to improve compressed air energy assessments.

Speaking at the event, Stephen Boults, capital equipment manager at Thorite (an independent U.K.-based distributor of compressed air products and process systems), explained that over 10 percent of electricity consumed by British industry is used to generate compressed air, yet many unmanaged systems waste 30-40 percent of the compressed air produced.

“Reducing current energy costs is the main driver for instigating an energy efficiency assessment,” said Boults. “Yet, up to now, virtually anybody could offer to provide energy surveys, air audits and data logging of compressed air usage, to no recognized standard, with wildly varying results and findings.”

By establishing requirements on how to conduct an energy efficiency assessment, ISO 11011 is expected to dramatically change the energy audit process. The standard addresses three aspects of compressed air systems: supply, transmission, and demand.

Boults noted that the standard also covers analysis of the assessment data, how the findings are documented, and how estimates of energy savings can be achieved. The standard also addresses the competency of the assessor as well as the assessment methodology, objectives, and scope of the audit.

“ISO 11011 enables industry to receive accurate assessments of the savings achievable by professional management of compressed air systems and the installation of energy-efficient compressors and controllers,” Boults said. “It's a win-win situation for those companies that implement ISO 11011's new energy efficiency assessments, as less electrical power consumption not only saves money but also cuts carbon emissions too.”


Is it Time to Start Teaching Basic Diagnostics?

A pressing movement within the profession, partially propelled by national universal healthcare initiatives, is the drive towards “provider status” for at least some groups of pharmacists.1 Given the long history of our profession, it is startling that in 2013 we find ourselves still advocating for “recognition” as healthcare providers, even within the confines of bureaucratic federal processes. A corollary, and perhaps a longer and more winding road, has been the battle for pharmacist prescriptive privileges across the United States.2,3 Some momentum has been gained on a state-by-state basis, especially in terms of collaborative care agreements.4 From a medical perspective, the linchpin blocking progress in this area is a lack of even basic diagnostic training for most pharmacists.5 Physicians and mid-level practitioners alike have often used the “inability to diagnose” battle cry as a reason to obstruct even the most tepid efforts by pharmacy to assume greater responsibilities in the care and follow-up of patients.

Some might argue that pharmacists graduating from modern doctor of pharmacy (PharmD) degree programs, while often not formally exposed to diagnostic training, do indeed have some level of skill in this area. The movement in professional pharmacy education towards more significant and continuous patient contact exposes students to a greater breadth and depth of clinical training and thinking. Patient assessment, which encompasses skills associated with physical assessment, is now a component of accreditation standards and is taught to varying degrees within almost every US college and school of pharmacy.6 Patient and physical assessment requires students to think critically, evaluate laboratory and diagnostic testing, develop a sense of inquiry, and process information in a logical, stepwise fashion. Also, the increased emphasis within the profession on authentic assessments, such as objective structured clinical examinations (OSCEs), continues to force students to practice and refine patient care skills, including information gathering and assimilation, communication, and clinical reasoning. Models of pharmacists effectively utilizing diagnostic skills to manage uncomplicated disease states exist in various settings, including the Veterans Affairs system and ambulatory care settings. In these settings, pharmacists use very basic diagnostic assessment every day in arriving at recommendations for both prescription and, more commonly, nonprescription medications and drug products.

Despite the aforementioned intentional or unintentional changes to pharmacy curricula, which may have moved professional education closer to instructing some degree of diagnostic skills, there are no reports of existing schools that have explicitly added required curricular work in this area. One school has reported the development of an elective course in very basic clinical reasoning and differential diagnosis.7 A small number of colleges and schools have developed dual degree programs that award both PharmD and master of science in physician assistant studies (MSPAS) credentials. Certainly graduates of these hybrid programs will have gained proficiency in diagnosis.

It may be time for the academy to take a more aggressive stand with regards to diagnostic instruction within professional degree programs. The profession has recently seen many retail chains experiment with and implement care models in which either nurse practitioners or physician assistants are placed in close proximity to or within a pharmacy in order to provide basic primary care.8,9 This may represent a missed opportunity for the profession, leaving some wondering why a pharmacist with appropriate training could not provide these services along with a high level of medication therapy management. Implementation of the Affordable Care Act may provide another impetus for change in this direction as many more Americans are expected to become increasingly eligible for covered primary care services. These patients will require some level of care that the current system, devoid of sufficient primary care practitioners, will be stressed to provide. At a minimum, an increased emphasis on diagnostic training will allow pharmacists to more autonomously influence health outcomes and contribute to the general wellbeing of patients.

Movement towards a curriculum with an increased emphasis on diagnosis will likely be met with some opposition from other health care providers. On the surface, they will likely find most training to be insufficient in breadth and depth. If the profession were to move in this direction, a critical tenet would be that instruction be aimed at basic assessment, triage, and diagnostic skills. The objectives for instruction in this area might specifically focus on commonly encountered, uncomplicated primary care disease states (eg, allergic rhinitis, otitis media, rash, etc). It should not be the intent of the academy to advocate for the creation of “pseudo-physicians,” but rather to empower pharmacists with the ability to care for patients more effectively. The profession should also tactfully limit its sensitivity to noise from other professions and instead focus on collaborative efforts to define its own scope of practice. Any sentiment towards increasing the curricular burden within professional degree programs with greater course work must be balanced with deletion of some existing content. Without an effort to achieve balance in credit hours, programs will become increasingly burdensome for students, limiting time to actually think, reflect, and learn. The evolution of pharmacy practice and future practice models should continue to dictate what is taught and what topical areas could potentially be removed from curricula to provide room for new instruction in the area of basic diagnostics.

As previously mentioned, the concept of instructing even basic diagnosis within PharmD curricula will likely generate deliberation and controversy, both within the profession and from our colleagues in other health care colleges. This is dialogue that needs to occur. In considering provocative proposals such as this, the academy should exhibit mutual respect but not cower from sensible, healthy, and productive debate. Regarding diagnostic instruction and skill building, the time seems right for this discussion, given external factors within the health care system and internal changes within the academy and our profession.

Corporate IT vs. Manufacturing IT

Though automation personnel and corporate IT share some common priorities, they often disagree over whose standards apply. Automation personnel focus on isolating production on “islands” of automation, believing that segregation helps keep information safe. IT, on the other hand, focuses on security and more open access to business networks and information.

We live in a world of increasing connectivity. Divisions of business that used to operate in isolation must now be integrated with the rest of the enterprise. As an example, business leaders expect to see real-time production information direct from the plant floor to evaluate operations and make business decisions. Data collection and presentation drives business decisions; protecting intellectual property, overseeing network access and assessing vulnerabilities must now be ongoing priorities for all facets of the business.


In this setting, isolation of plant floor automation is no longer feasible. Isolated systems did not require the updates and ongoing evaluations that IT has dealt with for many years; process control can learn from IT here. The idea of “continuous operation” has a different working definition for plant floor automation systems than it does for IT. Network downtime that stops production could represent a financial catastrophe, whereas not being able to access a network printer is a mere annoyance. Both situations affect network users, but with varied degrees of impact. Collaboration and ongoing conversation are no longer optional, but required.

Though it is tempting to apply a blanket IT method to the control systems world, this approach is disastrous. What works for corporate IT may not be what is best for control systems. The first step in the process should be to identify bridges between the two perspectives. Both sides want security that works effectively without getting in the way of business operation. Both sides want a say in decisions, i.e., a sense that their concerns are heard and considered moving forward.

In many companies there are people who understand both perspectives and can speak the language of both sides in ways that promote understanding. Such people, who can function as “translators,” are invaluable. If you can't find this in a particular individual at your company, look for a relationship between individuals that transcends the manufacturing/corporate IT bias. The strength of a longstanding relationship provides the trust needed to hear and understand conflicting perspectives. If neither of these is a reasonable option, a third party can be brought in, whether another individual to help with bridging the gap in understanding or another company.

Whatever your means of integrating plant floor automation and control systems into corporate IT networks, cybersecurity is a living concern. Security threats and vendor offerings change in very short lifecycles. The conversation is not a one-time decision, but an ongoing collaboration that must be fostered and factored into the business moving forward.

Microneedle Drug Delivery Could Draw Blood and Other Fluids Out of the Skin

Close up of the swollen needles on a microneedle patch. Image courtesy Dr Ryan Donnelly, School of Pharmacy, Queen’s University Belfast

Microneedles on a sticking-plaster-like patch may be the painless and safe way for doctors to test for drugs and some infections in the future, following research by scientists at Queen's University, Belfast. The work has been supported by £300,000 in funding from the UK's Engineering and Physical Sciences Research Council (EPSRC). Samples of the rough, absorbent patches are being tested in the Queen’s University Belfast laboratories of Dr Ryan Donnelly, a researcher in the School of Pharmacy. 

The experiments show that the forest of tiny polymer needles on the underside of the patch, when pressed into the skin, can absorb the fluid in the surface tissue, taking up at the same time the salts, fatty acids and other biological molecules found there. 'The important thing is that we typically find the same compounds in this interstitial fluid as you would find in the blood,' explained Donnelly. 'But, compared with drawing blood, our patches can get their samples in a minimally invasive way. 

'And it’s far safer than using a conventional needle. These microneedles, once they have been used, become softened, so that there’s no danger of dirty needles transferring infection to another patient or healthcare worker.' Two million healthcare workers suffer needlestick injuries every year.

The microneedle sampling technique is a development of earlier and ongoing experiments using similar patches to deliver drugs and vaccines painlessly – the sensation when they are pressed onto the skin is a bit like the roughness of Velcro, said Donnelly. The microneedles are made of polymer gel – similar to the material used in super-absorbent nappies. For their original, injecting function, they are pre-loaded with vaccine or drug compounds that will be released into the skin on contact with the interstitial fluid.

But the flow can go both ways, so that for the sampling variants, the backing material can be made chemically attractive to target compounds, encouraging them to diffuse into the gel with interstitial fluid drawn out of the skin and locking them in place for later analysis.

Dr Aaron Brady, a clinical pharmacist in Donnelly’s group, is currently conducting the first clinical evaluation of the technology using caffeine as a model drug. Eyman Eltayib, a PhD student with the group, is also trialling the technique for blood-free glucose sampling at her home university in Khartoum, Sudan. Future targets for sampling could include therapeutic drugs where monitoring the correct dose can be important.

'Theophylline, the asthma drug, is one compound doctors might want to track this way,' said Donnelly. 'It has a very narrow therapeutic range – too much and you can harm the patient, too little and it won’t do the job. During our EPSRC project, my PhD student Ester Caffarel-Salvador has shown theophylline in the blood of rats can be indirectly detected using our microneedles. In the future, patches could also be designed for medics treating TB, particularly in sub-Saharan Africa. Patients are very bad at completing their long courses of antibiotic treatment, the main cause of drug-resistant TB. A simple, cheap technique like this would let healthcare workers monitor compliance, even with a minimum of training.'
Real-time monitoring could be a realistic option in the future and might involve combining the microneedle technology with a simple laser-based detection technique, surface-enhanced Raman spectroscopy (SERS), to identify drug compounds inside the gel. The group already has proof-of-concept for this idea and is now looking to extend the range of drug concentrations that can be detected in this manner.

Electrochemical detection is another attractive possibility that might allow patients to use the technology in their own homes. If connected wirelessly to their healthcare provider, they could then have their medicines or doses changed based on the microneedle readings, both enhancing patient care and saving NHS resources.

AHU Validation: The Concept

The degree of cleanliness of air in the pharmaceutical manufacturing and related operations area should be established depending on the characteristics of products and operations in the area.

In order to establish and maintain such standards, careful attention has to be paid to them from the design and construction stage through to monitoring during routine operations.

Air Quality

A total air handling system, covering the open-air intake, treatment, supply to the manufacturing area, and exhaust, should be designed and validated. The handling system contains units for prefiltration, temperature and humidity control, final air filtration, return, and exhaust. When the air is supplied to the manufacturing area, care is required in maintaining the required air quality during the operation and at the point of product exposure to the environment.

This point is closely related to the layout and construction features of the manufacturing area.

  • Air should flow from the critical, or cleanest, area to the surrounding, less clean areas. For this purpose, rooms used for the manufacturing operation have to be laid out according to the order of the required air cleanliness.

  • In order to maintain the air cleanliness in the area and the airflow, the amounts of air supplied and exhausted have to be balanced to keep the designed air exchange ratio, airflow pattern, and air pressure differentials. In each room, the operation site should be maintained in the most suitable condition.


For each purpose, the following items must be carefully controlled.
  • Locations and number of air supplies
  • Locations and number of air exhausts
  • Ratio of air exchange
  • Return ratio of exhaust air
  • Location of local air exhaust, if necessary
  • Airflow pattern at the site of product exposure
  • Air velocity at the point of product exposure


These features have to be well designed, installed, validated, and maintained. Critical operations have to be performed under unidirectional (laminar) airflow. Air turbulence deteriorates air quality by drawing in air from surrounding, less clean areas.

The amount of air supplied and exhausted is related to the air pressure differentials. After the system is validated, air quality should be continuously monitored and maintained during manufacturing operations.
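As a simple illustration of the air-exchange and pressure-differential checks described above (the room data and target values below are hypothetical examples, not requirements taken from any standard):

# Minimal sketch: check air changes per hour (ACH) and the room pressure cascade.
# Room data and targets are hypothetical examples.

rooms = {
    # name: (supply airflow m3/h, room volume m3, target ACH, room Pa, adjacent Pa)
    "filling room": (3600, 120, 20, 45, 30),
    "preparation":  (2400, 150, 15, 30, 15),
    "corridor":     (1800, 200,  6, 15,  0),
}

MIN_DELTA_P = 10   # Pa, hypothetical minimum differential toward the less clean side

for name, (supply, volume, target_ach, p_room, p_adjacent) in rooms.items():
    ach = supply / volume              # air changes per hour
    delta_p = p_room - p_adjacent      # cascade from clean toward less clean
    ok = ach >= target_ach and delta_p >= MIN_DELTA_P
    print(f"{name}: ACH={ach:.1f}, dP={delta_p} Pa -> {'OK' if ok else 'CHECK'}")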
Filters used for prefiltration and final filtration should be maintained so that they operate to their design specifications. Deterioration of filters is caused by leakage and/or accumulation of particles. The former is detected by a periodic integrity test (usually a dioctyl phthalate, or DOP, test), and the latter by an increase in the air pressure differential between the upstream and downstream sides of the filter.

Regulations and Standards

All of the environmentally controlled areas of pharmaceutical manufacturing and its related areas should meet the requirements for air cleanliness, which are expressed as classifications specified by official standards such as ISO (International Organization for Standardization) standards, U.S. Federal Standard 209, and/or GMP.

Autoclave Validation / Qualification: 10 Steps

Autoclave validation/qualification is mandatory for all machines used for biological sterilization in the biomedical and pharmaceutical industries. Sterilization can be accomplished by either physical or chemical means. The principal physical means is autoclaving; other physical methods include boiling and dry heat.
Chemicals used for sterilization include the gases ethylene oxide and formaldehyde, and liquids such as glutaraldehyde. Of all these sterilants, autoclaving is the fastest and most reliable, which is why regulators always scrutinize autoclave validation/qualification activities.

Here are 10 items to consider when qualifying autoclaves.

1. Testing of Steam for Porous Product Sterilization

Before commencing temperature testing, the correct conditions must be satisfied. The first condition for sterilization of porous product is saturated steam quality. The ideal for steam sterilization is dry saturated steam with minimal entrained water (dryness fraction ≥ 97%). The largest heat transfer occurs when the steam is at the saturation boundary. If the steam is superheated (too dry) or contains non-condensable gas, it cannot condense properly and its effectiveness is reduced.

2. Equipment Used for Testing

The equipment used must support 21 CFR Part 11 and must be of adequate accuracy. Testing with equipment that is not appropriate can be a major problem, so buy from a trusted vendor. Because only a small temperature range (a few degrees) is allowed, the accuracy of the overall measuring chain is very important.

3. System Description of Autoclave

The first impression is very important when the qualification of critical equipment such as an autoclave is at stake. A good description of the system in a protocol shows that you understand how the process works and which critical points you need to keep under control. This description must cover the programs that are used, how they work, how many there are, where the control probes are located, and what regulates the process.

4. Operating Instructions, Calibration and Maintenance

Before the temperature is tested, it must be checked whether the operating instructions are valid, whether the instruments are calibrated, and what has changed in the system since the last qualification.
Operating instructions must include the sterilization parameters, the loading scheme, and the position of the control probes in the chamber. The emphasis is on the calibration of instruments, because small errors in temperature can greatly affect the F0 value.

5. Procedure

One of the most common mistakes is inaccurate testing procedures. The test procedure must be unambiguous and accurate and must not leave the possibility for different interpretations.
Ambiguous: Put the thermocouple in a glass bottle.

Unambiguous: Put the thermocouple in a glass bottle at the contact between the bottom of the bottle and the side of the bottle (This is the most critical place to collect condensate).

6. Load

The most common objections concern loads: that they are not sufficiently described, and that the load used in regular production does not reflect the qualified load.

This can be avoided by photographing the load items and including the same load patterns in the protocol and work instructions. This avoids arbitrary interpretation of a descriptive configuration.

Although this may sometimes seem trite, the differences between the temperature profiles of solutions and air filters can be great. Pay special attention to the worst-case loads and explain the rationale (mostly filters and silicone hose).

7. The Position of the Thermocouples

The position of the thermocouples must be unambiguous and precise to avoid different interpretations by the individuals who perform tests or inspections. You do not want to enter into a debate about where the thermocouple is positioned.

Of course, the critical areas must be covered, and this must be explained in the rationale. When sterilizing liquid loads, studies must be done to define the coldest and warmest points for the minimum and maximum loads.

8. Acceptance Criteria in the Heat Penetration Tests

Since there are differences between the standards (e.g., the PDA Technical Report and EN 285), it is best if all acceptance criteria are taken into account. Special emphasis should be placed on equilibration time and temperature, because these are prerequisites for a good F0.
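Because F0 depends exponentially on temperature, small probe or calibration errors matter a great deal. A minimal illustration using the standard F0 formula (reference temperature 121.1 °C, z = 10 °C), with an invented temperature profile:

# Minimal sketch: accumulate F0 from a logged temperature profile.
# F0 = sum(dt * 10 ** ((T - 121.1) / z)), with z = 10 C and dt in minutes.
# The profile below is invented for illustration.

Z_VALUE = 10.0     # C
T_REF = 121.1      # C
DT_MIN = 0.5       # logging interval, minutes

profile = [118.0, 120.5, 121.3, 121.6, 121.5, 121.4, 121.2, 120.0]  # C

f0 = sum(DT_MIN * 10 ** ((t - T_REF) / Z_VALUE) for t in profile)
print(f"Accumulated F0 = {f0:.2f} min")

# A +1 C calibration error near 121 C inflates each contribution by about 26%,
# which is why instrument calibration is emphasized above.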

9. Deviations

Deviations cannot be forgotten, as they may be encountered regularly throughout the qualification process. A robust deviation management process should exist, because deviations may impact the quality of the product. Good handling of deviations helps us improve our qualification, though deviations are often viewed negatively rather than as a mechanism for process improvement.

10. Reports

The report should be accurate, as it consolidates the data from the protocol and is where errors in them can be found. The report should contain the time and date of the tests, the parameters, the temperature results, the BI results, the positions of the thermocouples, the F0 values, and a comparison of the results with those from the initial and previous qualifications, in order to see in which direction the process is moving.

Soluble Painkiller Sodium Health Risk Warning

The sodium levels of certain dissolvable painkillers are a potential health risk, according to new research now published by the British Medical Journal.

Those behind it state that particular formulations, at their maximum dosages, can push patients above and beyond recommended daily salt intake levels. Specifically, eight soluble paracetamol doses per day could be enough to make patients over 20 per cent more at risk of having a stroke, 28 per cent more at risk of premature death and a staggering seven times more likely to develop elevated blood pressure.
The researchers, based at the University of Dundee, urge that these painkillers carry stronger warning labels and stress that patients need to exercise caution.

Painkiller Sodium

The university's painkiller sodium levels study assessed soluble aspirin, calcium, zinc, ibuprofen and many other supplements and drugs. All of them were available both over the counter and via prescription.
Close to 1.3 million patient data sets were examined, involving people who had taken sodium-containing or sodium-free drugs; their health over the following years was then tracked. On average, if health problems emerged, they did so in the fourth year.

Painkiller Health Risks

The painkiller health risk researchers make the point that soluble drugs are needed in cases where rapid drug delivery is required, or where the patients involved cannot swallow pills easily. Doctors, they add, should consider the circumstances carefully when issuing prescriptions for sodium-containing painkillers, while the public needs better guidance on what they're buying when purchasing these medications from pharmacies.

"We believe that our findings are potentially of public health importance", the researchers commented. "As a minimum, the public should be warned about the potential hazards of high sodium consumption in prescribed medicines, and these should be clearly labelled with the sodium content in the same way as foods are labelled."

"It's crucial to be aware of our sodium intake, as it is a component of salt. Excess salt in our diet can lead to high blood pressure, which is the single biggest risk factor for stroke", added Stroke Association neuroscientist Doctor Madina Kara, in comments quoted by The Telegraph. "A diet low in saturated salt and fat, regular exercise and blood pressure checks can go a long way to keeping your stroke risk down."