Prototyping is a necessary part of any well-executed product development process. Analysis can help build confidence in a design, but often there’s no substitute for a physical model. Models, mockups, and prototypes help industrial designers and engineers get critical feedback on appearance, proportion, fit, function, usability, durability and more. Often, creating a prototype is absolutely necessary to allow software and controls engineers to finish their work. In an ideal world, production parts could be made at low cost and in very short timeframes, making it easy to build and test the real product. In the real world, time and budget constraints force engineers to build prototypes with materials and processes that are not identical to the final product. This is where engineers have to be creative to maximize the value of their prototypes. Medical device development places some special requirements on prototypes, especially later in the development process.
Ultimately, when developing a medical device, the development team has to prove that they have tested samples of the device that will be put on the market. At some point, devices built with production equivalent parts and processes have to be tested. Especially with molded plastic components, there is a strong motivation to be very confident in the design prior to committing to expensive tooling. But how does an engineer gain that confidence with prototype materials that don’t behave the same or cannot reach the same levels of dimensional tolerance? By being a little creative!
What’s Important?
The first key is to really understand what is most important about the part. Need tight tolerances? Look to SLA, machining, or a combination of additive processes and machining processes. Are material properties most important? Match the critical properties to the best additive or casting material or consider machining the part from the final materials. The key is to select materials and processes that are good enough but not better than the final part. Analysis can usually point to the critical areas of the design where strength or fit are most important.
In many cases, an engineer just needs to get parts in hand to get a better feel for how they fit and function. A number of lower cost 3D printers using the fused deposition modeling (FDM) process are now available and can produce parts quickly and inexpensively. When these printers are available, parts can sometimes be created in as little as a few hours, saving the design team days of speculation about the function of a design.
Here are a few examples to help demonstrate the idea.
Break it Down
Let’s first consider the issue of prototyping plastic components. Injection molding is widely used in medical devices because it’s inexpensive in higher volumes, tolerances can be held tightly, a wide variety of surface finishes can be produced, and the different resins available offer a huge range of material properties. It’s often difficult to match all of these features in one prototype. Consider performing secondary processes like machining or painting parts to improve tolerances or appearance. Another option is to build multiple prototypes for different purposes. One version may help test the system function because key features can be machined to meet the necessary tolerances. A second prototype can be built from materials of similar strength to test durability.
The quantity of prototypes needed can often be a determining factor. Rapid tooling cycles are available for aluminum or soft steel injection molds. At quantities around 150-500 parts, this is often more economical than any additive prototyping process or machining, and offers the added benefit of incorporating final materials and geometry. Even if the tool is scrapped after the prototype run, the added confidence in the design and the lower total prototype cost can justify this option. If the design works well, the parts and processes can be qualified for production with some molders, and the tooling can be used to manufacture parts until the production tooling comes on line.
Many larger pieces of capital equipment, whether they be robotic systems, cautery units, or an endoscopy system, will never reach production volumes that can justify injection-molded components. These sometimes use either thermoformed or cast urethane parts for housings. Both of these processes have complementary prototype processes that produce parts that are very similar to the production versions, giving a great deal of confidence in the final design.
The Metal Paradox
Let’s now consider metal components. Machining and sheet metal fabrication are common production processes, and can be readily prototyped. Laser sintered metal parts are also an option, but are available in a limited number of materials. Many medical devices use one of several casting or metal molding processes. The properties of cast metals are rarely identical to those of wrought metals used for machining, and most casting or molding methods have poorer tolerances. This is a classic case where the prototype can be better than the production parts, providing a false sense of security in the design. Alternative casting processes are often available. SLA patterns can be used in investment casting processes for several classes of metals. In aluminum, several gravity-fed casting processes can simulate die cast components with different alloys that tend to be a little weaker than most die cast alloys. There are even processes for investment casting very small parts in aluminum with enough detail and surface finish that they can provide a decent functional simulation of a metal injection molded part, albeit with lower strength.
A Different Type of Composite
Sometimes a single solution is not the best solution. By breaking a component down into pieces and bonding or fastening them together, it is possible to optimize different areas of the design and learn a lot about it. In all prototypes, the key is to understand where the design has risk, which areas need to be tested to reduce that risk, and decide on the best process and material for those sections. This method has risk, because the connections between pieces can be weaker than really intended, and additional stresses can be imparted in places that are not realistic.
With all of these ideas, it’s important to remember that testing a prototype made from non-production components is not a substitute for verification testing on production equivalent units, but it can help provide a level of confidence in the design so that tooling expenses can be justified with reduced risk of surprises.
Keep in Mind
Prototyping is expensive. A fully functional prototype can often cost 5 – 20 times more than a production equivalent. Prototypes for high volume disposables can be even more costly because the economies of scale are so favorable in production. The key justifications are always that the prototype improves the development process and saves the development team a significant amount of time. The cost of engineering time can be far greater than a prototype, and the opportunity cost of being late to market can be even greater. Creative construction and use of prototypes can dramatically shorten development time, saving a lot of money.
There are dozens of factors that contribute to the cost of developing a medical device and every project is different. There’s no question that the rigor required to meet standards in the medical industry plays a huge part, but based on my experience in medical device development, I’ve identified several other issues that often influence a project’s bottom line. This is by no means a comprehensive list but I hope to highlight a few critical elements of the development process that may help you on your next project.
Regardless of whether you’re an OEM, a contract manufacturer, or a consultant, time-to-market is critical in the medical device world. The time or money it takes to reduce risk in a project is always going to cost less than following blind optimism down the development path to a big, expensive roadblock. Mitigating risk can take on many forms, such as prototyping early, proper testing, or even developing parallel concepts. It takes a heavily involved program manager with proven intuition to know when to act and what mitigation strategy is appropriate for the risk at hand.
Ask the Users
Managers, key opinion leaders, designers, and engineers often believe they know what’s best for the end user or patient but there is no substitute for asking the user directly. Engaging the user early when requirements are being written and then again when deciding between various design alternatives can help transform an okay product into a great user experience. Users may not always know what they want but they often know what they don’t want and that is information you need. This is a critical and necessary part of the development process that can often be the key to a positive patient outcome and lead to wide market adoption.
Have you ever worked on a next-generation device and encountered that one feature or component that you’re told by the client can't be changed because it “contains the magic”? I can’t tell you how many times I’ve heard clients tell me this! My definition of "magic" in a device is either arcane tribal knowledge that has eroded over time or a happy accident that resulted in something that miraculously works, though no one knows why. This can be a significant hurdle to overcome in the development process; it can suppress creativity and add excessive time spent finding workarounds. In the end, it may take less effort to investigate and understand the magic than to blindly accommodate it. Understanding all of the factors that make a technology work is usually better than ignoring the unknown.
From marketing requirements to assembly instructions, the amount of documentation required to properly define a medical device can be time consuming and often overwhelming. A comprehensive design history file (DHF) is critical to accurately capturing design inputs, important decisions, and the overall development path taken to achieve the end result. For every document created there is a checking and approval process that is not only good practice but essential to maintain accuracy. The effort required for proper documentation is often overlooked but can have a significant impact on schedule, budget, and resource time. Planning ahead, defining deliverables in advance, and clearly assigning responsibility will help to make the documentation process more efficient.
There are many types of analysis that can be applied to a given project, from tolerance analysis to structural simulation, and all of them play a critical part in the development process. If you rely too heavily on a prototype or pilot build without appropriately characterizing and understanding your design, unexpected problems can arise as the build quantity increases, often leaving you chasing your tail trying to understand the root cause. Identifying the key components and sub-systems and applying the appropriate level of analysis will give you confidence that the design is going to meet the requirements. Understanding what makes your design work will become valuable knowledge and help save time when you encounter that inevitable problem in manufacturing.
Prototypes can have varying fidelity and provide different value throughout the development process. From early foam models used for preference testing to fully functional prototypes for engineering evaluation, you need prototypes to assess your design and help the team make decisions. Prototyping can take hours, days, or weeks depending on the process and the end goal. Even though prototyping takes time, there is nothing more valuable than creating something the team and the client can touch and have available for experimentation. The cost of not prototyping can be catastrophic, potentially resulting in lost capital or a finished product that doesn’t function as intended.
Medical devices typically have elements that are critical to the user experience or the device’s proper function. These areas should not only be prototyped but also appropriately tested. Defining what you want to learn will help you identify the right materials and testing methods, and help the team hypothesize what a successful outcome would be. Writing a test protocol will help you think through the details and reduce the chance that your intended test will be inadequate. Appropriate documentation of the elements being tested should also be in place so you know what you’re testing and can record any deviations you make along the way.
Because every project and product is different, you need to be able to react to things you didn’t expect while exercising good practices along the way. Remember that having some level of requirements or goals is always a good place to start and requirements should evolve throughout the development process. There is always a balance between planning and executing and too much of one without enough of the other can increase cost and lengthen your schedule.
The AAMI/FDA Summit on Healthcare Technology in Nonclinical Settings took place in October in Herndon, VA. The event brought together leaders from the medical device industry and regulatory bodies, clinicians from healthcare institutions, researchers, and others to identify, discuss, and formulate strategic initiatives and priorities focused on ensuring the safety and effectiveness of medical technology in nonclinical settings.
With so much going on around human factors in healthcare, I was particularly interested in the challenges surrounding home healthcare, especially since an increasing number of Farm’s clients are developing devices destined for use outside the hospital.
The format of AAMI/FDA summits is more participatory than most, which I like. Only one topic is addressed at a time, beginning with one or more speakers who are experts on the designated topic followed by a moderated brainstorm where the audience participates in answering the following questions:
- What are the key issues regarding the topic?
- What are the barriers to overcoming these issues (including research gaps)?
- What changes need to occur in order to overcome the barriers?
- Based on the issues, barriers, and what needs to change, what are the top 3‐5 priorities for follow-up assessment and action?
One key takeaway for me was my fellow participants’ heightened awareness of the critical roles played by human factors engineering and user-centered design in solving the issues that were raised during the discussion. This was exciting to see.
AAMI just published a report from the summit, titled “A Vision for Anywhere, Everywhere Healthcare.” The five clarion themes from the event were:
1. Deepen all stakeholders’ understanding of use environments, and their remarkable variability. Research, information exchanges, and assessments of nonclinical use environments and practices—in homes, schools, offices, and public venues; in transit and beyond—will help the healthcare community improve patient outcomes.
2. Coordinate multiple and recurring transitions in care to improve patient safety. Delivering seamless care and support services to patients (and caregivers) as they move between clinical and nonclinical settings, interact with service and equipment providers, and adapt to medical technology will help instill a culture of safety.
3. Adopt a systems approach, encompassing people, workflows, therapies, technology, and payment, to redesign the full spectrum of healthcare in nonclinical settings. Synchronizing the disjointed components of healthcare delivery in nonclinical settings will help improve the quality of patient care.
4. Standardize and simplify. Creating consistency and clarity in regulations, data, information, and testing will support integrated products and services and instill confidence in the security and safety of medical equipment.
5. Design with empathy. Attending to human factors in developing medical devices that are “home-ready” and designed to add value from the patient’s perspective will support innovation and safety in healthcare.
Sure, your firm has designed and developed a lot of medical equipment. You’ve developed a solid understanding of the requirements and pitfalls of designing medical equipment that will be used in a hospital, in an ambulance, and maybe even in a patient’s home. But while you were busy, a confluence of technical breakthroughs and regulatory updates has awakened the wearable medical device industry. It’s not just about hearing aids and Holter monitors any more, and now your Marketing people have asked your Engineering group to specify and design your company’s first product to be worn by a patient directly on their body. Now what are you going to do? The following is an attempt to illustrate how several hardware-focused electrical and mechanical considerations are prompting developers to adopt an approach to wearables design that may differ greatly from that of desktop or portable medical devices, along with suggestions on how to adapt the wearable product development process.
- Consider your intended user population. Is there a certain patient age, weight or size that represents your typical user? Ideally your product could be designed so that one model of the device - perhaps with an adjustable feature - could work for your entire end-user population. Should it appear that two or more versions/sizes of the device are required to satisfy your user population, you’ll need to consider the trade-offs between excluding users at the extremes of the target population versus the complexity and sales/support implications of releasing multiple devices. A program of usability testing can often reveal problems with early prototypes and help the product developer arrive at the most effective solution for the target end-user.
- Another important consideration is the user’s ability to operate small electronic devices. Some patient populations may not have the experience, eyesight, or hearing acuity required to operate small controls or react appropriately to status indicators or alarms. For wearable devices, giving the patient a simple user interface is always preferable. Leave the complex device setups and interactions to the medical professional who is caring for the patient. If the patient is required to use a smartphone, tablet PC, or custom device to interact with the wearable device despite your best attempts to avoid it, be sure that the user interface is clear, simple and intuitive. This is another aspect of product development that can benefit from usability testing.
- Avoid small, easily-removable parts if possible. Your device will be exposed to all of the activities and chaos of daily life. Any small parts that can intentionally or unintentionally become disconnected from your device will get dislodged and lost, and possibly render your device useless. If a small, easily-removable part is required for the functionality of the product, consider providing the healthcare professional and/or the user with spare parts as appropriate to allow your wearable device to continue functioning.
- Biocompatibility evaluation according to ISO 10993 - Biological evaluation of medical devices may be new to you if this is your first wearable product. The acceptability of materials intended for patient contact is classified based on the amount of time that the material is expected to remain in contact with the patient. Ideally, a material that has already been validated for biocompatibility by the manufacturer can be located for use in the patient-contacting areas of your device. If not, ISO 10993 requires the manufacturer of the device to perform biocompatibility testing on the material in question, generally using the services of a third-party vendor.
- Since the wearable device will be used in the patient’s home, two guidance documents apply: the IEC standard 60601-1-11, Requirements for Medical Electrical Equipment and Medical Electrical Systems Used in Home Care Applications, and the FDA document Draft Guidance for Industry and Food and Drug Administration Staff, Design Considerations for Devices Intended for Home Use. These address many of the safety and usability requirements that a wearable device or a system that includes a wearable device will need to meet.
- Regarding electromagnetic compatibility (EMC), wearable medical devices fall into the same category as other devices intended for use at home, and are generally subject to tighter EMC regulations than equipment intended for use in a healthcare facility. The governing standard is IEC 60601-1-2 - General requirements for basic safety and essential performance - Collateral standard: Electromagnetic compatibility - Requirements and tests.
- If there is an IEC particular standard (IEC 60601-2-X) for the type of device you are planning, you may find that there are clauses of the standard that cannot be applied to a wearable version of the device. Be aware of these details before claiming that your device meets the particular standard. Discuss the implications with your Marketing department if you find that you will be unable to claim compliance to the particular standard.
- If your company has already released non-wearable devices with functions and features similar to your planned wearable device, take advantage of the product history and records available to you. You may find areas where problems or failure modes will be exacerbated when the device becomes a wearable. Similarly, opportunities may arise to further mitigate failures, improve performance or reduce costs. An example might be an electrode cable for an Electrocardiogram (ECG) device. Much effort is spent ensuring acceptability of ECG cables for a particular application, for example flexibility, triboelectric effects, and EMC. If the ECG electrode can be integrated into the device itself, the ECG cable and its associated functional requirements, failure modes and costs no longer need to be considered.
- If at all possible, your wearable design should be wireless and self-contained within a single housing. Device components that are wired together to create a system when the device is worn will inevitably cause patient discomfort or become disconnected as the wires tug on the components or tangle in the patient’s clothing. The connections between components also create possible failure points.
- Consider the environment in which your wearable device will be expected to operate, as well as environments where a failure to operate under a specific condition is acceptable. As a wearable product, the device will be exposed to sweat regularly, and perhaps to other bodily fluids or rain. The device will be squeezed, dropped, and mechanically shocked constantly. Where a failure to operate in a specific environment is found to be acceptable, ensure that the device fails in a safe manner.
- Have a clear understanding of how the patient will be expected to deal with the wearable device during showering or bathing. Ensure that this information is clearly communicated to the patient.
- If necessary, consider how the wearable device will be cleaned by the patient or healthcare professional. In the event that the device or part of the device is to be washed by the user or healthcare professional, ensure that the instructions for use are clear regarding device preparation, water temperatures, detergent or cleaner types, and drying methods. Should disassembly and re-assembly of the device be required for cleaning, simplify these actions as much as possible.
- Carefully consider the scheme that will be used to provide power to the wearable device. To optimize your patient’s experience, consider how to best integrate the charging or replacing of batteries into the workflow, use requirements, and allowable “down-time” of your particular device’s functionality. This information will help to optimize design trade-offs such as device size/weight vs. battery run-time. Ideally, this familiarity with your target patient’s abilities or limitations, the requirements of the device, and the optimum use scenario will allow you to determine if your device is best designed with the battery permanently enclosed within the device or user-replaceable. Consider the use of wireless charging methods, as this technology has the potential to greatly simplify ease of use. Of course, battery safety and regulatory requirements must be strictly followed in all cases.
- Your wearable device may require active accessories to provide for data display, data storage, communications, or recharging. Carefully consider how best to allow for the transfer of data or power between system components, while minimizing the quantity and complexity of the tasks your user is required to perform. Ideally these connections and communication between system components should be implemented using wireless technology and occur automatically, with no patient involvement.
The medical device industry is entering a “Golden Age” of wearable product development that is being supported by both regulatory progress and significant innovation in battery technology, materials science, and wireless communication. This article has touched on a few of the many critical issues to be considered in the design and production of a safe, usable, and successful wearable medical device.
The term "Big Data" is a few years old, but its implications for medical devices is at an inflection point.
As the reader may know, Big Data refers to data sets of enormous scope (think terabytes of data). Historically, many of these databases have contained high-dimensional data about the activity of people online; the advertising platforms of Facebook and Google, the backbone of their businesses, are examples of products informed by big data sets. Applications aren't limited to the web, however; Walmart's transaction databases are estimated at 2.5 petabytes (1 PB = 1,000,000 GB).
What we can learn from this is that many successful companies have discovered that their ability to collect and analyze enormous amounts of data gives them a significant competitive advantage, if not the core value proposition of their organization.
So, how does this apply to medical devices?
Health-related data is becoming more abundant. People care less about privacy than they used to and, for those who do, HIPAA has well-documented guidance for de-identifying Protected Health Information. Still more data is produced and stored by devices from consumer health companies such as Fitbit and Withings that cater to the growing quantified-self market.
Bottom line: there is precedent for collecting data on a large scale... so what data should you capture?
The answer depends on your specific application, but here are some ideas you should consider:
- Could you make a better walking brace with data on millions of steps taken by thousands of users? What if you also had information on their recovery time?
- Could you make a better surgical robot with data on the position of every end-effector at every second of every surgery it ever performed?
- Could you make a better ventilator with flow/pressure/heart rate/O2 Sat/etc. data on millions of inhalations and exhalations?
Now imagine your company had such a data set... and your competitor didn't. As that data set gets larger and your products become more data-driven, a network effect occurs. Who would want to buy a competitor's product that is based on an inferior data set? You would be Google; your competitor would be Yahoo.
Analyzing Big Data can be hard. The hurdles are both logistical and analytical:
- Logistically, Big Data can't be loaded into a laptop’s RAM (i.e., you won't be opening it up in Excel). To be able to “look at” Big Data, specialty tools such as the Hierarchical Data Format (HDF5) or Hadoop may be required; a short sketch follows this list.
- Analytically, machine learning techniques such as neural networks may be required if a pattern or trend can't be isolated using strictly mathematical methods. Such techniques, while well-understood, differ from traditional statistics and can have a bit of a learning curve.
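To make the first point concrete, here is a minimal sketch in Python using the h5py library, showing how a dataset far too large for RAM can be processed a slice at a time rather than loaded whole. The file name, dataset name, and chunk size are hypothetical placeholders, not a reference to any particular product's data.

```python
import h5py

# Hypothetical HDF5 file holding a very large 1-D dataset of flow samples.
with h5py.File("ventilator_waveforms.h5", "r") as f:
    dset = f["flow"]                       # handle only; nothing is read yet
    chunk = 1_000_000                      # rows to pull into memory at a time
    total, count = 0.0, 0
    for start in range(0, dset.shape[0], chunk):
        block = dset[start:start + chunk]  # only this slice is read into RAM
        total += float(block.sum())
        count += block.size
    print("mean flow:", total / count)
```

The same pattern (iterate, aggregate, discard) is what frameworks like Hadoop formalize at cluster scale; the point is that “looking at” Big Data means streaming over it, not opening it in a spreadsheet.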
A company cannot perpetually compete while operating on less data than its competitors. Ever-increasing saturation of connectivity (RFID, Wi-Fi, Bluetooth, NFC) is allowing us to collect and store more and different data than ever before. The critical question is: what kind of data can give you an edge?
Over the past 2-3 years, most health care providers in the U.S. have completed the transition to Electronic Medical Records (EMR). Ultimately, the adoption of EMR is meant to make health care more efficient and less expensive while improving a patient’s quality of care by making their medical history readily available to all of their healthcare providers.
As users in the field are gaining experience with EMR, however, usability problems have emerged that are a result of the User Interaction design of some of the many EMR software packages that are currently in use. The designers and UI engineers developing EMR software need to address some of these problems in order to make EMR more effective and to reduce the likelihood of dangerous mistakes. The following common usability problems exist today:
1. Patient Identification Errors
Patient identifiers (e.g., EMR Number; patient name; date of birth) are not clearly displayed or selectable onscreen, resulting in treatment actions with potentially harmful consequences performed for one patient that were intended for another patient. (1)
Information is displayed in a confusing format, which can lead to a patient’s receiving the wrong medication. Data related to medication is displayed in a manner that makes it easy to miss. For example, a physician prescribes a medication containing sulfate to a patient with a sulfate allergy because allergy information within the EMR is not clearly emphasized or is difficult to locate on the page. (1)
2. Delay in Treatment Events
Poor EMR page design leads to a delay in critical patient care activities. For example, a patient’s surgery is delayed because an alert about an abnormal lab test result was not displayed clearly and in a manner designed to signify its importance. (1)
Clinicians perform critical tasks, or steps in a task, out of order. For example, a patient with a fever may have a blood culture performed, followed by intravenous antibiotics. If antibiotics are given prior to the blood culture, the sensitivity of the culture decreases dramatically. EMRs that support providers in the order of events are more likely to reduce order errors. (1)
3. Use of Technical Jargon
Based on several interviews with physicians who are currently working with EMRs, we discovered that onscreen lab reports often contain technical jargon that may not be familiar to all healthcare providers, prompting the user to look up a term online. Most current EMR reports do not have an embedded appendix or glossary of terms.
4. Lack of Readability
Developers often base their EMR design on alphanumeric data fields rather than on compelling and easy-to-scan visual elements like charts, graphs, and color schemes that can be helpful to users who must quickly read and process information displayed onscreen. For example, if the report contains an abnormal score, it should be clearly displayed using alerting colors and contrasting type styles, to capture the physician’s attention. (4)
5. Inconsistent Formats
Because each hospital or practice may use a different vendor’s EMR, doctors can often encounter several different EMRs during the course of a work day, each of which uses a different format for displaying information. This lack of consistency can make it difficult for physicians to find information quickly.
In order to make EMRs easier to use and reduce the patient treatment errors that result from flawed information design, we offer a few ideas for UI design:
Interactive Graphical Treatment Timelines
By incorporating interactive graphical treatment timelines that track the cause-and-effect details of a patient’s healthcare process, the health care provider is able to quickly see the patient’s pathology and treatment history in a way that is useful and intuitive. (2,5)
Effective Use of Color
Use a color system to differentiate data points and make it easier for the user to visually map fields and values. For example, if a patient’s white blood cell count comes back as trending lower, it could be indicated in red. (6)
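As a rough illustration (not drawn from any particular EMR product), the color rule described above reduces to a simple value-to-color mapping. The white blood cell reference range below is a hypothetical placeholder; a real system would use the lab’s own ranges and the organization’s style guide.

```python
# Hypothetical reference range for white blood cell count (10^3 cells/uL).
WBC_RANGE = (4.5, 11.0)

def wbc_color(value, reference=WBC_RANGE):
    """Return a display color: red for out-of-range results, black otherwise."""
    low, high = reference
    if value < low or value > high:
        return "red"     # abnormal result: draw the clinician's eye
    return "black"       # normal result: no special emphasis

print(wbc_color(3.2))    # low count -> "red"
print(wbc_color(7.8))    # normal count -> "black"
```

Pairing a rule like this with consistent placement and typography keeps the emphasis meaningful rather than decorative.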
Group Data Fields Where Appropriate
Information should be placed onscreen near other data with which it is often viewed. For example, a patient’s blood pressure report should be placed near the lipids (cholesterol) report as they are often linked and reviewed in the context of the patient’s overall cardiovascular health. (4,5,6)
Help and Reference Documentation
Incorporate a Help section and reference appendices into the EMR screens so that the user can find them quickly and access them easily. (6)
1. B. Shneiderman, "Tragic Errors: Usability and Electronic Health Records," interactions, Nov.-Dec. 2011.
2. L. Lins, M. Heilbrun, C. Silva, "VisCareTrails: Visualizing Trails in the Electronic Health Record with Timed Word Trees, a Pancreas Cancer Use Case," Workshop on Visual Analytics in Healthcare, pp. 13-16, 2011.
3. C.B. Teston, "Investigating Usability and ‘Meaningful Use’ of Electronic Medical Records," SIGDOC, pp. 227-232, Oct. 2012.
4. K. Wongsuphasawat, D. Gotz, "Outflow: Visualizing Patient Flow by Symptoms and Outcome," pp. 25-27, Aug. 2011.
5. Z. Zhang, F. Ahmed, A. Mittal, "AnamneVis: A Framework for the Visualization of Patient History and Medical Diagnostics Chains," Workshop on Visual Analytics in Healthcare, pp. 17-20, 2011.
6. R. Pereira, J. Duarte, M. Salazar, "Usability Evaluation of Electronic Health Record," Int. Conf. on Biomedical Eng. and Sciences, pp. 361-364, Dec. 2012.
These are exciting times in the drug delivery industry. A host of new delivery platforms is in development, some of which have recently reached the market. The primary goal of these developments is to create systems that optimize a drug’s therapeutic value, but it’s also believed that finding better ways to get a drug into a patient’s system in a safer and more consistent way will lead to better compliance and outcome. Additionally, it’s estimated that up to 50 percent of new drugs can’t be taken orally, so the impetus to create innovative delivery platforms is strong and growing. Finally, an aging population, a growing demand for medications that can be self-administered at home, and the increased incidence of chronic diseases such as diabetes are other important factors driving growth in drug delivery techniques.
It’s estimated by one study that the worldwide market for the 10 most popular drug delivery methods (including oral) will reach $81 billion in 2015. Another report puts the market significantly higher, at $142 billion in 2012. Whatever the market size, it’s clear that these new technologies have the potential to revolutionize patient care. Here’s a brief rundown of promising and novel drug delivery systems.
Nanotechnology, according to one definition, is the “engineering and manufacturing of materials at the atomic and molecular scale.” As defined by the National Nanotechnology Initiative, nanotechnology refers to structures measuring roughly 1-100 nanometers (nm) in at least one dimension that are developed by top-down or bottom-up engineering of individual components. So-called “nanomedicine” is considered to be one of the most promising drug delivery platforms ever developed, and is being used to deliver both new compounds and previously approved drugs:
- siRNA (small inhibitory RNA) is a bit of genetic material that interferes with gene expression. Researchers at several institutions have been loading siRNA into silicon nanoparticles to deliver it to an ovarian cancer gene. Results so far indicate it may reduce ovarian tumor size by up to 83 percent;
- A lipid nanoparticle is being studied as a drug delivery system for orphan diseases, such as rare liver disease;
- Magneto-electric nanoparticles are being developed as vehicles for delivering and releasing the anti-HIV drug AZTTP into the brain; and
- Sugar-sensitive nanoparticles that release insulin in response to glucose levels may revolutionize diabetes treatment.
Skin patches are another hot area in drug delivery development:
- The FDA recently approved NuPathe’s patch for treating migraine headaches. Zecuity™, says the company, is a "single-use, battery-powered patch that delivers the most widely prescribed migraine medication through the skin";
- The Nanopatch, a silicon patch that’s smaller than a fingernail, is made of thousands of microprojections coated with a vaccine. It’s held against the patient’s skin and the microprojections penetrate the outer layer of skin to deliver the vaccine; and
- Purdue University researchers (perhaps inspired by beer) have created a tiny fermentation-powered pump that requires no batteries and may be useful for powering transdermal drug patches to deliver drugs for treating cancer and autoimmune diseases that previously couldn’t be delivered with a patch due to the large molecule size of these medications.
Powder inhalation delivery has long been used for treating diseases such as asthma. A promising new application for this technology is in treating diabetes through inhaled insulin therapy. MannKind Corporation’s Phase 3 clinical trials are investigating the performance of its insulin delivery treatment. The product is a simple inhaler device combined with insulin inhalation powder pre-measured into single-use cartridges.
While this technology is not new, current research efforts are focusing on devices that are lighter and easier to use. Needle-free jet injection devices produce a high-velocity “drug jet” that enables today’s larger molecule, protein-based drugs to penetrate the skin. One such device, developed by MIT, is said to improve on older jet-injection platforms by delivering programmable and adjustable doses, making this delivery system more useful for treating sensitive populations, such as elderly or pediatric patients.
CeQur has received European approval for its PaQ insulin delivery technology. CeQur’s device attaches to a patient’s abdomen and insulin is delivered subcutaneously through a cannula from an onboard reservoir.
A novel gel material capable of releasing drugs in response to patient-applied pressure is getting close attention from researchers. This new gel releases a test drug in response to a stimulus that mimics finger pressure. Delivery platforms like this may help patients who need fast drug administration, such as asthma sufferers or those with acute cancer pain.
These new generation drug delivery technologies hold great promise to deliver better care to patients around the globe.
Farm's Director of Research and Usability, Beth Loring, and Senior Industrial Designer, James Rudolph, recently presented at the UXPA Boston 12th Annual User Experience Conference on May 29, 2013, at the Sheraton Boston in Boston, MA.
The UXPA Boston annual conference covers critical topics in usability and user-centered design with practitioners, students, and experts in the field. Beth Loring and James Rudolph presented "Watch the Sterile Field! Conducting Research in the OR."
The presentation offers practical advice and tips based on recent experiences and lessons learned through more than 75 international OR observations. The presentation covers:
- Recruiting surgeons and their teams
- Gaining access to procedures, including credentialing
- What to expect when you arrive at the hospital
- Patient confidentiality and HIPAA
- Etiquette and attire
- What happens before, during, and after the case
- Taking photos and recordings
- Differences between the U.S. and other countries
- A technique for visualizing, exploring, and analyzing data
It is well documented that different evaluators conducting usability evaluations of the same product often come up with disparate findings. Does the suspect reliability of usability evaluations mean we should stop conducting them altogether? The answer is a clear and resounding no.
The reality is that usability evaluations can yield different results, depending on the way in which they are conducted. More importantly, however, usability evaluations identify important use-related challenges and hazards, and provide insight into how a design can be improved.
In this blog, we’ll explore challenges associated with conducting reliable usability evaluations and offer insights as to how to overcome these challenges. We’ll also discuss how to improve usability testing practices to ensure we are identifying the most important issues.
Usability Evaluation Methods
There are two primary methods of evaluating the usability of a product: (1) usability tests and (2) expert reviews. The key difference between the two is that usability tests are conducted with representative users, while expert reviews (e.g., heuristic analyses, cognitive walkthroughs) are typically performed by usability professionals and/or domain experts. Both methods use a set of tasks that help evaluators identify usability issues, and in both methods usability professionals analyze the data in order to categorize problems, rate their severity, and offer recommendations for design improvement. While there are numerous methods for evaluating usability, one thing is clear: different evaluators can produce different results.
Overlap of Usability Issues
One might reasonably assume that expert usability professionals conducting different evaluations of the same product would uncover the same usability problems. Unfortunately, the reality is quite different. Numerous studies have explored this issue, the most prominent being the Comparative Usability Evaluation (CUE) series. A striking portrait of the lack of overlap emerges when you look at the percentage of issues reported by only a single team across the first four studies in the series:
- CUE-1 – 91%
- CUE-2 – 75%
- CUE-3 – 61%
- CUE-4 – 60%
The CUE-4 study represents the most comprehensive comparison of usability studies to date, involving 17 usability teams in total, nine of which performed expert reviews and eight of which conducted usability tests (Molich & Dumas, 2008). As seen above, 60% of all problems reported were identified by only one team. Many others have found strikingly similar overlap in their own research. See, for example, Jeff Sauro’s blog “How Effective are Heuristic Evaluations?”
Factors Affecting the Reliability of Usability Evaluations
A large part of the problem can be attributed to the variables affecting both types of usability evaluations. We’ll discuss several of the most important variables affecting usability evaluations, and offer some practical insights as to how to reduce their impact on reliability. This list is far from comprehensive, and we invite readers to add additional variables to the commentary at the end.
Task selection. Task selection is an important aspect of both usability tests and expert reviews because the tasks performed greatly affect the interaction that test participants and/or evaluators experience. As Molich and Dumas point out, “A usability problem—even a critical one—normally will only be discovered if there are tasks that focus on the interface where the problem occurs” (p.275). Development teams should define the primary operating functions and frequently used functions, and then create tasks that will allow participants to interact with these areas. In medical device development, tasks should also be created to address potential use-related hazards defined during risk analysis.
Test protocol. Unfortunately, evaluation teams may use different instructions and ways of interacting with participants during a usability test, and these subtle differences can bias the results. At Farm each moderator closely follows the same protocol. We also conduct pilot sessions to ensure participants fully understand the questions and are not biased by the way questions are framed. During formative testing, the think-aloud protocol may also uncover instances where the task instructions are misleading the user and/or causing confusion. For a more comprehensive discussion of creating a successful protocol, see Beth Loring and Joe Dumas’ “Effectively Moderating Usability Tests.”
Categorization of problems. The categorization of usability problems is also important. In the CUE-4 study, participants were asked to use predefined categories. The authors found that identical issues were sometimes classified as positive findings and sometimes classified as usability issues. In other studies, including previous CUE studies, evaluators were asked to define their own categories and scales. The language used to define problems will undoubtedly affect the way people, including clients, understand test results. It is important that categories be easily defined and understood by the development team.
Domain knowledge. In a study of heuristic evaluations conducted by Nielsen (1992), “double experts,” or usability experts with extensive knowledge of the specific domain being studied, performed better than usability professionals without domain expertise. It is sometimes suggested that usability professionals who become too knowledgeable about a device can lead test participants during the evaluation, but we have found that evaluators who take the time to understand a product will produce a better and more relevant list of issues than those who do not.
Providing recommendations. There is no hiding the subjective nature of providing recommendations. Nevertheless, this is a critical juncture in the process, one that represents a shift from research to solving problems. Some common problems include: overly vague recommendations, recommendations that are in direct conflict with business goals, recommendations that reflect personal opinion alone, and implicit recommendations. To avoid some of these pitfalls, it is critical that evaluators provide solid evidence for how a recommendation supports the issue that is uncovered. Similarly, we have found that recommendations are most useful when the usability team has been closely involved in the development process.
It is important to know that in a medical device summative report, third-party evaluators such as Farm are not supposed to suggest how an issue will be mitigated. We simply report the issue and provide the root cause from the user’s perspective. It is up to the device manufacturer to report to the FDA how they fixed the problem and re-tested the issue.
The Value of Usability Testing
According to available research, the results of usability testing and expert reviews can be inconsistent across evaluators. Fortunately, they can be made more reliable by applying rigor to various aspects of usability evaluations, including the test protocol and task selection. A usability evaluation, while based on the fundamental principles of behavioral science, is a tool used to provide better and safer products, and it should be judged on its ability to inform design change, to improve the user experience, and to improve the safety of medical devices. The science of evaluating products is not perfect, but if we keep the end goal in mind, we will have a better appreciation for the positive impact that usability evaluations have on product development.
Molich, R., & Dumas, J.S. (2008). Comparative usability evaluation (CUE-4). Behavior & Information Technology, 27(3), 263-281.
Nielsen, J., and Molich, R. (1990, April). Heuristic evaluation of user interfaces. CHI 1990 Proceedings, 249-256. Seatle, Washington.
Massive Market and Growth
The growth of the mHealth industry has product developers and software engineers increasingly focused on the regulatory acceptance criteria they may face when developing these apps. Research2Guidance, a global mobile research group, states that 500 million smartphone users are expected to be using mHealth medical apps by 2015. Today there are approximately 97,000 mHealth medical apps available, and sales of medical apps are expected to reach $26 billion by 2017.
In the Farm blog The Rise of Mobile Health and the Importance of Human Factors, many of the drivers contributing to the explosive growth of mobile software development are highlighted, including economic and technology trends and the ubiquity and convenience of mobile platforms. It’s clear that healthcare information, made available to both consumers and healthcare providers via mobile devices, has the potential to reduce healthcare costs and improve care across multiple point-of-care environments.
To provide clarity and direction for mHealth medical app developers, the U.S. Food and Drug Administration (FDA) has developed guidance outlining the suggested path to approval and including a set of definitions on what is and is not regulated. The FDA differentiates between a wellness app and a potential medical device (such as a device that uses an mHealth medical app). This is an important distinction for developers, since the FDA has said it will not regulate wellness apps, and thus the documentation burden for the developer falls within more traditional development practices. Per these guidelines, an mHealth medical app will be regulated if it is:
- An app that displays, stores, analyzes, or transmits patient-specific medical device data
- An app that transforms or makes a mobile platform into a regulated medical device
- An app that performs actual medical device functions
- An app that allows users to input patient-specific information and provides patient-specific results, diagnosis, or treatment recommendations used in clinical practice or to assist in making clinical decisions
Present and Evolving Guidance
The current guidance for mobile medical applications was issued in 2011. This guidance provides some support to manufacturers by categorizing mHealth medical apps by risk to patient and addressing the level of development (including testing) documentation expected for each risk class. This guidance helps the developer determine accurate costing and time estimates. These guidelines also help determine which of the existing mHealth medical apps are to be classified as Class I (general controls), Class II (special controls in addition to general controls), or Class III devices awaiting premarket approval (PMA). Class I devices or MDDS (medical device data systems) are considered to be the least risky, and the FDA has exempted them from any regulatory requirements. mHealth medical apps that provide electronic transfer, storage, or display of medical device data qualify as Class I devices. Class II devices require filing for 510(k) clearance and include products such as mHealth medical apps that display radiological images for diagnosis. Class III devices requiring PMA may require clinical trials if the app is novel (with no predicates). An mHealth medical app also must be FDA approved if it is an extension of an FDA-approved and regulated medical device.
The FDA is making an effort to address concerns and provide oversight on guidance issues. It is important to be aware that Congress is also asking for clarification on this effort. According to MobileHealthNews, as of March 15, 2013, a letter has been sent to the FDA seeking clarification on the agency’s intentions.
In an attempt to fill the gap, the guidance goes on to call out additional standards and regulations for subjects such as:
- Software verification and validation
- Off-the-shelf software used in medical devices
- Cybersecurity for networked devices containing off-the-shelf software
- Radio frequency wireless technology in medical devices
These regulations and guidelines, as well as the guidance for the content of premarket submissions for software contained in medical devices, provide insight into what is expected in a premarket submission for performance and process documentation, the potential high-level risks an app may encounter, and areas where field issues have occurred. By reviewing these publications, an mHealth developer who has little or no experience working in the regulated medical device market will have the key information required to develop a plan and a process for creating, testing, and documenting an app that can be submitted with the best chance for clearance.
Software Development Regulation
Critical to mHealth software development is guidance for the development process itself. The FDA does not author its own standards, but has chosen to recognize the international standard for medical device software development, ANSI/AAMI/IEC 62304:2006. A detailed discussion of this standard is included in the online article Developing Medical Device Software to IEC 62304. The standard uses a patient-risk method for identifying risk associated with device software, and relies heavily on ISO 14971:2007 for a risk management approach to be followed throughout software development.
For mHealth developers who are unfamiliar with regulated software development, Happtique, a mobile health software development company, has created a voluntary program that provides a set of interoperability guidelines. Developers who follow these guidelines and apply to have their apps tested will receive certification for their mHealth medical apps, informing the consumer that the apps have achieved this level of interoperability. The program has been reviewed by the FDA and is seen as a complement to its regulatory requirements. This is covered in the article Happtique Publishes Final Standards for Mobile Health App Certification.
While these numerous standards, regulations, and guidances are complex and may prove confusing, there are some basic steps that developers can take in order to provide a clear and predictable development path:
- Thoroughly understand the app’s intended use. It’s important to be able to define the intended use for an mHealth application and communicate its use precisely, including benefits the app provides to the end user and patient and, if possible, including what the app is not intended to do. Not only will this have a positive impact on the marketing of an mHealth app, it helps define the depth of process and documentation needed throughout the development process
- Follow existing FDA and international guidance relating to communication, electrical, and platform hardware so that key risks can be avoided. Create a development process that will identify risk factors contained within hardware components, wireless protocols, operating systems, and platform-specific interferences, from both a technical and a user perspective. Identify critical issues and create an approach that will minimize or eliminate them. Consider using the FDA’s MAUDE database in order to mitigate risk
- Create a software development process according to the ANSI/AAMI/IEC 62304:2006 standard in order to support safe software that meets performance requirements
- As defined in the blog article The Rise of Mobile Health and the Importance of Human Factors, follow ANSI/AAMI HE75:2009 to ensure that the mHealth app will be safe and easy to use
- Consider an iterative implementation approach: rank risks and design mitigations around the highest ones, then implement and test those first, both for performance and for usability (by testing with real users). Build small increments and test them, continuing until all functionality is implemented and the risks have been minimized or eliminated; a simple illustration of the risk-ranking step appears after this list
- For mHealth apps that will be used on multiple platforms, target implementation by most-to-least market impact, then go through the same iterative development path
- According to regulations, developers must monitor released applications for safety and effectiveness issues. mHealth developers should follow FDA CAPA guidance to improve the development process
- Finally, use well-established development and design practices, and incorporate testing tools, to reduce the probability of introducing defects
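To illustrate the risk-ranking step mentioned in the iterative-implementation bullet above, here is a minimal sketch using an entirely hypothetical risk register and a simple severity-times-likelihood score. A real program would score, document, and re-evaluate risks according to its own risk management process (for example, per ISO 14971), but the ordering idea is the same: the riskiest items drive the earliest increments.

```python
# Hypothetical risk register: (description, severity 1-5, likelihood 1-5).
risks = [
    ("Bluetooth pairing fails silently",     3, 4),
    ("Dose calculation rounds incorrectly",  5, 2),
    ("Screen unreadable in bright sunlight", 2, 3),
]

# Rank by severity x likelihood, highest first, so the riskiest items are
# mitigated, implemented, and tested in the earliest increments.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for description, severity, likelihood in ranked:
    print(f"{severity * likelihood:>2}  {description}")
```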