Conflict in the EMR

By Selena Chavis
For The Record
Vol. 30 No. 5 P. 12

Industry professionals weigh in on the challenges and opportunities of marrying clinical and billing documentation in EMRs.

The foundational objective of any clinical documentation improvement (CDI) program is to produce the most complete and accurate documentation possible. It seems a reasonable goal, yet the industry at large continues to struggle with achieving a balanced documentation product that adequately supports both billing and clinical information needs.

Widespread use of EMRs provides an effective way of collecting data, but it also introduces challenges, according to Mark Morsch, vice president of technology with Optum. “While the EMR has provided numerous benefits to the health care market, the goal of improved documentation and automated coding derived from that documentation has not yet been fully realized,” he says. “One of the biggest challenges is quantity of information vs quality. With electronic records, it is common to have extensive content and data where quantity is trumping quality.”

In addition, health care organizations can become misguided with their CDI strategies, tilting the scale too far in the direction of optimal reimbursement, says Jon Elion, MD, FACC, president and CEO of ChartWise Medical Systems. “Some hospitals got a little excited and found that they could have an opportunity with coaching and creativity to game the system,” he says, pointing out that these practices not only result in the potential for negative newspaper headlines but they also do not align with the paradigm shift from fee for service to value and quality.

Value-based care and the emergence of multiple reimbursement methodologies exacerbate the problem of increased documentation requirements, Morsch says, increasing physician clamor about problematic EMR workflows and the need to focus more on patient care.

“Rather than causing further physician disruption or requiring physicians to become revenue cycle experts, the common denominator in this equation must be accurate documentation that is reflective of patient acuity, the medical necessity of the care provided, and the quality of care the patient received,” he says. “This not only ensures the documentation can support the complexity of the revenue cycle but also provides better communication amongst providers and benefits the most important part of the health system—the patient.”

While the right equation continues to elude the industry at large, Elion says a proper balance exists. “You are supposed to have the most complete chart you can,” he says, pointing to consistent guidance handed down from regulatory and accreditation organizations. The key is finding an equilibrium that addresses all the data needed for billing and reporting, while maintaining the quality of the physician note.

Drowning in Data, Thirsty for Knowledge
While HIT has introduced ways of collecting and aggregating sizeable amounts of information, industry professionals suggest that key data elements are often missing or hard to find. “EMRs have really made it difficult for physicians to communicate their patient care,” says Glenn Krauss, BBA, RHIA, CCS, CCS-P, president of Core CDI. “It’s not aligned with how they think clinically; it interrupts their thought processes and, unfortunately, it’s the patient who suffers. Downstream, the coding and billing suffers.”

Krauss explains that EMRs have inadvertently created more work for physicians. “I see physicians sitting in front of their computer and they are pointing and clicking and dealing with drop-down menus,” he says, adding that the way templates are structured in the EMR can hinder data entry workflows and impact the manner in which a diagnosis is entered. “The EMR was not designed by clinicians. Clinicians have no say in how it’s populated.”

In a policy position paper released in January 2015, the American College of Physicians addresses the current challenges of documentation and the purpose of EMRs:

“Electronic health records should be leveraged for what they can do to improve care and documentation, including effectively displaying prior information that shows historical information in rich context, supporting critical thinking, enabling efficient and effective documentation, and supporting appropriate and secure sharing of useful and usable information with others, including patients, families, and caregivers. These features are unlikely to be optimized as long as the format and content of clinical documentation are primarily based on coding and other regulatory requirements. Furthermore, under these circumstances, EHRs lose much of their potential to improve care and documentation and instead are relegated to doing nothing that could not be done with paper records—only less efficiently.”

Unfortunately, Krauss believes that electronic patient records are predominately data repositories for the submission of quality measures with little focus on communicating patient care.

Elion, also a board-certified, practicing cardiologist, compares physician use of EMRs to a pilot in the cockpit of a 747. “[Physicians] are surrounded by gauges, instruments, lights, and switches. It’s all complete, but dang, it can be difficult to navigate around,” he says. “It’s very possible to write a note that is technically correct but is useless to the reader. Physicians can fill up a giant note but really haven’t written anything. Copy and paste within EMRs propagates errors.”

Offering an example, Elion points to a physician referencing “K 2.3” in a chart, a reference to low potassium levels that most clinicians would recognize. “Every physician will know what that means, but coding departments cannot code properly from that. I have to use the word ‘hypokalemia,’ which means low potassium,” he says.

In another example, Elion points out that coders often cannot use documentation from other clinicians to support a diagnosis code. If a radiologist provides a detailed description of a fractured hip but the physician does not specifically call out how and where the fracture occurred, coders cannot include the specificity needed for the highest reimbursement.

Morsch says that coded data embedded within an EMR may not fully reflect the coding guidelines required for billing, such as combination codes, symptom-of relationships, and code specificity, which can lead to a disconnect between final coding for billing and the documentation. “A smooth billing process is contingent upon accurate coding, so this disconnect can cause delays, requiring additional time and effort to review the documentation and ensure accurate coding,” he notes.

Morsch adds that the current structure and quantity of EMR data creates challenges for clinicians and administrative staff who need to quickly read and interpret medical records to support patient care and revenue cycle activities. He says that “care provider organizations are experiencing increased scrutiny from health plans as they more closely review documentation for clinical justification for diagnoses, treatment decisions, and coding selection, as well as denials and rework as a result of their attempt to use technology to auto-capture coding within the EMR as a replacement for coding professionals.”

Understanding the Risks
Krauss says current EMR documentation practices introduce notable risks in the form of overdocumentation or hyperdocumentation that is perpetuated by cut and paste. “Cut and paste is out of control and dangerous for patient safety,” he cautions. “The other notable risk is overdocumentation that leads to overcoding. But the real issue is that the clinical facts, information, and context is not in the chart. How do you defend a diagnosis with information that is not there?”

While health care organizations can unintentionally engage in overcoding within the parameters of EMR documentation, Elion points out that some CDI practices have also been overly “creative,” resulting in notable consequences.

In one well-publicized incident, a small hospital in rural northern California claimed to have treated 1,030 kwashiorkor cases—18% of the organization’s patients—over a two-year period. A severe form of malnutrition, kwashiorkor is most often seen in children from regions where severe starvation is common, such as parts of Africa. It is rarely seen in the United States. “I have never seen a case of kwashiorkor,” Elion says.

Notably, kwashiorkor brings with it an increased reimbursement of more than $11,000 per patient. With little documentation to support the diagnosis, it was a case of blatant upcoding, Elion says. “They got caught because of an algorithmic audit—they were an outlier,” he says.
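The article doesn’t describe how the algorithmic audit worked, but the idea of flagging a statistical outlier can be sketched with a simple z-score screen over per-hospital diagnosis rates. This is an illustrative assumption, not the auditors’ actual method; every hospital and rate below is hypothetical (the 18% figure echoes the article).

```python
# Illustrative only: one way an algorithmic audit can flag an outlier,
# using a z-score over per-hospital kwashiorkor diagnosis rates.
# All hospitals and rates here are hypothetical.
from statistics import mean, stdev

rates = {  # share of patients with the diagnosis, per hospital
    "H1": 0.001, "H2": 0.002, "H3": 0.003,
    "H4": 0.002, "H5": 0.001, "H6": 0.003,
    "H7": 0.002, "H8": 0.002, "H9": 0.002,
    "Rural hospital": 0.18,  # 18% of patients, as in the article
}

mu = mean(rates.values())
sigma = stdev(rates.values())

# Flag any hospital more than two standard deviations from the mean.
outliers = [name for name, r in rates.items() if abs(r - mu) / sigma > 2]
print(outliers)  # ['Rural hospital']
```

Real payer audits use far more sophisticated peer-group and risk-adjusted comparisons, but the principle is the same: a facility whose coded rate sits far outside its peers invites scrutiny.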

In less obvious cases, hospitals can get into trouble for documentation practices that appear to exist solely for the purpose of getting paid. Sepsis is a prime example. Defined as a life-threatening condition that arises when the body’s response to infection causes injury to its own tissue and organs, sepsis can carry common symptoms such as fever, but it’s often accompanied by other more serious symptoms.

In one case, a hospital had a 33% incidence of sepsis, which, when not acquired during a hospital stay, carries a higher level of reimbursement. “You can do this in a compliant way,” Elion suggests. “Compliance officers often fuss about this because they don’t want to be in the headlines.”

Turning Challenges Into Opportunity
Many industry professionals believe that continued education and training of CDI professionals and physicians can go a long way toward improving the documentation outlook. “One opportunity is for us CDI people to become more knowledgeable about what is a good note, what is a good H&P,” Krauss says. “We could help improve the quality of the documentation. We cannot change the workflow process, but we can help develop better templates.”

Krauss suggests that templates are designed to primarily address billing needs as opposed to communicating care. “We need to be physician advocates,” he says. “All we do is use physicians as targets of queries. The fundamentals of documentation should be evaluation management. Evaluation management is the exchange of clinically reasonable and necessary information between doctor and patient and other health care providers and nurses, and the use of information and management of the patient.”

Elion emphasizes the need for better physician education. “If [the physician] note in the chart is so important, why aren’t they teaching this in medical school?” he asks, noting that a gap remains between the way physicians want to talk and what coders need to hear. “You can train physicians to be better at it.”

Elion suggests there are two words that physicians should use to make their notes infinitely better: “due to.” For example, a note should read that a patient “has anemia due to a GI bleed.” Otherwise, if a physician documents only that a patient is anemic, the coder has no idea it’s associated with a bleed.

Additionally, Elion recommends CDI specialists limit the use of “unspecified codes,” which typically exist due to a lack of specificity in the patient note. “Doctors need a report card on this,” he says, pointing out that there are more than 9,000 unspecified diagnoses in ICD-10. “Specificity is critical for new reimbursement models.”

Concurrent documentation practices that ensure CDI specialists are tracking and identifying potential issues before a patient leaves the hospital continue to be among the best solutions employed by hospitals, Elion says. “I want to see doctors uncoupled from the coding. We want to produce documentation that is coder friendly, that captures everything we can possibly capture,” he says.

Many health care organizations are turning to HIT solutions to automate and speed the documentation process, according to Morsch, who cautions against becoming overreliant on these tools. “While solutions like customizable clinical documentation templates in EMRs have streamlined the structure and completion of a patient’s history and physical exam, it can become easy to rely on the template and overlook the clinical clarity of the patient story,” he says. “A high-performing health system includes not only a well-functioning revenue cycle but, more importantly, the consistent delivery of quality care to patients. For care providers whose priority and expertise are their patients and providing quality medical care, when it comes to documentation their focus is on accurately recording that care. When this is done well, it directly feeds into the documentation required for billing purposes.”

Morsch says the greatest risk associated with documentation challenges is “to do nothing.” He adds that technology should focus on driving the accuracy of documentation as close to the point of care as possible, potentially reshaping the traditional revenue cycle and better positioning health care organizations to bridge the gap between fee-for-service and fee-for-value models.

“Accurate documentation supports accurate coding, helping to ensure appropriate payment and accurate quality scoring,” he says, adding that artificial intelligence and natural language processing can provide feedback and corrective action at the point of care. “In addition, a proactive approach to ensuring accuracy can reduce costly rework and denials, improve cash flow, and promote information integrity.”

— Selena Chavis is a Florida-based freelance journalist whose writing appears regularly in various trade and consumer publications, covering everything from corporate and managerial topics to health care and travel.

The Price of EHR Downtime

By Elizabeth S. Goar
For The Record
Vol. 29 No. 11 P. 24

Costs come in various forms, but no matter how it’s calculated, an idle EHR takes a heavy toll.

As hospitals step up adoption of advanced EHR systems and applications that draw from the technology, the cost of unplanned downtime has escalated rapidly, rising an estimated 30% in the past seven years to more than $634 per physician per hour. That’s according to Mark Anderson, CEO of AC Group, whose 2011 study placed the average cost of system downtime at $488 an hour per physician.

“Originally, users were [primarily] clinical informaticists and nurses. But now that doctors are the ones using EHRs, costs have increased,” Anderson says, adding that the original research “looked at hospital costs, but physician salaries were not included because they were not a major factor. … Today, more hospitals are hiring doctors, so now we have to include their offices” in the calculations.

Calculating Costs
The AC Group study looked primarily at software outages and centered on the costs associated with moving to paper processes during downtime and converting back to electronic when systems came back online.

“All the work you’d normally do takes four or five times longer when the system is down, so each minute you spend taking care of patients takes four or five minutes doing the same work [during outages],” Anderson says. “We looked at every department, how many people were on the system. It was a time-and-motion study in four hospitals, so it wasn’t just theory.”

Anderson points out that there are other factors, such as hardware, batteries, and generators, that contribute to downtime that the original software-specific study did not take into account. There are also multiple other systems—radiology, laboratory, surveillance, clinical decision support—that interface with the EHR that are also impacted by outages at a far higher rate than five years ago.

The end user has also evolved, inflating the bottom-line costs of downtime, Anderson says. “Back then, it was nurses [managing] documentation. Even with CPOE [computerized physician order entry], nurses were doing the entry. Now doctors have to do it,” he says. “Also, more patients are being treated in ambulatory settings, so hospitals have a higher acuity level, which requires even more accuracy.”

Data’s influence on health care has risen exponentially over the past decade as demand has grown for real-time access to patient records and encounter-specific clinical decision support at the point of care. This, coupled with broader access by clinical and operations teams, has sent downtime costs skyrocketing—which has, in turn, exacerbated the complexity of calculating those costs.

Everbridge, an emergency management communications company that serves a wide range of industries, set out earlier this year to determine the current price of downtime. Its survey found that the average cost per minute was $8,662 across 20 industries, including health care.

That includes “productivity losses when clinicians cannot do their jobs as usual because the EHR is not available and the same for productivity of the IT team, and revenue loss incurred by the software,” says Vincent Geffray, senior director of IT alerting for Everbridge, who adds that the figure tracks with the $8,900 per minute identified by the Ponemon Institute.

“The cause for any system to go down can be more complex than just the software or vendor, especially with the digital transformation that means many components must be up and running to make the EHR available to the clinical teams,” he says. “There are many causes, like network outages, but what we see a lot of right now is an increased number of cyber attacks in the health care industry. Those systems have to be up and running all the time or patient safety is at risk, so hackers know hospitals will have to pay the ransom or respond to the attack.”

Eric Chetwynd, Everbridge’s general manager of healthcare solutions, notes that the industry’s relationship to the EHR has evolved since the advent of the meaningful use incentive program, followed by a plethora of other government-backed initiatives that have driven EHR adoption to its current ubiquitous status. As hospitals become more reliant on EHRs, clinical and operational teams are no longer as familiar with the time-consuming paper-based processes that typically serve as backups when systems are down.

“The reliance on the EHR for day-to-day functions makes it more critical and raises the impact of overall downtime on patient outcomes, and also on patient care workflows. All these things make the EHR critical in the hospital setting,” Chetwynd says. “They’ve become so critical that every moment of downtime has an impact on the care and how it’s delivered. If I can’t record my medications or know what the medication regimen is, if it’s down for an hour, I can’t give patients medications. Clinicians have a very low tolerance for downtime. The higher the acuity setting, the more important [uptime] is.”

Peak 10 + ViaWest, which provides data center, networking, managed, and cloud services to multiple industries, offers its customers a gap analysis to help pinpoint the true cost of downtime. According to sales director Joe DeBlasio, the impact of outages is highly individualized and must consider both tangible and intangible costs (see sidebar).

It’s a complex calculation that takes into consideration an organization’s size, patient volume, and connected systems. In terms of intangibles, among the most significant cost drivers are patient wait times and the cost of any adverse events caused by treatment delays.

“There was a statistic released recently that indicated 70% of hospitals had one or more patients injured during downtime. That could result in millions of dollars in malpractice [litigation],” DeBlasio says. “Every year more legislation comes down that [makes hospitals] more susceptible to higher fines and costs. Irresponsible EHR management is often enough [to trigger them]. It doesn’t even have to be downtime.”

Mitigating the Impact
There are ways to lessen the blow of unplanned EHR downtime. The first is to have an effective disaster recovery (DR) and business continuity (BC) plan in place. Like any problem, the first step is recognizing it exists, says Carolyn Byerly, a member of the C-level advisory services team at the HCI Group.

She notes that downtime costs fall into several categories, each of which can be addressed in a “well-thought-out, coordinated” DR/BC plan. These include operational costs, in particular protecting the supply chain, surgery, and other ancillary services; physician productivity, which comes down to protecting access to critical patient data; and the costs of lost opportunities, such as when downtime results in patients taking their needs elsewhere or in slowed billing.

“There have been a number of studies done over the last five years that show that as we become more of a digital enterprise, organizations are more focused on DR and BC than they were 10 years ago,” Byerly says.

Once the need for a DR/BC plan has been recognized, assign development roles to the appropriate individuals. The goal should be to identify team members who are certified experts in DR disciplines and able to handle the project’s planning aspects.

“A lot of health care organizations don’t recognize that you can’t just hand that role off to anyone. You need someone who knows what’s needed in the plan and can ask the right questions,” Byerly says. “The second step is identifying the criticality of processes and applications, and the technology environment. What are the priorities in terms of planning for a business recovery? More processes are critical now than they were 20 years ago, so it’s an exercise you need to go through with a team of people who can help with the identification and prioritizing.”

Complicating the process of identifying critical processes and systems is the “burgeoning of the ‘Internet of Things’ that is going on,” Chetwynd says. “Everything from bed alarms to bed management to patient monitoring [is] integrating back to the EHR in some way. If your source of truth is viewed by clinicians and nurses through the EHR world, and that EHR goes offline, it all goes offline. That’s something like 13 devices per bed.”

The Vendor’s Role
Achieving the near 100% uptime today’s EHR-driven environment demands requires redundancy, which in turn requires a substantial capital investment. For this reason, many organizations turn to service providers that can provide the infrastructure necessary to achieve downtimes of 0.1% and lower.
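A downtime rate like the 0.1% cited above translates directly into hours of allowed outage per year. A quick sketch of that arithmetic (standard uptime math, not from the article):

```python
# Convert a downtime percentage into allowed outage hours per year.
# 0.1% downtime (99.9% uptime) works out to roughly 8.8 hours/year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def allowed_downtime_hours(downtime_pct):
    """Hours of outage per year implied by a downtime percentage."""
    return HOURS_PER_YEAR * downtime_pct / 100

for pct in (1.0, 0.1, 0.01):
    print(f"{100 - pct:.2f}% uptime -> {allowed_downtime_hours(pct):.2f} h/yr")
```

At an hourly downtime cost in the hundreds of dollars per physician, the difference between 99% and 99.9% uptime is what justifies the redundancy investment.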

These vendors can provide hospitals with a higher level of protection against financially devastating downtimes. They produce an environment that has already undergone close scrutiny to ensure the highest levels of security and compliance, and have the experts in place to manage those systems to ensure peak operational efficiencies.

Even so, vendors cannot guarantee 100% uptime, which is why “there is a lot more contractual language with vendors about the uptime,” Chetwynd says. “Five years ago, there might have been a brief amount of language in the contract about doing some maintenance. Now there is language about guaranteeing uptime. From a penalty perspective, because hospitals have the risk, they are demanding more things around ‘if it’s down for so long there’s a financial penalty.’ [Hospitals] can get out of the contract in some cases or do a charge back, but today they are more savvy about asking for some kind of penalty. … It’s about holding vendors more accountable.”

Regardless of contractual language, turning over DR/BC to an outsourced service provider does not absolve the hospital of all responsibility. The organization must continue to ensure it has established the proper policies and has a trained team of internal personnel to carry out those policies when downtimes do occur.

“As a vendor, we like to say, ‘outsource everything to us,’ but in reality we need intelligent teams inside, too—actual IT professionals internally,” DeBlasio says. “Vendors take on a lot of the risk and provide a lot of the resources … but you need to spell out your business needs and manage [the relationship] yourself.”

Whether a hospital outsources its DR/BC systems or manages them in house, Anderson says the price is too high not to pay attention to the minutia, down to ensuring batteries are charged, generators are functioning, and disaster plans are up to date and tested. Any weakness could spell disaster—financial and otherwise.

“If you lose one piece of data, if you miss an allergy because a system went down for one minute, that could kill someone,” he says. “You need to assume the worst, and then you’ll be okay.”

— Elizabeth S. Goar is a Tampa, Florida-based freelance writer specializing in health care and HIT.


Joe DeBlasio, sales director with Peak 10 + ViaWest, shares the following industry-accepted steps to help organizations calculate the cost of unplanned EHR downtime:

1. Compute the average annual salary costs (including benefits).

2. Multiply that value by 2.15 (the calculated cost [in dollars]/minute of system unavailability).

3. Divide by 2,080 (average number of hours paid per staff member annually [52 weeks x 40 hours per week]).

4. Determine the number of hours during which the system needs to be available to staff. *Note: Even though operational hours may be 9 AM to 5 PM, users may need access to the system before and after this period.

5. Multiply the value from step 4 by 52 (weeks/year) and again by 1% or the expected percentage of downtime given your server platform. The product of this equation represents the expected hours/year of downtime.

6. Take the value from step 3 (which represents the cost of staff per hour) multiplied by the estimated downtime per year found in step 5. The final value is the estimated cost per year of unplanned EHR downtime.
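The six steps above can be expressed as a short calculation. This sketch follows the published steps literally, including the 2.15 multiplier and the 1% default downtime rate from the sidebar; the salary and weekly-hours figures in the example are hypothetical.

```python
# Sketch of the six-step downtime cost estimate from the sidebar.
# The 2.15 multiplier and 1% downtime rate come from the published
# steps; the example inputs below are hypothetical.

def downtime_cost_per_year(avg_annual_salary, weekly_available_hours,
                           downtime_fraction=0.01):
    """Estimate the annual cost of unplanned EHR downtime (steps 1-6)."""
    # Steps 1-2: annual salary cost (with benefits) scaled by 2.15.
    scaled_cost = avg_annual_salary * 2.15
    # Step 3: divide by 2,080 paid hours/year to get a per-hour rate.
    cost_per_hour = scaled_cost / 2080
    # Steps 4-5: expected downtime hours per year, given the hours the
    # system must be available each week.
    downtime_hours = weekly_available_hours * 52 * downtime_fraction
    # Step 6: staff cost per hour times expected downtime hours.
    return cost_per_hour * downtime_hours

# Hypothetical example: $80,000 salary+benefits, 60 available hours/week.
print(round(downtime_cost_per_year(80_000, 60), 2))  # 2580.0
```

Note that this captures only staff-cost impact; the intangibles DeBlasio mentions, such as patient wait times and adverse events, sit on top of this figure.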


Bacteria taints 71 percent of commonly used medical scopes, study finds


In an ominous sign for patient safety, 71 percent of reusable medical scopes deemed ready for use on patients tested positive for bacteria at three major U.S. hospitals, according to a new study.

The paper, published last month in the American Journal of Infection Control, underscores the infection risk posed by a wide range of endoscopes commonly used to peer deep into the body. It signals a lack of progress by manufacturers, hospitals and regulators in reducing contamination despite numerous reports of superbug outbreaks and patient deaths, experts say.

“These results are pretty scary,” said Janet Haas, president of the Association for Professionals in Infection Control and Epidemiology. “These are very complicated pieces of equipment, and even when hospitals do everything right we still have a risk associated with these devices. None of us have the answer right now.”

The study found problems in scopes used for colonoscopies, lung procedures, kidney stone removal and other routine operations. Researchers said the findings confirm earlier work showing that these issues aren’t simply confined to duodenoscopes, gastrointestinal devices tied to at least 35 deaths in the U.S. since 2013, including three at UCLA’s Ronald Reagan Medical Center. Scope-related infections also were reported in 2015 at Cedars-Sinai Medical Center in Los Angeles and Pasadena’s Huntington Hospital.

The bacteria this latest study found weren’t superbugs, but researchers said they were potential pathogens that would put patients at high risk of infection. The study didn’t track whether the patients became sick from possible exposure.

The study’s authors said the intricate design of many endoscopes continues to hinder effective cleaning and those problems are compounded when health-care workers skip steps or ignore basic protocols in a rush to get scopes ready for the next patient. The study identified issues with colonoscopes, bronchoscopes, ureteroscopes and gastroscopes, among others.

“Sadly, in the 10 years since we’ve been looking into the quality of endoscope reprocessing, we haven’t seen improvement in the field,” said Cori Ofstead, the study’s lead author and an epidemiologist in St. Paul, Minn., referring to how the devices are prepared for reuse.

“If anything, the situation is worse because more people are having these minimally invasive procedures and physicians are doing more complicated procedures with endoscopes that, frankly, are not even clean,” Ofstead said.

The rise of antibiotic-resistant superbugs such as CRE (carbapenem-resistant Enterobacteriaceae), which can be fatal in up to half of patients, has made addressing these problems more urgent. About 2 million Americans are sickened by drug-resistant bacteria each year and 23,000 die, according to the federal Centers for Disease Control and Prevention.

“We’re not moving fast enough to a safer world of reusable medical devices,” said Michael Drues, an industry consultant in Grafton, Mass., who advises device companies and regulators. “There is plenty of fault to go around on device companies, hospitals, clinicians, on basically everybody.”

Despite the potential risks, medical experts caution patients not to cancel or postpone lifesaving procedures involving endoscopes. These snakelike devices often spare patients from the complications of more invasive surgeries.

“Patients should speak to their provider and think about the risks versus the benefits,” said Haas, who is also director of epidemiology at Lenox Hill Hospital in New York City.

The Food and Drug Administration and Olympus Corp., a leading endoscope manufacturer in the U.S. and worldwide, both said they are reviewing the study.

Last month, the FDA issued warning letters to Olympus and two other scope makers for failing to conduct real-world studies on whether health-care facilities can effectively clean and disinfect their duodenoscopes. The FDA ordered the manufacturers to conduct those reviews in 2015 after several scope-related outbreaks in Los Angeles, Seattle and Chicago made national headlines.

Olympus spokesman Mark Miller said the Tokyo-based company intends to “meet the milestones set forth by the FDA. … Patient safety has always been and remains our highest priority.”

The latest study examined 45 endoscopes, with all but two manufactured by Olympus. The other two were Karl Storz models.

Last year, researchers visited three hospitals, which weren’t named, and performed visual examinations and tests to detect fluid and contamination on reusable endoscopes marked ready for use on patients. One hospital met the current guidelines for cleaning and disinfecting scopes, while the other two committed numerous breaches in protocol.

Nevertheless, 62 percent of the disinfected scopes at the top-performing hospital tested positive for bacteria, including potential pathogens. It was even worse at the other two — 85 and 92 percent.

The study painted a troubling picture at the two lower-performing hospitals, which were well aware researchers were watching.

Among the safety issues: Hospital technicians wore the same gloves when handling soiled scopes fresh after a procedure and later, after the scopes were disinfected. Employees wiped down scopes with reused towels. Storage cabinets for scopes were visibly dirty, and dripping wet scopes were hung up to dry, a known risk because bacteria thrive on the moisture left inside. The two hospitals also turned off a cleaning cycle on a commonly used “washing machine,” known as an automated endoscope reprocessor, to save time.

“It was very disturbing to find such improper practices in big health systems, especially since these institutions were accredited and we assumed that meant everything would have been done properly,” said Ofstead, chief executive of the medical research firm Ofstead & Associates.

Ofstead and her co-authors recommended moving faster toward sterilization of all medical scopes using gas or chemicals. That would be a step above the current requirements for high-level disinfection, which involves manual scrubbing and automated washing. A shift to sterilization would likely require significant changes in equipment design and major investments by hospitals and clinics.

In their current form, many endoscopes aren’t built to withstand repeated sterilization. Some also have long, narrow channels where blood, tissue and other debris can get trapped inside.

In some cases, disposable, single-use scopes are an option, and new products are starting to gain acceptance. In other instances, certain parts of a scope might be disposable or removable to aid cleaning.

The Joint Commission, which accredits many U.S. hospitals and surgery centers, issued a safety alert last year about disinfection and sterilization of medical devices in response to a growing rate of noncompliance. In 2016, the Joint Commission cited 60 percent of accredited hospitals for noncompliance, and 74 percent of all “immediate threat to life” citations from its surveyors were related to improperly sterilized or disinfected equipment.

Michelle Alfa, a professor in the department of medical microbiology at the University of Manitoba, said accreditors may need to conduct more frequent inspections and endoscopy labs should be shut down “if they don’t get their act together. These results are totally unacceptable now.”

Instead, what’s important, says NYU’s Dr. Triola, is teaching “information-seeking behavior,” such as what sources to trust and how to avoid information overload.

Technology is also changing how med students learn. Simulators that look like patients and can be programmed to go into cardiac arrest, have strokes, spike fevers, cry, vomit and eliminate are particularly useful for teaching.

“Some schools don’t use cadavers anymore,” says the AMA’s Dr. Skochelak. “But others think it’s an important way to learn respect” for the real human body. “They tell students, ‘This is your first patient.’ ”

Some schools are condensing the typical four-year curriculum into three years, to let students start their residencies sooner and graduate with less debt. The Association of American Medical Colleges is also studying ways to let students master needed skills and competencies at their own pace—an innovation that has come to medical residency programs as well.

“We should have done this 10 years ago,” Dr. Decker says of the many med school changes. Then he quotes a Chinese proverb: “The best time to plant a tree is 20 years ago. The next best time is tomorrow.”

Ms. Beck is a health reporter and columnist for The Wall Street Journal in New York.

EHR Implementations: Don’t Be Left Out


Lost in the shuffle of minding budgets, keeping the C-suite happy, and ensuring schedules are met, there’s another often-overlooked reality of any EHR implementation: HIM professionals—the legal guardians of the health record—must be key participants in any process involving the integrity of patient records.

While this is undoubtedly true, it isn’t necessarily the reality at most health care organizations. For various reasons, HIM professionals are frequently left out of the EHR implementation process.

Often, this oversight is beyond the HIM manager’s control. Nevertheless, there are several strategies HIM managers can employ to rectify this situation.

The following are a few reasons HIM should advocate to become involved in EHR implementations:

Be seen: Overworked, overstressed HIM professionals, barely keeping up with countless to-do lists, budgets, productivity reports, and committee meetings, can lose sight of how to be recognized. When executives witness the strategic value of a subject matter expert, invitations to join the team come more frequently.

How is value demonstrated? Education, promoting awareness (via marketing), and report cards demonstrating quantitative improvement trends are among the best tools to illustrate HIM’s EHR sophistication and prowess.

Be involved: Volunteer to talk about new legislative changes (eg, hierarchical condition category coding, MACRA, MIPS) to core executives or a physician group or explain the need to improve documentation and coding skills to accommodate the demand these changes will have on the EHR’s documentation requirements.

Introduce an “open house” day where hospital staff can learn about different HIM functions such as release of information, deficiency analysis, and transcription. Viewing HIM functions through a self-marketing lens allows the HIM manager to enjoy a pat on the back without bragging and boosts the image of expertise, ie, someone who should be involved in an EHR implementation.

Be informative: Providing report cards focused on both the quality and quantity of HIM metrics alerts the executive team to the importance of such data. This makes it easier for executives to recognize the variables and reasons behind productivity slowdowns that may be pinpointed to the EHR.

What are the benefits of being seen, being involved, and being informative? It may seem like an inordinate amount of time and hard work just to be involved in EHR implementations. However, the long-term payoff can be significant.

Benefits of Being on the EHR Team 
Those involved in the EHR implementation process help set the rules. Did you ever volunteer to take minutes at a committee meeting? Some individuals refuse to take on this task, believing it will result in them being perceived as a less skilled participant.

Not true. The person assigned this responsibility gains a greater awareness of the topics discussed and can be sure to include thoughts from an HIM perspective that may otherwise not get noted in the minutes. With items on record, there’s a greater chance HIM will be noticed when the project begins to take shape.

HIM managers involved with EHR implementation can ensure certain standards are met during workflow design. For example, are episodes and encounters listed correctly and in chronological order to represent hospital visits and not intermingled with physician office encounters and notes? Do coders and clinical documentation improvement staff view the same documentation and physician queries? Do coders have to reenter diagnoses and procedures into the encoder and query system separately? Are physicians permitted to use copy and paste, potentially resulting in data integrity problems and fraudulent billing?

Another benefit of HIM being part of the EHR team is that it keeps the department in the loop with regard to what’s happening in the organization. For example, being on the team helps ensure staffing and workflow issues surrounding the implementation do not conflict with other HIM-related activities. Imagine attempting to implement an EHR system during a scheduled audit or a Joint Commission survey.

Educating the team—and the EHR vendor—of conflicts when planning master schedules can avert a scheduling disaster.

Understanding Expectations and Responsibilities
When the moment of truth arrives and HIM managers become active members of the EHR implementation team, it carries great responsibilities, some of which may be unfamiliar. To stand out in the new role, preparation is key.

Be sure to network with colleagues or consultants who have already been through the process of installing an EHR or converting from one system to another. Vendors can also be helpful, providing references, site visits, and even a more thorough understanding of any confusing functionality. Don’t forget vendor user groups—those who participate get to provide input on future enhancements and functionality improvements.

The following best practice tips can catapult HIM managers into becoming valuable participants in EHR projects, which builds confidence levels and leads to a rewarding experience.

Understand basic EHR/computer terms. Neophytes to the EHR world must familiarize themselves with the technical and clinical terms they may encounter during the implementation. The quickest way to have one’s input be disregarded is to appear uneducated about the system. Study the available vendor information to help grow that expertise.

HIM professionals are typically accomplished communicators and documenters; once entrenched in the implementation, don’t be afraid to take a leadership role.

Understand the clinician point of view. A thorough understanding of clinical workflow (current and ideal) in each service area allows HIM managers to better represent their constituent workflow needs as they become involved in designing screen flow. This includes inpatient, outpatient, and postacute care settings, as well as physician clinic and ambulatory environments. HIM professionals are unique in that they’re well versed in the clinical, financial, and administrative functionality of the health care environment.

Understand the active vs discharge legal health record. Don’t forget the needs of the active care encounter as well as the postdischarge documentation-based encounter. Once a patient is discharged, all documentation must be visualized in a postdischarge record that should no longer be dynamic in nature. Additionally, all subsequent amended/completed documentation must be added to the record without displacing, replacing, or changing any prior documentation.

Facilities that do not have two types of records (dynamic and postdischarge stable) in place may find it difficult to defend their archived medical records. HIM professionals can help ensure these needs are addressed.

Understand archive policies. Remember to discuss issues such as long-term retention and the purging of records. These topics often get ignored; however, it is just as critical to have these types of policies in place as it was when paper records were commonplace. With interoperability an industrywide goal, defining how records are systematically destroyed is paramount.

One of the most common complaints heard about EHRs is the fragmentation of content. If locating information is difficult even in the active record, consider how well thought out and documented the retention and destruction of all electronic documentation must be. HIM professionals are trained to approach this challenge from a policy point of view.

Regardless of environment, participation in an EHR implementation can be a rewarding, career-changing experience for HIM managers who dedicate the time and exert the energy to actively make a difference in the success of such a critical project.

— Darice M. Grzybowski, MA, RHIA, FAHIMA, an AHIMA-approved ICD-10-CM/PCS trainer and ambassador, is the president of HIMentors.

Fake News About Health You Need to Stop Believing


I met Luke at a professional conference. I’m a public health and behavioral science expert; Luke’s a statistician for a hospital system in New York City. I thought he was brilliant — two Ivy League degrees, decades of experience working with top medical professionals, married to a cardiologist. We stayed in touch. A few months later, after I began speaking out about irrational health beliefs, including the myth that vaccines cause autism, I ran into Luke again.

“I read your piece,” he said.

I smiled awkwardly. His son had autism, but we’d never discussed it. What came next shocked me. “How can you say vaccines don’t cause autism?”

At first I thought he was joking. But no: Luke is an anti-vaxxer, convinced that childhood vaccines are a pharmaceutical conspiracy. He blames vaccines for his son’s autism. He mistrusts doctors in general, and he and his wife (a cardiologist, remember) follow a natural lifestyle that minimizes interaction with them. He also believes eggs and milk cause cancer.

I didn’t know what to say. I never thought someone with Luke’s background and intellect could defend beliefs that science had thoroughly debunked.

Luke isn’t alone. Thousands of well-educated people share such erroneous beliefs. With the help of my father, Jack Gorman, M.D., I began to explore why people develop these mindsets and wrote the book Denying to the Grave: Why We Ignore the Facts That Will Save Us. You’re about to learn the neurological basis for how such thinking “narrows” the brain and how to reverse the process in yourself and others. But first, let’s take a look at six prevailing health myths that some people still believe.

Myth: Vaccines Cause Autism
In 1998, the British gastroenterologist Andrew Wakefield published a study in The Lancet claiming a link between autism and the MMR (measles, mumps, rubella) vaccine children receive. The finding terrified parents and reverberates to this day. According to CDC data, nonmedical exemptions for school-required immunizations are rising in 11 states. Measles is an illness that can kill. Despite that, a 2015 CNN poll found that anti-vaxxer parents are more likely to be wealthy, white, and college educated. Of those surveyed, 57 percent cited “concerns about autism” as the reason for not vaccinating their kids.


The Lancet retracted the Wakefield study in 2010, citing invalid science. As it turns out, Wakefield had committed fraud by taking money from vaccine injury lawyers and falsifying data. He later lost his medical license. The American Academy of Pediatrics now lists 20 pages of studies and other evidence showing no link between vaccines and autism. Researchers also warn that a mere 5 percent decline in MMR immunizations among 2- to 11-year-olds would triple the annual number of measles cases in this age group nationwide, resulting in $2.1 million in yearly health care expenditures and needlessly claiming young lives.

Myth: Low-Fat Diets Are Heart-Healthy
In the 1940s, heart disease was the top killer in the United States. To identify the causes, many studies were launched, including the landmark Framingham Heart Study and the Seven Countries Study. The latter examined risk factors across cultures and linked diets high in saturated fat to heart disease. The American Heart Association endorsed the findings and sounded the alarm on saturated fat. Companies responded with low-fat processed foods. Belief in the heart-healthy benefits of a low-fat diet still persists today, even though heart disease remains the leading cause of death in the nation.


Hundreds of millions of dollars have been spent trying to replicate the Seven Countries finding, without success. In fact, the study’s methodology has come into question. In November, new research in The Lancet spanning 18 countries across five continents concluded that “total fats and types of fat were not associated with cardiovascular disease.” Ironically, reaction to that original flawed science turned out to be the real killer. Since low-fat food is bland, food producers added sugar. We now know that sugar is extremely harmful to health. According to a 2014 JAMA Internal Medicine study, people who get 25 percent or more of their daily calories from added sugar are more than twice as likely to die of heart disease as those who get 10 percent or less. And that’s regardless of age, sex, physical activity, and body mass index.

Myth: Routine PSA Screening Saves Lives
Prostate cancer is the most common cancer among men, and it’s second only to lung cancer in deadliness. PSA stands for “prostate-specific antigen,” and the test measures blood levels of this protein; a high number can suggest prostate cancer. In 1994, the FDA approved routine PSA screening for asymptomatic men. Since it’s such a simple, noninvasive way to apparently detect a dreaded disease, physicians prescribe and patients request an estimated 20 million PSA tests annually.


Mass PSA screening does more harm than good. In a 2010 New York Times editorial, even the doctor who discovered PSA called large-scale testing a “profit-driven public health disaster.” In 2012, the U.S. Preventive Services Task Force advised against routine testing, citing evidence that about 80 percent of results are false positives. Unnecessary treatments can include painful biopsies, surgery, and radiotherapy. Experts estimate that up to five of every 1,000 men who undergo prostate cancer surgery die within a month, and at least 200 of every 1,000 men who have radiotherapy and surgery for prostate cancer suffer complications like urinary incontinence, erectile dysfunction, and/or bowel problems. The American Urological Association no longer recommends routine PSA screening for men younger than 55 unless they have risk factors (like smoking, being overweight, or having a family history of prostate cancer).

Myth: Raw Milk Is Healthier Than Pasteurized
It’s difficult to pinpoint when or where this idea started, but the raw milk movement has been gaining momentum. Twelve U.S. states now permit sales of raw milk in stores, and 13 more allow it to be sold on the farm where it was produced. Pasteurization involves heating milk to kill harmful bacteria, but raw milk proponents claim the heat also kills important nutrients and that consuming pasteurized dairy products causes allergic reactions and symptoms of lactose intolerance.


The FDA says there’s no evidence that raw milk is any more beneficial than the pasteurized stuff. In fact, the opposite is true: While unpasteurized dairy products such as milk and cheese are consumed by a little more than 3 percent of the population, they cause 96 percent of the illnesses originating from contaminated dairy products, according to the CDC. That’s 840 times more illness and 45 times more hospitalization than from pasteurized dairy products. Infection from harmful bacteria in raw milk (like salmonella, E. coli, and listeria) can result in organ failure, miscarriage, paralysis, or death.
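The CDC’s multiplier follows from simple relative-risk arithmetic, sketched below. The 3.2 percent consumption share used here is an illustrative assumption, not the agency’s exact input, so the result differs somewhat from the published figure.

```python
# Back-of-the-envelope check of the raw-milk risk ratio cited above.
# Illustrative assumption: raw dairy is consumed by ~3.2% of the
# population yet accounts for ~96% of dairy-borne illnesses (CDC figure).
raw_share_of_consumers = 0.032   # assumed fraction who consume raw dairy
raw_share_of_illness = 0.96      # share of dairy illnesses from raw products

# Relative risk per consumer: illness rate among raw-dairy consumers
# divided by illness rate among pasteurized-dairy consumers.
rr = (raw_share_of_illness / raw_share_of_consumers) / \
     ((1 - raw_share_of_illness) / (1 - raw_share_of_consumers))
print(round(rr))  # prints 726
```

The exact multiplier is sensitive to the consumption share (a share near 2.8 percent reproduces the published 840-fold figure); the point is simply that a tiny consuming population accounting for nearly all illnesses implies an enormous per-consumer risk.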

Myth: Surgery Is the Best Fix for Back Pain
The vertebrae lining your backbone are cushioned by shock-absorbing discs. Over time, disc damage leads to micromovements that can trigger pain. Also, spinal nerves can be pinched, resulting in radiating leg pain. On an X-ray, CT scan, or MRI, the narrowing of space between vertebrae is often interpreted as disc degeneration and the source of pain. It would seem logical, then, that fusing vertebrae and/or removing bone would limit micromovements, make space for nerves, and ease the ache. This thinking led to a 70 percent spike in such surgeries from 2001 to 2011. About 400,000 are done annually.


A recent study determined that up to 40 percent of people who undergo back surgery could continue to have significant pain afterward. Plus, rates of complication from such operations — some life-threatening — can be upwards of 20 percent. In fact, even with today’s medical advances, finding the cause of lower-back pain is very difficult. About 85 percent is the “nonspecific” type, with no cause identified, and surgery is typically not the best option for these cases. In 2017, the American College of Physicians published new guidelines, recommending that doctors choose non-pharmacologic, nonsurgical options, such as exercise, physical therapy, and cognitive behavioral therapy, to treat both acute and chronic lower-back pain.

Myth: Electroconvulsive Therapy Is Barbaric
Passing an electric current through the brain to spark a chemistry-shifting seizure seems crude if not barbaric. Hollywood portrays restrained patients convulsing in pain before becoming passive and sustaining permanent memory loss and personality changes. (See: One Flew Over the Cuckoo’s Nest and American Horror Story: Asylum.) Indeed, in 1950s psychiatric hospitals, electroconvulsive therapy (ECT) was used to “treat” everything from unruly behavior to homosexuality. And those perceptions persist: In a U.K. survey, about 20 percent of respondents said they fear dying if they receive ECT.


For people dealing with severe, treatment-resistant depression, ECT is often the only option. While antidepressant meds generally have a 50 to 60 percent success rate, ECT is effective 70 to 90 percent of the time. Studies consistently show that memory loss from ECT is usually temporary and that the treatment is safe. Patients also experience no pain from the current, and there’s no visible convulsing. Overall, ECT is a highly effective antidepressant treatment–and for suicide prevention, it’s significantly superior to drug therapy. What’s more, patients often see dramatic improvement after just a week or two of ECT, versus the six to eight weeks needed for antidepressants to take full effect. Then there are the drugs’ possible side effects: weight gain, sex drive changes, sleep disturbances, and upset stomach.
